
404 error for each newly added file

From: <kmradke_at_rockwellcollins.com>
Date: Thu, 29 Oct 2009 11:36:42 -0500

In doing some performance tests, I noticed that each newly added file in an
svn commit causes a separate HTTP request to the server, which then returns
a 404 error page. For large numbers of files over a high-latency link,
this adds a significant amount of time. (It's even worse if you have a
large custom 404 page defined.)

Adding 1000 new files in a single commit over a connection with 500ms
round-trip latency will waste over 8 minutes in this step alone.
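The arithmetic behind that estimate, as a quick back-of-the-envelope
sketch (assuming each request costs one full 500ms round trip and the
requests are issued serially, one file at a time):

```python
# Rough cost of the per-file existence checks during an svn commit.
# Assumes each HTTP request costs one full round trip and that the
# requests are issued serially, one file at a time.

def wasted_seconds(num_files, requests_per_file, rtt_seconds):
    """Total time spent on per-file HTTP checks before the commit."""
    return num_files * requests_per_file * rtt_seconds

serf = wasted_seconds(1000, 1, 0.5)  # serf: one HEAD per file
neon = wasted_seconds(1000, 2, 0.5)  # neon: two PROPFINDs per file

print(f"serf: {serf:.0f}s (~{serf / 60:.1f} min)")  # 500s, ~8.3 min
print(f"neon: {neon:.0f}s (~{neon / 60:.1f} min)")  # 1000s, ~16.7 min
```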

Is it by design that each newly added file in a commit requires the client
to perform a separate HTTP request? (neon does two PROPFINDs per file,
serf does one HEAD per file.)

Both client and server are v1.6.5. Client was the windows distribution
from tigris. Server is self compiled on solaris 10 x86 using:

APR_VER := 1.3.8
APRUTIL_VER := 1.3.9
NEON_VER := 0.28.6
SERF_VER := 0.3.0

Would the new working copy stuff change this behavior? Or is it something
required by WebDAV, or has it just never annoyed anyone enough to optimize?

I'm willing to look into it more, if the behavior isn't expected to change
with the new working copy stuff...

Kevin R.

Received on 2009-10-29 17:37:02 CET

This is an archived mail posted to the Subversion Dev mailing list.
