
Re: Best Practices: Performance on Large Repositories?

From: Steve Seremeth <subversion_at_seremeth.com>
Date: 2005-03-16 15:51:20 CET

Ben Collins-Sussman wrote:

> On Mar 16, 2005, at 4:16 AM, Gal Aviel wrote:
>> Sounds familiar.
>> Take a look at issue #2151.
>> If you have directories with many many entries, performance might
>> suffer, especially with BDB backend + Apache.
> This is entirely a working-copy problem; it has nothing to do with the
> server.
> The simple fact is that it takes a very long time to stat() a working
> copy of 11,000 files, and doubly so on NTFS. This is what 'svn
> status', 'svn up', and 'svn commit' all do if you give them no arguments.
Two things:

1. Expected "ballpark" performance figures would be very helpful in the
FAQ. Other SCM systems might be no better on this front, but if we had
known up front what performance would be like, we might at least have
organized our repository (and, subsequently, our build scripts)
differently.

2. Is reorganizing the repository so that we can work on smaller
pieces the only answer to this problem? Is there a repository size
that is simply _not_ recommended for use with Subversion?
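(As a stopgap, passing explicit targets to the client seems to avoid the
full working-copy crawl Ben describes, since the stat() walk is limited to
the paths you name. A minimal sketch, assuming a hypothetical layout with
a `trunk/moduleA` subtree; the paths are illustrative, not from our actual
repository:)

```shell
# Hypothetical subtree; replace with a real path in your working copy.
# Naming a target restricts the working-copy crawl to that subtree
# instead of stat()ing all 11,000 files.
svn status trunk/moduleA
svn update trunk/moduleA
svn commit -m "Fix in moduleA only" trunk/moduleA
```

This only helps day-to-day operations, of course; it does not address
whether the repository layout itself should change.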

Thanks -


Received on Wed Mar 16 15:54:00 2005

This is an archived mail posted to the Subversion Users mailing list.
