
Re: svn stability problems

From: <soloturn99_at_yahoo.com>
Date: 2003-01-22 18:30:18 CET

may i ask some additional questions?
- do you use it via file:// or http(s)://?
- do you use it with access control?
- what would svn ls -R https://.../ | wc give?
  i.e. is it more in the 3000 region, or more
  in the 20000 region?
- how big is a typical working copy?
  ls -1R | wc
  (when do you start to think it is too slow?)
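for reference, the counts asked about above could be gathered along these lines (a sketch: the repository URL is a placeholder, and the svn command is left commented out so the snippet runs without a server or svn client):

```shell
# repository side: count every entry reachable from the URL
# (placeholder URL; needs network access and an svn client)
# svn ls -R https://svn.example.org/repos/ | wc -l

# working-copy side: the same idea with plain ls, demonstrated here
# against a small throwaway tree
mkdir -p /tmp/wc-demo/subdir
touch /tmp/wc-demo/file1 /tmp/wc-demo/subdir/file2
ls -1R /tmp/wc-demo | wc -l
```

note that ls -1R also emits directory headers and blank lines, so the figure is a rough upper bound rather than an exact entry count.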

btw, svn ls -R https://.../svnrep seems
to behave strangely. i did a ^c after 5 minutes of waiting ...
it seems to collect all the data in a remarkably
slow way and then send it all at once.


--- Gustavo Niemeyer <niemeyer@conectiva.com> wrote:
> > brandon flatly says bdb4.0 is unusable for large repositories.
> > i have a hard time believing that berkeley db 4.0 is broken for
> > large repositories, which would be 5000-10000 entries in our
> > case and 200mb in size of the dump file, and that it can't handle
> > a 100 revision, 1000 file dump.
> > but IF brandon and justin are right, then throwing out db4.0 should
> > be made one of the highest priorities.
> We are currently using db 4.0.14 here, without any problems.
> > what would be a good strategy to reach scalability/stability? could
> > it be a testcase with thousands of entries and moving/copying
> > in it? could it be encouraging somebody large to move to svn?
> This is our everyday usage of subversion. Our repository has
> 7.7GB (no log files) and we are reaching revision 23000 (could
> you please update svn-repositories.html?).
> > and i'm wondering how the conectiva folks are handling this, or
> We haven't done any special setup for stability.
> > how many entries (files/dirs) they have in their repository ...
> Ouch.. I'd have a hard time counting them, as there are lots of
> copies in the structure and many files are removed and replaced
> every day (see the documentation).
> To give you a vague idea, if you look at the documentation, you'll
> see that the snapshot directory holds one subdirectory for each
> package actively in the distribution, and inside each of those
> directories there's something between 2 and 30 files, besides the
> copies made each time a package is released to the snapshot
> distribution.
> [niemeyer@ibook ~]% svn ls $REPOS/snapshot | wc -l
> Of course, we should also count removed files, happening each time
> a package has its version updated, and obsolete packages, which were
> removed from this directory.
> > numbers given in
> > are quite impressive and do not at all match our experiences.
> We have already surpassed them, as shown above.
> I'm surprised to see you reporting a bad experience, as we have a
> good experience with subversion/db stability. We had many problems
> with our previous server (disk failure, out of memory, out of disk,
> other problems, etc), and a simple db_recover always solved our
> problems.
> Gustavo Niemeyer
> [ 2AAC 7928 0FBF 0299 5EB5 60E2 2253 B29A 6664 3A0C ]
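the "simple db_recover" step Gustavo mentions could look roughly like this (a sketch: the repository path is a placeholder, and the actual recovery commands are left commented out since they require a BDB-backed repository with Berkeley DB utilities and svnadmin installed):

```shell
# placeholder repository path -- adjust before use
REPOS=${REPOS:-/var/svn/repos}

# make sure no process is touching the repository, then either run the
# raw Berkeley DB recovery against the db environment:
#   db_recover -v -h "$REPOS/db"
# or let subversion drive the same step:
#   svnadmin recover "$REPOS"
echo "recovery target: $REPOS/db"
```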



To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Sat Oct 14 02:06:28 2006

This is an archived mail posted to the Subversion Dev mailing list.
