
Re: Is my implementation too large for SVN?

From: Russ <rsivak_at_istandfor.com>
Date: 2006-10-06 18:53:59 CEST

I am experiencing similar issues with 1.4 and have found better stability by using svnserve.
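For readers weighing the same switch, the server side of a basic svnserve setup is mostly a matter of the per-repository conf/svnserve.conf file. The settings below follow the stock template shipped with Subversion; the access values shown are illustrative, not a prescription for this poster's environment:

```ini
# conf/svnserve.conf -- read by svnserve for this repository
[general]
anon-access = none      # no anonymous access
auth-access = write     # authenticated users may read and commit
password-db = passwd    # user/password pairs live in conf/passwd
```

The daemon itself is then started with something like `svnserve -d -r <repository parent dir>`, and clients check out via svn:// URLs instead of http://.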


-----Original Message-----
From: "John Thile" <gilrain@gmail.com>
Date: Fri, 6 Oct 2006 09:50:01
Subject: Is my implementation too large for SVN?

Hello folks,

I recently completed a migration from VSS to SVN for my company, and
both my users and I were overjoyed to get away from that mess. I've
used SVN at several previous companies, so it was my first choice. I'm
worried now, however, that SVN may not be able to handle my current
situation: as my repositories have grown, a number of performance
issues have cropped up that I did not expect.

First, my repositories are largely binary data -- we have some source
code and straight text files, but the vast majority are Word documents
and satellite bitstreams. The current size of my main repository is
about 7 GB, spread over 140,000 files in 60,000 directories. From what
I gather online, this may be approaching exceptional and untried
ground for SVN. Please note that this repository is expected to level
out in the tens, or even hundreds, of GB.

Our SVN 1.3.2 implementation is hosted on Apache 2.0.55, Windows
Server 2003, and a robust, dual-CPU HP server with 4 GB RAM. The
server and my users (about 150 SVN users) are on a fully gigabit LAN.

The main problem I'm seeing right now is that checkouts take an
extremely long time and always fail partway through, requiring a
process like this: checkout, fail; update, fail; update, fail; update,
complete. After that, one has a complete working copy and further
updates do not fail. Even then, however, an update which winds up
pulling down only, say, 200 KB of changed data takes, on average,
thirty minutes. The error on new checkouts is below:

Error: REPORT request failed on '/<path edited>/!svn/vcc/default'
Error: REPORT of '/<path edited>/!svn/vcc/default': 200 OK (http://foo

Another problem is that I can no longer use dumps and loads now that
we have >2 GB revisions in the repository. Is there any way around this?
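One commonly suggested workaround is to split the dump into revision-range chunks so that no single dump file has to hold the whole history. The sketch below echoes the commands as a dry run so the ranges can be inspected first; REPO, HEAD, and CHUNK are placeholders (in practice HEAD would come from `svnadmin youngest "$REPO"`), and the `--incremental`/`--deltas` flags are standard svnadmin dump options:

```shell
#!/bin/sh
# Sketch: dump a large repository in revision-range chunks rather than
# one monolithic file.  All values here are placeholders.
REPO=/path/to/repo
HEAD=4000     # in practice: HEAD=$(svnadmin youngest "$REPO")
CHUNK=1000    # revisions per chunk

START=0
while [ "$START" -le "$HEAD" ]; do
    END=$((START + CHUNK - 1))
    if [ "$END" -gt "$HEAD" ]; then END=$HEAD; fi
    # Echoed as a dry run; drop the `echo` to actually produce the dumps.
    # --incremental writes only the changes within the range; --deltas
    # stores binary diffs, which can shrink dumps of binary-heavy repos.
    echo "svnadmin dump $REPO -r $START:$END --incremental --deltas > dump-$START-$END.svndump"
    START=$((END + 1))
done
```

The chunks would then be loaded back in order with `svnadmin load newrepo < dump-0-999.svndump`, and so on. Two caveats: a single revision larger than 2 GB will still produce an oversized chunk on its own, and whether this sidesteps the failure depends on what is actually hitting the limit (the tool, the filesystem, or an intermediate pipe), so treat it as one avenue to test rather than a guaranteed fix.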

In general, my question is this: is there any way to improve SVN
performance in this situation? Is it likely to get much worse as the
repository grows, or should SVN scale up nicely? I'd appreciate any
comments or advice!


John Thile

To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Fri Oct 6 18:55:01 2006

This is an archived mail posted to the Subversion Users mailing list.
