
Re: Large Repositories - related to Re: Issuezilla #1350 -- update

From: Russell Glaue <rglaue_at_cait.org>
Date: 2003-07-11 18:14:45 CEST

Some have suggested changing the client http-timeout value, which did
not solve the problem for us. Have you tried this?
Also, we are considering modifying the Berkeley DB settings to see
whether this might alleviate the issue. Have you attempted anything
like this?
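
For context, the client-side setting we changed is the `http-timeout`
option in Subversion's runtime `servers` configuration file (typically
`~/.subversion/servers` on Unix). A minimal sketch; the value shown is
the one we tried, not a recommendation:

```
[global]
# HTTP request timeout, in seconds, for the ra_dav layer.
# Raised from 60; 120 alone did not fix our large commits.
http-timeout = 120
```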

Thank you for replying. It is good to know that others are experiencing
the same issues we are.

Evaluating your situation, I now suspect that the http-timeout
errors we have been getting are not directly related to server speed,
server memory, or client http-timeout settings.
I wonder if one of three issues is causing this:
        - Subversion has a bug handling the client
        - Subversion has a bug handling large Berkeley databases
        - The Berkeley DB settings that ship with Subversion cause
these issues for the Berkeley DB/Subversion server when the repository
grows beyond 200MB. If so, the Berkeley DB settings may need to be
reconfigured to handle large Subversion repositories.
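
On the third possibility: Berkeley DB reads tuning directives from a
`DB_CONFIG` file in the repository's `db/` directory when the database
environment is opened. A minimal sketch of the kind of tuning we are
considering; the values below are illustrative guesses, not tested
recommendations:

```
# <repository>/db/DB_CONFIG -- illustrative values, not tested
# Grow the shared memory cache to 32MB (gbytes, bytes, segments).
set_cachesize 0 33554432 1
# Allow more concurrent locks/lock objects for large commits.
set_lk_max_locks 4000
set_lk_max_objects 4000
```

As we understand it, such changes only take effect when the environment
is re-created, e.g. after running `svnadmin recover` on the repository.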


On Friday, Jul 11, 2003, at 10:45 America/Chicago, Matthieu Bentot wrote:

> From: "Matthieu Bentot" <matthieu.bentot@amdocs.com>
> Date: Fri Jul 11, 2003 10:45:45 America/Chicago
> To: "Russell Glaue" <rglaue@cait.org>
> Cc: <dev@subversion.tigris.org>
> Subject: RE: Large Repositories - related to Re: Issuezilla #1350 --
> update
> Hi there,
> I've been doing refactoring on a repository about that size and I
> experienced that problem.
> Typically some name change will trigger massive commits, which time out.
> The commit does go in, though; I just have to re-checkout.
> I attributed that problem to the fact that I host the repository on my
> own machine...
> The repository is 350 megs, about 4000 files. Subversion 0.24.2/Apache
> 2.0.45 on Windows 2k Pro SP1.
> Cheers,
> Matthieu
>> -----Original Message-----
>> From: Russell Glaue [mailto:rglaue@cait.org]
>> Sent: 11 July 2003 16:34
>> To: dev@subversion.tigris.org
>> Subject: Large Repositories - related to Re: Issuezilla #1350
>> -- update
>> This comment is related to the thread 'Issuezilla #1350 -- update'.
>> Has anyone had experience administering a large Subversion repository
>> of 300MB or larger?
>> We have several large repositories (on 0.24.2), one of which, at
>> 360MB of data, we are experiencing issues with.
>> When committing large numbers of files (like 150 files at a time) we
>> get http-timeout errors. We have increased the Subversion client http
>> timeout from 60 to 120 seconds, but it did not resolve the issue.
>> To work around this issue and avoid the http-timeout error, the
>> developers make several smaller commits of about 25 to 50 files at a
>> time as opposed to the full 150 to 180 files the developer is working
>> with. This has been annoying for them, as the web site they are
>> changing content for is very large and one change can affect a
>> minimum of 50 to 75 files at a time.
>> We have a fast (P3-1GHz/512MB) Linux server on which Subversion
>> resides.
>> I would like to know if anyone else has experience working with
>> large Subversion repositories over 300MB and has experienced any
>> timeout issues with large commits.
>> -RG

To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Fri Jul 11 18:15:38 2003

This is an archived mail posted to the Subversion Dev mailing list.
