Re: problems when using subversion over http with large files
On 7/27/06, Martin Povolný <email@example.com> wrote:
> we are using Subversion on a couple of quite large repositories.
> In our setup we have Apache 2 with LDAP authentication and dav_svn.
> Until recently we were running Subversion 1.1.4, and for a while now
> we have been running 1.2, with similar results.
> In one of the repositories we have approx. 4-6 GB of data, with files
> up to 80 GB in size.
> With this repository we have problems. Clients connecting to the
> repository hang. In the apache error log we get something like:
> [Thu Jul 27 09:26:00 2006] [notice] child pid 12349 exit signal
> Segmentation fault (11)
> [Thu Jul 27 09:34:15 2006] [error] [client 10.2.0.8] Provider
> encountered an error while streaming a REPORT response. [500, #0]
> [Thu Jul 27 09:34:15 2006] [error] [client 10.2.0.8] A failure
> occurred while driving the update report editor [500, #190004]
> When this happens, we do 'svnadmin recover' and it temporarily fixes
> the problem.
> Our clients are recent versions of TortoiseSVN for Windows, but it
> seems like the client version doesn't matter.
> To me it seems that the problem occurs more frequently since we have
> had large files in the repository.
> I wonder if other people have this type of data managed by SVN and if
> they have similar problems.
> Should I try a different setup? A standalone Subversion server (no
> Apache)? Or would version 1.3 help?
> We are using the Berkeley DB backend; would FSFS be better?
You should probably tweak your DB_CONFIG settings to allow for more
locks and larger buffers; the defaults are quite small. I seem to
recall there was advice on this in the FAQ or in the mailing list
archives somewhere... (maybe Google helps).
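For reference, a DB_CONFIG along these lines (it lives in the repository's db/ directory) raises the Berkeley DB lock limits and cache size above the defaults. The specific numbers here are illustrative guesses, not tested recommendations; tune them to your repository:

```
# Raise Berkeley DB lock-table limits (the defaults are small)
set_lk_max_locks   2000
set_lk_max_lockers 2000
set_lk_max_objects 2000

# 32 MB cache: arguments are gbytes, bytes, ncache
set_cachesize 0 33554432 1
```

After editing DB_CONFIG, run 'svnadmin recover' on the repository so the environment is recreated and the new settings take effect.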
Received on Fri Jul 28 22:34:30 2006
This is an archived mail posted to the Subversion Users mailing list.