
Re: bug, using latest version

From: Philip Martin <philip_at_codematters.co.uk>
Date: 2003-02-13 19:30:19 CET

Karl Fogel <kfogel@newton.ch.collab.net> writes:

> Magnus Persson <magnus.persson@mindark.com> writes:
> > 120 Mbytes of code + some binaries, 5000 files in total, in a subdirectory.
[...]
> > ....................................................................svn: Bad file descriptor
> > svn: Couldn't get file-descriptor of tmpfile.
> > svn: Your commit message was left in a temporary file:
> > svn-commit.tmp
[...]
> Okay. Maybe we're opening tmp files but forgetting to close them or
> something, so we run out of file descriptors?
>
> Can you post a tarball or zip file of the tree you're trying to import
> somewhere, and file an issue giving the exact import command, the OS
> and versions of Subversion on both client and server? Thanks.

At 120MB please don't post it!

It appears that the number of open file descriptors is related to the
depth of the tree being imported/committed. I used the stress.pl Perl
script in tools/dev as follows:

$ stress.pl -c -F0 -N100 -D1 -n0
$ svn export wcstress.XXXX foo
$ svn import file://`pwd`/repostress foo foo

This creates a repository called repostress and a working copy called
wcstress.XXXX, where XXXX is some number. The repository has 100
nested directories, the number being controlled by the -N100 option.
During the import on my Linux box there are over 100 file descriptors
open simultaneously, with one more added for each additional directory
level. That's well under my ulimit setting of 1024, but perhaps
Windows has a lower limit?
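
For anyone who wants to watch the count directly, something along these
lines should do on Linux (a rough sketch, untested as written; $! is the
pid of the backgrounded import, and ulimit -n prints the per-process
descriptor limit):

$ svn import file://`pwd`/repostress foo foo &
$ watch -n 1 "ls /proc/$!/fd | wc -l"
$ ulimit -n
1024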

-- 
Philip Martin
