
Re: [Issue 1069] - Ran out of file descriptors while running svn merge

From: Michael Wood <mwood_at_its.uct.ac.za>
Date: 2003-01-08 11:25:19 CET

On Tue, Jan 07, 2003 at 10:10:26PM -0800, Brandon Ehle wrote:
> >
> >
> >I don't think we should fix issue 688 just for the HP shared memory thing.
> >
> >This bug seems quite different. Why the heck are we opening so many
> >repositories all at once? That definitely seems like a bug that can
> >and should be fixed. At least cuz it sounds like it isn't scaling
> >properly.
> >
> >
> Is there any possibility of me being able to up the maximum number of
> opened files so that I can finish my merge operation?
[snip]

Try "ulimit -n 8192" or something similar. It might not be possible to
change the limit on your system.
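A rough sketch of what that looks like in a Bourne-style shell (8192 is
just an example value; the shell will refuse anything above the hard
limit unless you are root):

```shell
# Show the current per-process limit on open file descriptors
# (the soft limit) for this shell session.
ulimit -n

# Try to raise it for this session; this fails if the requested
# value exceeds the hard limit.
ulimit -n 8192 2>/dev/null || echo "could not raise the limit"

# Confirm what the limit is now.
ulimit -n
```

Note that the change only affects this shell and its children, so run
the svn merge from the same session afterwards.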

Otherwise, on FreeBSD you can raise the limit by changing a define in
the kernel config, as far as I remember. Have a look in /sys/LINT.
Unless I'm remembering something else, of course...

On Linux you could try fiddling with /proc/sys/fs/file-{max,nr}
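On a Linux box, that fiddling might look like this (file-max is
writable only by root, and 65536 is just an illustrative value):

```shell
# System-wide ceiling on open file handles across all processes.
cat /proc/sys/fs/file-max

# file-nr reports three numbers: handles allocated, allocated but
# unused, and the maximum (the same figure as file-max).
cat /proc/sys/fs/file-nr

# Raising the ceiling requires root; uncomment to apply:
# echo 65536 > /proc/sys/fs/file-max
```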

On HP-UX, poke around in the kernel configuration using SAM and
recompile the kernel...

There might be a better way, but that's what comes to mind.

Otherwise, have a look at the {s,g}etrlimit man page(s) on your system.

-- 
Michael Wood <mwood@its.uct.ac.za>
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Wed Jan 8 11:26:18 2003

This is an archived mail posted to the Subversion Dev mailing list.
