On Fri, May 4, 2012 at 5:16 PM, C. Michael Pilato <cmpilato_at_collab.net> wrote:
> On 05/04/2012 04:37 PM, Mark Phippard wrote:
>> $ svn up --set-depth=infinity ev2-export/
>> Updating 'ev2-export':
>> subversion/libsvn_subr/sqlite.c:585: (apr_err=200030)
>> svn: E200030: sqlite: unable to open database file
>> This could be due to my SQLite version? I was getting similar errors
>> with my own repository but after it installed a few of the files.
> I can't explain this one.
Likely there were too many open files, so the database file couldn't be
opened. It's just a matter of timing which file happened to be the one
that failed.
>> $ svn up --set-depth=infinity .
>> Updating '.':
>> A NOTICE
>> ... (just a few files snipped)
>> A notes/obliterate/presentations/why.odp
>> subversion/svn/update-cmd.c:163: (apr_err=24)
>> subversion/libsvn_client/update.c:611: (apr_err=24)
>> subversion/libsvn_client/update.c:552: (apr_err=24)
>> subversion/libsvn_client/update.c:413: (apr_err=24)
>> subversion/libsvn_wc/adm_crawler.c:858: (apr_err=24)
>> subversion/libsvn_ra_serf/update.c:2639: (apr_err=24)
>> subversion/libsvn_ra_serf/util.c:1837: (apr_err=24)
>> subversion/libsvn_ra_serf/util.c:1818: (apr_err=24)
>> subversion/libsvn_ra_serf/update.c:1281: (apr_err=24)
>> subversion/libsvn_wc/update_editor.c:3688: (apr_err=24)
>> subversion/libsvn_wc/adm_files.c:342: (apr_err=24)
>> subversion/libsvn_subr/stream.c:888: (apr_err=24)
>> subversion/libsvn_subr/io.c:4320: (apr_err=24)
>> subversion/libsvn_subr/io.c:4135: (apr_err=24)
>> svn: E000024: Can't create temporary file from template
>> Too many open files
>> So this gets further before getting a different error.
> I *might* be able to explain at least a portion of this one.
> My code calls the (new) get_wc_contents RA callback to get a read stream for
> the pristine text matching the specified SHA1. This will open the pristine
> file. The file remains open until its contents are consumed (whenever
> Serf/ra_serf get around to doing so per the pipelined approach taken to
> GET/HEAD/PROPFINDs during updates). I suppose if the code which is cramming
> pipelined requests (the result of parsing the initial REPORT) into the queue
> outpaces the code that's handling those requests, you could wind up with too
> many open files.
That's exactly what is happening. See my review of your r1333936
commit, from about an hour ago. I also suggested a fix, and copied
Mark on that email.
ra_serf will allow up to 1000 outstanding requests during update
processing, but on Mac OS the default file handle limit is 256.
That's probably why it worked on CMike's machine, but not on a Mac.
Received on 2012-05-05 00:43:53 CEST