
file://.../ URLs + low resources

From: Alejandro Forero Cuervo <azul_at_freaks-unidos.net>
Date: 2006-11-10 18:36:58 CET

I have a server with relatively little memory running multiple
Subversion clients (a program that calls libsvn_client). The clients
get executed from the same Apache process that serves access to the
repository they connect to. To avoid reaching Apache's MaxClients
setting and deadlocking (Apache runs a Subversion client; the client
connects back to Apache; Apache waits for one of its clients to
terminate before it can serve that connection), I'm thinking of using
file://.../ URLs for these clients rather than going through Apache.
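To illustrate, here is a minimal sketch of the workaround; the paths
are hypothetical and would come from the server's actual layout:

```shell
# Hypothetical layout: the repository that Apache serves over HTTP
# also lives on local disk, reachable by the embedded clients.
REPO=/var/svn/repos/project

# Direct access via the file:// scheme never opens an HTTP connection,
# so it cannot tie up an Apache worker slot:
svn checkout "file://$REPO" /tmp/wc

# The equivalent over HTTP would re-enter Apache and risk the deadlock:
#   svn checkout http://localhost/svn/project /tmp/wc
```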

Hence my question: is it reasonably safe to have multiple clients
access the repository directly? Remember this is a machine with
little memory (and no swap), so the clients may often fail to
allocate enough memory and have to terminate with an error (which
is fine). Or, to put it another way, has Subversion been designed
with care to prevent a repository from getting corrupted when
processes accessing it directly (through file://.../ URLs) get
terminated at random points during their execution?

My guess is 'yes', but I just want to make sure. :-)


To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Fri Nov 10 22:03:57 2006

This is an archived mail posted to the Subversion Users mailing list.