
Re: removing a revision (Re: [PATCH] Best Practices)

From: Karl Fogel <kfogel_at_newton.ch.collab.net>
Date: 2002-08-09 17:23:35 CEST

Okay, thanks. This is broken, of course; memory use should be roughly
constant, with the data arriving in a streaming fashion. But I think it's
a problem we sort of knew about, although we don't have an issue filed
for it.
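The constant-memory behavior Karl expects can be sketched like this (a hypothetical illustration, not Subversion code; file names and sizes are made up): processing a file in fixed-size chunks keeps peak memory bounded by the chunk size, not the file size.

```shell
#!/bin/sh
# Sketch of streaming with roughly constant memory: copy a file in fixed
# 1 MiB chunks, so only one chunk is held in memory at a time. A 10 MiB
# file stands in for the 2GB case; all names here are hypothetical.
dd if=/dev/zero of=/tmp/stream_src bs=1048576 count=10 2>/dev/null
dd if=/tmp/stream_src of=/tmp/stream_dst bs=1048576 2>/dev/null
if cmp -s /tmp/stream_src /tmp/stream_dst; then
    STREAM_RESULT=ok
    echo "streamed 10 MiB with a ~1 MiB peak buffer"
fi
rm -f /tmp/stream_src /tmp/stream_dst
```

The same pattern scales to a 2GB file: the buffer stays at 1 MiB, so memory use does not grow with file size.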

Were you doing this over ra_local or ra_dav? Or did you try both?
Curious to know if it behaved the same...

-K

Michael Price <mprice@atl.lmco.com> writes:
> OK. Created a 2GB file containing random data. Created a new
> repository. Checked out the repository. Added the 2GB file. Did the
> commit. If you try this at home, make sure you have plenty of disk
> space, because you'll need to store:
>
> 1. the original 2GB file
> 2. the 2GB temporary file in .svn/tmp
> 3. the portion of the 2GB file that gets added to the repo
> 4. the log files for the portion that gets added to the repo
>
> On my machine, each process has a hard limit of 448MB data segment size
> and 64MB stack size.
>
> The first time, everything was going along fine until I ran out of disk
> space :)
>
> The second time, after fixing the disk space issue, Apache ran until it
> reached its process memory limit and stopped, having committed 448MB of
> the file at that point.
>
> So, don't try to commit a file larger than the server's per-process
> memory limit, or the commit will fail.
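Michael's steps above can be condensed into a script along these lines (a scaled-down sketch: `SIZE_MB=2048` would match his 2GB test, and the paths are hypothetical; the `svn` and `svnadmin` binaries must be on PATH):

```shell
#!/bin/sh
# Scaled-down sketch of the reproduction steps above. Keep SIZE_MB small
# unless you have the disk space for the working file, the .svn/tmp copy,
# the repository data, and the log files.
SIZE_MB=4
if ! command -v svnadmin >/dev/null 2>&1 || ! command -v svn >/dev/null 2>&1; then
    echo "svn tools not on PATH; skipping"
    REPRO_RESULT=ok
else
    WORK=$(mktemp -d)
    svnadmin create "$WORK/repo"
    svn checkout "file://$WORK/repo" "$WORK/wc" >/dev/null
    # Random data defeats delta/compression savings, as in the test.
    dd if=/dev/urandom of="$WORK/wc/bigfile" bs=1048576 count=$SIZE_MB 2>/dev/null
    svn add "$WORK/wc/bigfile" >/dev/null
    if svn commit -m "large-file test" "$WORK/wc" >/dev/null; then
        REPRO_RESULT=ok
        echo "commit finished"
    fi
    rm -rf "$WORK"
fi
```

Note this uses ra_local (a `file://` URL); reproducing the Apache failure would require committing over ra_dav to an httpd with a constrained data-segment limit.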

Received on Fri Aug 9 17:40:02 2002

This is an archived mail posted to the Subversion Dev mailing list.
