
Re: Problem with large files

From: Daniel Berlin <dberlin_at_dberlin.org>
Date: 2006-08-28 16:41:51 CEST

On 8/28/06, Ben Collins-Sussman <sussman@red-bean.com> wrote:
> On 8/28/06, Daniel Berlin <dberlin@dberlin.org> wrote:
>
> > The report from the one person who has ever tried it with large files
> > was that it sped up commit times from 45 minutes to less than 5 ;)
>
> I don't think that this is rocket science which requires testing. :-)
> Of course, if you just insert new data directly into the stream
> without trying to deltify it, it's gonna be way way faster.

Uh, no, you don't understand.
The patch only changes the behavior when deltifying against the empty
stream (i.e., for the first revision of a file).

Right now we do that using vdelta, but only because it produces
target-side deltas and achieves compression roughly comparable to
zlib, while using about 10x more CPU time.

With the patch, clients using svndiff1 will still compress the file,
but only with zlib.

It turns out that this actually gets better results size-wise than
vdelta did anyway.

The patch does *not* change things so that the initial revision is
stored uncompressed in any way; it just lets svndiff1 compression
(zlib) do the work instead of the much more expensive vdelta.
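To illustrate the idea (this is just a sketch, not Subversion's actual
implementation): the first revision of a file has no predecessor, so its
"delta against the empty stream" is essentially the full text. Instead of
running an expensive self-delta algorithm like vdelta over it, you can
simply store the text zlib-compressed and still get it back losslessly:

```python
import zlib

# Hypothetical stand-in for the contents of a first file revision.
data = b"some repeated file contents\n" * 1000

# What an svndiff1-style client would store: the full text, zlib-compressed.
# zlib exploits the same redundancy a self-delta would, at far lower CPU cost.
compressed = zlib.compress(data)

# Reconstruction is a plain decompress; no delta application is needed
# because there is no base revision to delta against.
restored = zlib.decompress(compressed)

assert restored == data
assert len(compressed) < len(data)
```

The exact compression ratio depends on the data, but for typical text the
zlib-only representation is in the same ballpark as, or smaller than, a
vdelta self-delta, which is the size result described above.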

HTH,
Dan

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Mon Aug 28 16:55:01 2006

This is an archived mail posted to the Subversion Dev mailing list.
