
Re: best way to update large blobs in svn

From: Karl Fogel <kfogel_at_red-bean.com>
Date: 2007-07-07 01:18:42 CEST

"Roy Franz" <roy.lists@gmail.com> writes:
> Below is a thread that examines some of the issues with storing large
> binary files:
>
> http://svn.haxx.se/users/archive-2007-03/1165.shtml
>
>
> From the testing I did (with a Linux kernel tarball, IIRC) subversion did
> a pretty good job in many cases, although not as good as the xdelta
> program could do.
>
> As Phil mentioned, if you need to compress them, then using the
> --rsyncable flag is important - a small uncompressed change can result
> in a large change in the compressed output, which will defeat
> subversion's delta algorithm. I did not try bzip2, but I would guess
> that you would not get any additional compression from subversion (the
> repo size would grow by the size of each .bz2 file checked in).
>
> The conclusion that I came to was:
>
> I also tried a few experiments with gzipped files, and found that gzip
> with the --rsyncable flag did better than uncompressed by a small
> margin, and that normal gzipping did much worse.
>
> [...]
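
The effect Roy describes is easy to reproduce outside of Subversion.
The sketch below is only an illustration of the general point, not
Subversion's actual delta code, and it assumes Python 3.8 or later for
the mtime argument to gzip.compress(). It makes a one-byte edit to a
blob and counts how many bytes a naive prefix/suffix delta could reuse
from the previous version, before and after plain gzip compression:

    import gzip

    def reusable_bytes(old, new):
        # Bytes a naive delta could carry over unchanged: the common
        # prefix plus the common suffix of the two versions.
        limit = min(len(old), len(new))
        prefix = 0
        while prefix < limit and old[prefix] == new[prefix]:
            prefix += 1
        suffix = 0
        while (suffix < limit - prefix
               and old[-1 - suffix] == new[-1 - suffix]):
            suffix += 1
        return prefix + suffix

    # A moderately compressible blob, and a copy with one byte changed.
    old = b"a line of fairly ordinary text, repeated many times\n" * 3000
    new = old[:1000] + b"X" + old[1001:]

    old_gz = gzip.compress(old, mtime=0)   # mtime=0 keeps headers identical
    new_gz = gzip.compress(new, mtime=0)

    print("uncompressed:", reusable_bytes(old, new), "of", len(old))
    print("gzipped:     ", reusable_bytes(old_gz, new_gz), "of", len(old_gz))

Uncompressed, everything but the changed byte is reusable; gzipped, the
two streams share almost nothing byte-for-byte past the point of the
edit, which is why the repository ends up storing most of the compressed
file again. The --rsyncable flag mitigates this by periodically
restarting the compressor so the streams line up again after a change.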

Did you accidentally leave out your conclusion? It looks like that
may have happened...

-Karl

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Sat Jul 7 01:18:34 2007
