
Re: Problem with large files

From: Brandon Ehle <azverkan_at_yahoo.com>
Date: 2006-08-29 05:14:30 CEST

I have a Perl script I wrote to profile this problem when I submitted
it to the bug tracker a couple of years ago.

http://subversion.tigris.org/issues/show_bug.cgi?id=913

It will generate an asset repository that simulates an artist working
on textures, with as many revisions as you want.

I'll try to dig it back up and send it to you.
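
In the meantime, the general shape of it is easy to reproduce. Below is
a hypothetical stand-in in C, not the actual script (the file name,
sizes, and default revision count are made up): it creates one large
binary "texture" inside an existing working copy and then commits a
series of revisions, each of which overwrites a random slice of the
file, which is roughly the access pattern an artist reworking a texture
produces.

/* Hypothetical stand-in for the Perl generator described above; not
 * the real script.  It creates one large binary "texture" inside an
 * existing working copy and commits N revisions, each overwriting a
 * random slice of the file, the way an artist repeatedly saving over
 * the same asset would. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define FILE_SIZE  (64L * 1024 * 1024)   /* 64 MB dummy texture (made up) */
#define SLICE_SIZE (4L * 1024 * 1024)    /* 4 MB changed per revision     */

static void write_random_bytes(const char *path, long offset, long len)
{
  FILE *fp = fopen(path, "r+b");
  long i;

  if (fp == NULL)
    fp = fopen(path, "w+b");             /* first pass: create the file */
  if (fp == NULL)
    {
      perror(path);
      exit(1);
    }
  fseek(fp, offset, SEEK_SET);
  for (i = 0; i < len; i++)
    fputc(rand() & 0xff, fp);
  fclose(fp);
}

int main(int argc, char **argv)
{
  const char *path = "texture.dat";      /* lives in a checked-out WC */
  int revisions = (argc > 1) ? atoi(argv[1]) : 20;
  int r;

  srand((unsigned int) time(NULL));

  write_random_bytes(path, 0, FILE_SIZE); /* initial full-size asset */
  system("svn add texture.dat");

  for (r = 0; r < revisions; r++)
    {
      long slice = rand() % (FILE_SIZE / SLICE_SIZE);

      write_random_bytes(path, slice * SLICE_SIZE, SLICE_SIZE);
      if (system("svn commit -m \"rework texture\" texture.dat") != 0)
        return 1;
    }
  return 0;
}

Point it at an existing checkout and pass the number of revisions you
want on the command line; each pass pushes the whole file back through
the delta code on commit.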

Daniel Berlin wrote:
> On 8/28/06, Garrett Rooney <rooneg@electricjellyfish.net> wrote:
>> On 8/28/06, Ben Collins-Sussman <sussman@red-bean.com> wrote:
>> > I suspect the problem here isn't about working copy efficiency; it's
>> > the fact that we delta-encode every file that gets stuffed into the
>> > repository, even for something as simple as committing a file to a
>> > local file:/// repository. That takes a lonnnnnnnng time on huge
>> > binary files.
>>
>> That's why I was hoping Jeremy would hand some real world test cases
>> off to DannyB so he could make it Go Real Fast ;-)
>>
>
> I've emailed every person who has complained on users@ in the thread
> about large binary file performance, and begged them to give me repos
> and files I can reproduce with, promising to fix their speed issues.
> I've even sent out the attached patch for testing.
>
> I'm still waiting for an answer. :-(
>
> They seem to want solutions without having to test them.
>
> The last time someone had a significant performance problem with
> large binary files, I sent them the attached patch (which disables
> vdelta and, as such, is only really a good idea on repositories that
> use svndiff1 and on networks with no 1.3 clients/servers).
> Basically, tell anyone who wants to try it that they should build a
> patched Subversion, create a new repository with it, dump/load the
> old repository into the new one, and give checkouts etc. a try.
>
> The report from the one person who has ever tried it with large files
> was that it sped up commit times from 45 minutes to less than 5 ;)
>
>
> ------------------------------------------------------------------------
>
> Index: text_delta.c
> ===================================================================
> --- text_delta.c (revision 20792)
> +++ text_delta.c (working copy)
> @@ -148,7 +148,8 @@ compute_window(const char *data, apr_siz
>    build_baton.new_data = svn_stringbuf_create("", pool);
>
>    if (source_len == 0)
> -    svn_txdelta__vdelta(&build_baton, data, source_len, target_len, pool);
> +    svn_txdelta__insert_op(&build_baton, svn_txdelta_new, 0, target_len,
> +                           data, pool);
>    else
>      svn_txdelta__xdelta(&build_baton, data, source_len, target_len, pool);
>
>
>
> ------------------------------------------------------------------------
>
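
To put a rough number on the delta-encoding cost Ben describes above,
something like the harness below can be built against the Subversion
libraries. It is only a sketch improvised around the public API
(svn_txdelta and svn_txdelta_next_window), not anything taken from
Danny's patch: it pushes one large file through libsvn_delta against an
empty source stream, which is the same source_len == 0 case the patch
touches, and reports how long the delta windows take to compute. Timing
it once against stock libraries and once against patched ones should
show the difference directly.

/* Sketch of a timing harness for the source_len == 0 delta path.  This
 * is improvised around the public API and is not part of the patch
 * above. */
#include <stdio.h>

#include <apr_general.h>
#include <apr_file_io.h>
#include <apr_time.h>

#include "svn_pools.h"
#include "svn_io.h"
#include "svn_delta.h"
#include "svn_error.h"

int main(int argc, const char **argv)
{
  apr_pool_t *pool, *iterpool;
  apr_file_t *file;
  svn_stream_t *source, *target;
  svn_txdelta_stream_t *delta_stream;
  svn_txdelta_window_t *window;
  svn_error_t *err;
  apr_time_t start;
  int windows = 0;

  if (argc < 2)
    {
      fprintf(stderr, "usage: %s LARGE_FILE\n", argv[0]);
      return 1;
    }

  apr_initialize();
  pool = svn_pool_create(NULL);
  iterpool = svn_pool_create(pool);

  if (apr_file_open(&file, argv[1], APR_READ, APR_OS_DEFAULT, pool))
    {
      fprintf(stderr, "cannot open %s\n", argv[1]);
      return 1;
    }

  source = svn_stream_empty(pool);               /* no base text, as on import */
  target = svn_stream_from_aprfile(file, pool);  /* the big binary file */

  start = apr_time_now();
  svn_txdelta(&delta_stream, source, target, pool);

  while (1)
    {
      svn_pool_clear(iterpool);
      err = svn_txdelta_next_window(&window, delta_stream, iterpool);
      if (err)
        {
          svn_handle_error2(err, stderr, FALSE, "delta-timing: ");
          svn_error_clear(err);
          return 1;
        }
      if (window == NULL)                        /* end of the delta stream */
        break;
      windows++;
    }

  printf("%d windows computed in %.2f seconds\n",
         windows, (apr_time_now() - start) / 1000000.0);

  svn_pool_destroy(pool);
  apr_terminate();
  return 0;
}

Build it against libsvn_delta, libsvn_subr, and APR in the usual way,
and pass the file to measure on the command line.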

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Tue Aug 29 05:15:23 2006
