The number seven I mentioned earlier refers to the number of branches created
during one day (of course there is an implicit commit, but the only thing
modified is the new branch's parent directory).
In each branch the number of files changed or created is small, ranging from
a dozen to maybe fifty (rarely). I have several small images, but they are
modified rarely. The number of commits on each branch tends to be small,
in most cases one or two.
I'm aware of the performance issue with fetching an old revision in a
dev line with lots of revisions. CVS has the same problem. Currently I'm
using CVSNT, and this issue is not a problem for us since almost all the
ongoing work is very close to the head.
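
(If we ever need to double-check this on our own repository, a rough timing
harness along these lines would do; the URL and revision numbers below are
placeholders, and it assumes a command-line 'svn' client on the PATH.)

    # Time how long 'svn cat' takes for an old vs. a recent revision.
    import subprocess
    import time

    URL = "http://svn.example.com/repos/trunk/somefile.c"  # placeholder

    def time_cat(rev):
        start = time.time()
        subprocess.run(["svn", "cat", "-r", str(rev), URL],
                       stdout=subprocess.DEVNULL, check=True)
        return time.time() - start

    for rev in (1, 100, 1000):  # pick revisions that actually exist
        print("rev %5d: %.2fs" % (rev, time_cat(rev)))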
----- Original Message -----
From: "C.A.T.Magic" <firstname.lastname@example.org>
To: "Ben Collins-Sussman" <email@example.com>; "Luiz Daniel Lapolla"
Sent: Friday, March 05, 2004 2:25 PM
Subject: Re: Branching models
> ----- Original Message -----
> From: "Ben Collins-Sussman" <firstname.lastname@example.org>
> > So make as many copies as you want. It has nothing to do with
> > performance.
> I -think- it will depend on the amount of data changed
> on each revision, the size of the individual files
> and the size of the changesets.
> If the mentioned 15,000 files are purely text, html, .c
> (i.e. non-binary) files, this is not a problem at all,
> since -real- changes to these files tend to be very small.
> I put about 60,000 .c and even many .png files into SVN
> and it still runs ok (only up to ~ rev. 100 so far),
> but maybe I'll run a test on it overnight and push
> it up to some rev. 10000 to verify this :-)
> ( To give a worst-case example: if you'd put 15,000
> .png files into svn and RGB-color-correct all of them 3 times
> a day and commit them, your DB would expand and slow
> down like hell -- binary files that change often should
> always be handled with care when using an SCM :-)
> ( note: the initial question was about 7 commits a day = 2555 a year )
> After reading through,
> I -think- (devs, please tell me if I got it wrong)
> that access to >recent revisions< of the repository will
> remain quite fast, independent of the number of revisions
> in the DB,
> but access to -very-old- file revisions (e.g. rev. 1) gets slightly
> slower with each new revision that is committed, since all
> data is stored as 'diffs' to the previous file, and all the,
> say, 2555 diffs have to be applied in reverse order.
> Applying 2555 diffs to, say, 1MB files could get pretty slow -
> and that slowness gets multiplied by 15,000 files in the worst case.
> But note that other revision control systems work in mostly the same
> way, and suffer from equal or even larger limitations.
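
To make that cost argument concrete, here is a toy Python sketch of a
reverse-delta store (just the idea, not SVN's actual on-disk format): the
newest text is kept in full, each commit stores a delta that rebuilds the
previous revision, so fetching the head is free while fetching rev. 1 walks
the whole chain.

    import difflib

    def make_delta(new, old):
        # Opcodes that rebuild `old` out of `new` (a reverse delta).
        ops = []
        sm = difflib.SequenceMatcher(a=new, b=old)
        for tag, i1, i2, j1, j2 in sm.get_opcodes():
            if tag == "equal":
                ops.append(("copy", i1, i2))      # reuse a span of the newer text
            else:
                ops.append(("data", old[j1:j2]))  # literal text from the older revision
        return ops

    def apply_delta(new, ops):
        out = []
        for op in ops:
            if op[0] == "copy":
                out.append(new[op[1]:op[2]])
            else:
                out.append(op[1])
        return "".join(out)

    class DeltaStore:
        def __init__(self, text):
            self.head = text     # newest revision, stored as full text
            self.deltas = []     # deltas[k] rebuilds rev. k+1 from rev. k+2

        def commit(self, new_text):
            self.deltas.append(make_delta(new_text, self.head))
            self.head = new_text

        def checkout(self, rev):
            # rev. 1 is the oldest; rev. len(self.deltas) + 1 is the head.
            text = self.head
            for d in reversed(self.deltas[rev - 1:]):
                text = apply_delta(text, d)  # one delta application per step back
            return text

    store = DeltaStore("line 1\n")
    for n in range(2, 2556):                   # simulate 2555 revisions
        store.commit("line 1\nplus rev. %d\n" % n)
    assert store.checkout(2555) == store.head  # head: zero deltas applied
    print(store.checkout(1))                   # rev. 1: 2554 deltas applied

Fetching the head never touches a delta; fetching rev. 1 applies one delta
per intervening revision, which is the linear slowdown described above.
(Real systems blunt this; Subversion, for instance, uses skip-deltas rather
than a plain chain, so the number of deltas applied grows roughly
logarithmically, not linearly.)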