When moving big repositories from SourceSafe to CVS using a Perl
script, the CPU seemed to be the critical factor. As a matter of fact,
the performance improved by more than the CPU frequency ratio when moving
from a Pentium at 350 MHz to an AMD 2400+ (2.0 GHz): the throughput was
40 to 50 times better (same hard drive, more memory, though the memory
was not used much).
Perhaps your conversion could be improved 10 times?
I'm willing to run some tests to improve the process, if you are
interested.
Best regards,
Björn Carlsson <mailto:support@versionsupport.com>
VersionSupport.com ready to help you <http://versionsupport.com/>
Mike Crowe wrote:
>I have been trying to convert a sizeable CVS repository to subversion using
>the version of cvs2svn on the 1.0.x branch in r8550. I realise that this
>isn't really the latest version but the reason for that will become clearer
>later.
>
>I've looked through the issue tracker and couldn't see anything
>particularly appropriate.
>
>The CVS repository is five years old, just under 2GB in size, has 622
>tags/branches and around 50000 files. The repository is a mixture of files
>that have had only one or a handful of revisions and files that have had
>several hundred revisions.
>
>I've left cvs2svn running for 25 days on an otherwise idle 1GHz P3 with
>python taking around 98% CPU. It doesn't appear to have used much swap. In
>that time it has "added or changed" nearly 130000 revisions and appears to
>have processed around 36000 of the files in the repository.
>
>So,
>
>1. Is this the expected behaviour on a large repository with lots of
>tags and branches?
>
>2. Is there anything I can do to the repository to make it faster? Some
>of the tags could probably be removed, for example.
>
>
>Obviously having to wait over a month for the repository to be converted
>would severely impede our ability to migrate to Subversion :(
>
>
>
Received on Sun Mar 7 13:39:38 2004