
Problem with large commits

From: Tim T. <tim.timmerman_at_gmail.com>
Date: 2007-04-23 07:29:01 CEST

Hi,

  Committing large files to Subversion seems to take forever.

  I handle the infrastructure for a project which needs to archive some
10-20 XML files of about 6-7 megabytes each. These data files are an
essential part of the delivery and must be archived.

 The last update to these data files took on the order of two hours,
even though I tried to be smart and deleted them first, checking the
files in as though they were new, and thus eliminating any line-by-line
diffs.
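 For reference, the delete-and-re-add workaround described above would
look roughly like this (repository paths and file names are placeholders,
not taken from the original post):

```shell
# Illustrative sketch of the delete-and-re-add workaround; file names
# and paths are hypothetical.
svn delete data/big-file.xml
svn commit -m "Remove large data file before re-adding"

cp /path/to/updated/big-file.xml data/
svn add data/big-file.xml
svn commit -m "Re-add updated data file"

# Note: Subversion stores changes as binary deltas (xdelta), not
# line-by-line text diffs, so re-adding a file as "new" is unlikely
# to save the server any diffing work.
```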

 Checking the project server (a Sun Blade 1500 with 1 GB of memory), I
noticed that the post-commit script was eating 100% CPU and 100% memory
during that time. I'm using the standard Perl script, by the way, with
no local modifications.
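 One way to confirm whether the post-commit hook is the bottleneck is to
temporarily disable it: hooks are ordinary executables in the
repository's hooks/ directory, so renaming the script takes it out of
the commit path. A minimal sketch, assuming a repository path of
/var/svn/project (hypothetical):

```shell
# Temporarily disable the post-commit hook to time a commit without it.
mv /var/svn/project/hooks/post-commit \
   /var/svn/project/hooks/post-commit.disabled

# ...perform a test commit and compare the elapsed time...

# Restore the hook afterwards.
mv /var/svn/project/hooks/post-commit.disabled \
   /var/svn/project/hooks/post-commit
```

If the commit is fast with the hook disabled, the time is going into the
hook script rather than into Subversion's own storage of the files.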

 Since I expect to be replacing/updating these files some more, is there
anything I can do to speed up the commits?

 TimT.

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Mon Apr 23 07:29:22 2007

This is an archived mail posted to the Subversion Users mailing list.
