
Re: How Big A Dump File Can Be Handled?

From: Ben Reser <ben_at_reser.org>
Date: Tue, 20 Aug 2013 19:11:31 -0700

On Tue Aug 20 16:44:08 2013, Geoff Field wrote:
> I've seen some quite large dump files already - one got up to about 28GB. The svnadmin 1.2.3 tool managed to cope with that quite successfully. Right now, our largest repository (some 19,000 revisions with many files, including installation packages) is dumping. In the 5300 range of revisions, the dump file has just passed 9GB.

Shouldn't be a problem within the limits of the OS and filesystem.
However, why bother producing dump files at all? Why not
simply pipe the output of your dump command into a load command, e.g.

svnadmin create newrepo
svnadmin dump --incremental oldrepo | svnadmin load newrepo

You'll need space for two repos but that should be less than the space
the dump file will take. I included the --incremental option above
because there's no reason to describe the full tree for every revision
when you're doing a dump/load cycle. You can save space with --deltas
if you really want the dump files, but at the cost of extra CPU time.
If you're just piping into load, the CPU time spent calculating the
deltas isn't worth it, since you're not keeping the dump file anyway.
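For reference, the --deltas variant mentioned above might look like this (a sketch only; "oldrepo", "newrepo", and the dump filename are placeholder paths):

```shell
# --incremental: each revision describes only what changed,
#   not the full tree.
# --deltas: store file contents as diffs against earlier
#   revisions, shrinking the dump file at extra CPU cost.
svnadmin dump --incremental --deltas oldrepo > oldrepo.dump

# Later, restore into a freshly created repository:
svnadmin create newrepo
svnadmin load newrepo < oldrepo.dump
```

Note that a --deltas dump trades CPU time for disk space, so it only pays off when the dump file itself is being kept or transferred.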
Received on 2013-08-21 04:12:10 CEST

This is an archived mail posted to the Subversion Users mailing list.
