
Re: SVN Dump Question

From: B Smith-Mannschott <bsmith.occs_at_gmail.com>
Date: Tue, 16 Feb 2010 20:29:50 +0100

On Tue, Feb 16, 2010 at 18:39, Justin Connell
<justin.connell_at_propylon.com> wrote:

> The reason I'm asking such strange questions is that I have a very abnormal
> situation on my hands here. I previously took a full dump of the repo (for
> the reason you implied) where the original size of the repo on disk was 150
> GB, and the resulting dump file ended up at 46 GB. This was quite unexpected
> (on the smaller repos I have worked with, the dump is usually larger than
> the repo).

Such a large difference would make me suspicious, even if you're using
--deltas when generating the dump. A back-of-the-envelope calculation
says that FSFS is wasting some space due to external fragmentation,
but I'd estimate less than 10 GB, which isn't enough to explain this
discrepancy. (Estimate: about 5.5 GB wasted storing revision properties
(assuming 4 KB file system blocks, 1.5M revisions, and circa 140 bytes
of actual revprop content per revision), and another 2.8 GB wasted on the
rev files themselves, unless they've been packed, assuming the last file
system block of each file averages half full.)
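The estimate above can be reproduced with a few lines of arithmetic. This is only a sketch of the back-of-the-envelope numbers quoted in the mail; the block size, revision count, and revprop payload are the assumptions stated there, not measured values:

```python
# Back-of-envelope FSFS external fragmentation estimate, using the
# assumed figures from the mail: 4 KB blocks, 1.5M revisions,
# ~140 bytes of real revprop content per revision.
BLOCK = 4096            # file system block size in bytes (assumed)
REVISIONS = 1_500_000   # number of revisions (assumed)
REVPROP_BYTES = 140     # average revprop payload per revision (assumed)

GIB = 1024 ** 3

# Each revprop file occupies a whole block; the unused remainder is waste.
revprop_waste = REVISIONS * (BLOCK - REVPROP_BYTES)

# Each unpacked rev file wastes, on average, half of its last block.
rev_waste = REVISIONS * BLOCK // 2

print(f"revprop waste ~ {revprop_waste / GIB:.1f} GiB")  # ~ 5.5 GiB
print(f"rev file waste ~ {rev_waste / GIB:.1f} GiB")     # ~ 2.9 GiB
```

The two figures land close to the 5.5 GB and 2.8 GB quoted above, well under the roughly 100 GB gap being discussed.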

> Just as a sanity check, this is what I was trying to accomplish:
> Scenario - The repo needs to get trimmed down from 150 GB to a more
> maintainable size. We have a collection of users who access the repository
> as a document revision control system. Many files have been added and
> deleted over the past 2 years and all these transactions have caused such an
> astronomical growth in the physical size of the repo. My mission is to solve
> this issue preferably using subversion best practices. There are certain
> locations in the repo that do not have to retain version history and others
> that must retain their version history.
> Proposed solution -
> Take a full dump of the repo
> run a svnadmin dumpfilter including the paths that need to have version
> history preserved into a single filtered.dump file
> export the top revision of the files that do not require version history to
> be preserved
> create a new repo and load the filtered repo
> import the content from the svn export to complete the process
> Is this a sane approach to solving the problem? And what about the size
> difference between the dump file and the original repo - am I losing
> revisions? (The dump shows all revision numbers being written to the dump
> file, and this looks correct.)

Any files that were created outside of the repository portion being
selected by dumpfilter, but later copied into said portion, will cause
you problems. I ran into that when trying to split up one of my larger
repositories. At the time, I gave up and moved on to more productive
endeavors. Looking back on it: perhaps I could have made a series of
incremental dumps, the divisions being at the points in history where
the problematic renames took place. I then could have filtered each of
the dumps separately and loaded them into a fresh repository.
Sounds fiddly.
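For what it's worth, that fiddly idea would look something like the sketch below. The revision split point (here r1000) and the paths are placeholders - you'd have to locate the actual revisions where the problematic copies happened, e.g. with svn log -v:

```shell
# Dump the history in pieces, split at a (hypothetical) problematic
# copy around r1000. --incremental makes the later dumps loadable on
# top of the earlier ones.
svnadmin dump /path/to/repo -r 0:999 --incremental > part1.dump
svnadmin dump /path/to/repo -r 1000:HEAD --incremental > part2.dump

# Filter each piece separately (path is a placeholder).
svndumpfilter include trunk/keep-me < part1.dump > part1-filtered.dump
svndumpfilter include trunk/keep-me < part2.dump > part2-filtered.dump

# Load the filtered pieces, in order, into a fresh repository.
svnadmin create /path/to/new-repo
svnadmin load /path/to/new-repo < part1-filtered.dump
svnadmin load /path/to/new-repo < part2-filtered.dump
```

This doesn't by itself solve the cross-boundary copy problem - the split only helps if you can filter the pieces on either side of the copy differently, or fix up the offending nodes in the dump files by hand.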

> Another aspect could also be that there are unused log files occupying disk
> space (we are not using Berkeley DB, though) - is this a valid assumption to
> make when using the FSFS configuration?

FSFS does not write logs.

> Thanks so much to all who have responded to this mail, and all of you who
> take the time and read these messages
> Justin
Received on 2010-02-16 20:30:24 CET

This is an archived mail posted to the Subversion Users mailing list.
