
Re: "svnadmin load" a huge file

From: Johan Corveleyn <jcorvel_at_gmail.com>
Date: Fri, 7 Jan 2011 21:03:32 +0100

On Fri, Jan 7, 2011 at 8:47 PM, Les Mikesell <lesmikesell_at_gmail.com> wrote:
> On 1/7/2011 1:31 PM, Victor Sudakov wrote:
>
>>>
>>> I don't think you are hitting some absolute limit in the software here,
>>> just running out of RAM on your particular machine.  Can you do the
>>> conversion on a machine with more RAM?
>>
>> I ran "svnadmin load" on a machine with 1 GB RAM and 25 GB swap (I
>> added that much swap specially for the occasion). svnadmin crashed
>> after its SIZE reached about 2.5 GB.
>>
>> Is 1 GB RAM and 25 GB swap not enough?
>
> If it is a 32bit OS, you'll most likely hit a per-process limit at 2 or 4
> gigs.  Or maybe some quota setting before that.
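
Indeed, on a 32-bit OS the process will hit its address-space limit
(typically 2-4 GB) no matter how much swap you add. A quick way to see
whether that is what you are running into is to check whether svnadmin
is a 32-bit binary and what the current per-process limits are; for
example (standard Linux/bash commands, paths may differ on your system):

[[[
# is svnadmin a 32-bit or 64-bit executable?
file "$(which svnadmin)"

# show the current per-process resource limits for this shell
ulimit -a
]]]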

As Stephen Connolly suggested a week ago, I think you should take a
look at svndumptool: http://svn.borg.ch/svndumptool/

I've never used it myself, but its README.txt mentions a "split"
subcommand:

[[[
Split
-----

Splits a dump file into multiple smaller dump files.

svndumptool.py split inputfile [startrev endrev filename]...

options:
  --version   show program's version number and exit
  -h, --help  show this help message and exit

Known bugs:
 * None
]]]
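
So the idea would be to split the big dump into pieces small enough for
your machine and then load them one after the other. I haven't tried
this, and the revision ranges and filenames below are only placeholders,
but going by the usage line above it would look something like:

[[[
# split the big dump into three smaller dumps (example revision ranges)
svndumptool.py split huge.dump 0 10000 part1.dump \
    10001 20000 part2.dump \
    20001 30000 part3.dump

# then load the pieces in order into the same repository
svnadmin load /path/to/repos < part1.dump
svnadmin load /path/to/repos < part2.dump
svnadmin load /path/to/repos < part3.dump
]]]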

HTH

-- 
Johan
Received on 2011-01-07 21:04:28 CET
