
Re: New file svn-restore-dumps.py

From: Martin Furter <mf_at_rola.ch>
Date: Wed, 11 Mar 2009 17:04:34 +0100 (CET)

On Wed, 11 Mar 2009, Mark Stead wrote:

> 2009/3/10 Arfrever Frehtes Taifersar Arahesis <arfrever.fta_at_gmail.com>
>
>> -        print "Ignoring dump file '%s' - this is for another repos" % filename
>> +        print("Ignoring dump file '%s' - this is for another repos" % filename)
>>
>
> Yes, best to make it consistent.

Nice tool, though it has a few rough edges.

I didn't want to wait a whole night just to test it, so I replaced the
"svnadmin load repospath" command with "wc -l". But that led to the
following error on Solaris:

IOError: [Errno 11] Resource temporarily unavailable

So I just commented out the whole command to find the following:

1) It is a bit too verbose: I got tons of "Ignoring dump file..."
messages, especially for dumps belonging to other repositories. It would
be nice if those weren't printed unless a new '-v' option is specified.

2) I got an endless loop: since nothing is loaded into the repository,
the HEAD revision never increases, and the script scans the directory
over and over until it no longer finds any matching dump files. It would
be better to scan the directory once and store the dump files found in a
list. The HEAD revision of the repository also needs to be read only
once, at the start of the script: if svnadmin fails the script will
detect it, and if the dumps are labeled wrongly bad things happen anyway.
So I propose to change it this way:
  - Read HEAD revision of the repos.
  - Scan for dump files to load and put them into a list.
  - Load all dump files.
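A minimal sketch of that restructuring, so HEAD is read once and the
directory is scanned once (the dump filename pattern
"<repos>.<fromrev>-<torev>.dump" and the function names here are my
assumptions, not necessarily what svn-restore-dumps.py actually uses):

```python
import os
import re
import subprocess

def collect_dumps(filenames, repos_name, head, verbose=False):
    """Return the dump files that continue the repository from HEAD,
    in load order.  Assumed name pattern: <repos>.<fromrev>-<torev>.dump"""
    pattern = re.compile(r"^%s\.(\d+)-(\d+)\.dump$" % re.escape(repos_name))
    matches = []
    for fn in filenames:
        m = pattern.match(fn)
        if not m:
            # Only mention ignored files when '-v' was given.
            if verbose:
                print("Ignoring dump file '%s' - this is for another repos" % fn)
            continue
        matches.append((int(m.group(1)), int(m.group(2)), fn))
    matches.sort()  # numeric sort by start revision, not lexical filename sort
    result = []
    for fromrev, torev, fn in matches:
        # Each dump must start exactly at HEAD + 1; anything else is skipped.
        if fromrev == head + 1:
            result.append(fn)
            head = torev
    return result

def restore_dumps(repos_path, dump_dir, repos_name, verbose=False):
    # Step 1: read the HEAD revision of the repos, once.
    head = int(subprocess.check_output(
        ["svnlook", "youngest", repos_path]).strip())
    # Step 2: scan the directory once and build the list of dumps to load.
    dumps = collect_dumps(os.listdir(dump_dir), repos_name, head, verbose)
    # Step 3: load all dump files in order.
    for fn in dumps:
        with open(os.path.join(dump_dir, fn), "rb") as f:
            subprocess.check_call(["svnadmin", "load", repos_path], stdin=f)
```

Sorting on the parsed revision numbers instead of the raw filenames also
avoids surprises once revision numbers reach more digits (e.g.
"repo.10-11.dump" sorting before "repo.2-3.dump").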

Other than that it looks good. I like the script.

Martin

------------------------------------------------------
http://subversion.tigris.org/ds/viewMessage.do?dsForumId=462&dsMessageId=1308451
Received on 2009-03-11 17:05:00 CET

This is an archived mail posted to the Subversion Dev mailing list.
