
Re: Trying to restore a corrupted repo

From: Keith Johnson <k33f3r_at_gmail.com>
Date: Wed, 25 Feb 2015 13:14:56 -0600

On Wed, Feb 25, 2015 at 12:33 PM, Andreas Stieger <andreas.stieger_at_gmx.de>
wrote:

> Hi,
>
> On 25 February 2015 18:48:37 CET, Keith Johnson <k33f3r_at_gmail.com>
> wrote:
>
> >When having a fourth person do a checkout recently, the process (via
> >TortoiseSVN) bombed out with a path to a revs file (db/revs/0/586 or
> >something) and an input/output error.
> >
> >It became evident very quickly that this was a result of bad sectors,
> >and maybe 6 total files were corrupt. I had backups for all but one of
> >them (r772). It later became evident that even my backup for one of
> >them (r390) was corrupt. Copied everything to a new drive, and
> >attempted to start putting everything back together.
> >
> >The normal process for trying to salvage these situations is to dump
> >while skipping over the bad revisions, such as:
> >
> >svnadmin dump /svn -r 1:389 > dump_0001_0389
> >svnadmin dump /svn -r 391:771 --incremental > dump_0391_0771
> >svnadmin dump /svn -r 773:head --incremental > dump_0773_head
> >
> >The problem is that the second command fails because 390 is fubar.
> >(The gist is that I think 390 got truncated somehow, because common
> >error messages are things like "lacks trailing newline" or "node id
> >missing" - forgive me, I'm not directly at the computer at the
> >moment.) In all my searching and reading the past few days, I've
> >never really encountered anyone complaining that this process
> >wouldn't work; I guess that's why I'm getting pretty confounded.
> >
> >As further weirdness, if I leave out the --incremental flag, the dump
> >will actually work (and produce a 64G or so file), and complain about
> >r390 at the very end. The problem, as you might expect, is that
> >svnadmin load won't be able to load it, because the dump wants to
> >create everything again at the first revision it contains, and
> >obviously that's useless (it bombs out immediately because some item
> >already exists).
> >
> >The original server in question was on Ubuntu 12.04, which was running
> >1.6.17 (? definitely some version of 1.6). The new disk I made was
> >with 14.04, which runs 1.8 something. The problems seem to happen
> >with both versions of svnadmin.
> >
> >Also, please spare me the backup lecture; believe me, I know. I'm
> >just a programmer trying to clean it up now.
> >
> >If anyone has seen anything like this before or has any suggestions
> >for getting around any of this, that would be great. I would love to
> >be at the point where I could just get some valid dumps and then do
> >what I can to recreate the missing revs, but I can't even get past
> >the dump stage, which is exceedingly frustrating.
>
> Make a backup of all existing working copies including the pristine
> content cache under ".svn".
>
> When I last recovered a zeroed-out block for someone, I recreated the
> broken revision N by committing an identical change into a repository
> with 1..N-1 loaded, using content from working copies and partial
> backups. The remaining incremental dumps then applied cleanly. The
> fixed rev file could be dropped back into the production area as it
> was.
>

Hi Andreas, thanks for the response.

The revision in question is over a year old, so I'm not sure I can put it
back exactly as it was (guess all I can do is try my best). I assume
there's no way to actually get historical data from pristine - that's just
a cache of current documents, correct?
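
(For what it's worth, my understanding is that the pristine store under
".svn/pristine" only holds the checked-out BASE text of each file, keyed
by SHA-1 checksum, so no history - but it is at least a way to pull
known-good file content out of the old working copies. A rough sketch of
how to find the pristine file for a given path, assuming a 1.7+ working
copy, sqlite3 on hand, and a made-up path "trunk/foo.c":

# look up the SHA-1 the working copy recorded for the file
sqlite3 .svn/wc.db \
  "SELECT checksum FROM NODES WHERE local_relpath = 'trunk/foo.c';"
# prints something like $sha1$9f2c...; the pristine text should then be at
# .svn/pristine/9f/9f2c....svn-base

Untested, so take it with a grain of salt.)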

Basically what you're saying is: rebuild the repository up to the last
good revision, commit a close-as-possible replacement for r390, drop that
rev file back into the copy of the crashed repo, then dump onward from
there and load it all back in? Sounds like a reasonable thing to try.
Will report back later tonight.
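
For my own notes, here's roughly what I think that looks like - paths are
mine, the scratch locations are made up, and I haven't run any of this
yet:

# load the good history up to r389 into a scratch repository
svnadmin create /tmp/rebuild
svnadmin load /tmp/rebuild < dump_0001_0389

# check out, rebuild the r390 change as closely as possible from the old
# working copies / partial backups, then commit it as the new r390
svn checkout file:///tmp/rebuild /tmp/wc
# ...copy (and "svn add") the reconstructed files into /tmp/wc...
svn commit -m "best-effort reconstruction of r390" /tmp/wc

# drop the rebuilt rev and revprop files into the copy of the crashed repo
cp /tmp/rebuild/db/revs/0/390 /svn/db/revs/0/390
cp /tmp/rebuild/db/revprops/0/390 /svn/db/revprops/0/390
svnadmin verify -r 390 /svn

# if that verifies, the later ranges should dump/load incrementally again
svnadmin dump /svn -r 391:771 --incremental > dump_0391_0771

If the drop-in doesn't verify, I suppose the fallback is to keep the
rebuilt repository and load the later incremental dumps into that
instead.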

keith

> Andreas
>
>
Received on 2015-02-25 20:15:24 CET
