On Thu, Aug 29, 2019 at 4:21 PM Nathan Hartman <hartman.nathan_at_gmail.com> wrote:
> On Thu, Aug 29, 2019 at 3:15 PM Michael Ditum <mike_at_mikeditum.co.uk> wrote:
>> Thanks for the response, I hadn't tried it! I've just given that a go and
>> unfortunately the dump command failed with...
>> [mike_at_tigger svn]$ svnadmin dump svnroot > svnroot.dump
>> * Dumped revision 1.
>> [...]
>> * Dumped revision 24950.
>> * Dumped revision 24951.
>> * Dumped revision 24952.
>> svnadmin: E140001: zlib (uncompress): corrupt data: Decompression of
>> svndiff data failed
> So it fails on the same revision.
> I need to think about this some more.
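Before doing anything drastic, it may help to confirm exactly which revisions are affected. Here's a minimal sketch, assuming the repository lives at ./svnroot (adjust the path and revision range to suit); it dumps one revision at a time and prints the numbers of any that fail:

```shell
# Probe a range of revisions one at a time and print the ones that
# cannot be dumped (e.g. because of the zlib decompression error).
find_bad_revs() {
    repo=$1; first=$2; last=$3
    r=$first
    while [ "$r" -le "$last" ]; do
        # Dump a single revision; discard the data, keep the verdict.
        if ! svnadmin dump -r "$r" --incremental "$repo" >/dev/null 2>&1; then
            echo "$r"
        fi
        r=$((r + 1))
    done
}
```

Something like `find_bad_revs svnroot 24950 24960` would then list every unreadable revision in that range; `svnadmin verify -r` could be used the same way if you'd rather not produce any dump data at all.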
Here's an idea, but with the caveat that I have never tried this myself, so I
don't know whether it works or how well. A Google search turns up a script
called svn2svn.
It looks like this might be the original author, last updated 7 years ago:
And it looks like this might be a newer version, forked by a different
author and with the last updates dated 2016:
The idea is to automatically check out each revision from the old
repository in sequence and commit it to a new repository, with the added
twist of skipping the revisions that fail to check out properly because of
the decompression error. Perhaps there's a way to commit a "placeholder" to
the new repository in those cases, so that all of your revision numbers
will remain identical after migration to the new server.
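To make the idea concrete, here is a minimal sketch of the loop such a tool automates -- to be clear, this is not the svn2svn script itself, just an illustration. It assumes the old and new repositories are passed in as URLs (file:// works locally), and it glosses over deleted files, properties, and author/date metadata, which a real migration tool would need to handle. The "placeholders" path is made up for illustration:

```shell
# Illustration only: replay each readable revision of $old into $new,
# committing a placeholder for any revision that cannot be read, so the
# revision numbers in the new repository stay aligned with the old one.
replay_revisions() {
    old=$1; new=$2; first=$3; last=$4
    svn checkout -q "$new" wc || return 1
    r=$first
    while [ "$r" -le "$last" ]; do
        if svn export -q --force -r "$r" "$old" wc 2>/dev/null; then
            # Revision is readable: overlay it on the working copy
            # and commit it (deletions are NOT propagated here).
            ( cd wc && svn add -q --force . && svn commit -q -m "replay of r$r" )
        else
            # Revision cannot be read (e.g. the zlib error above):
            # commit a placeholder directly against the new repository.
            svn mkdir -q --parents -m "placeholder for corrupt r$r" \
                "$new/placeholders/r$r"
        fi
        r=$((r + 1))
    done
}
```

Each unreadable revision then shows up in the new history as an explicit placeholder commit rather than silently shifting every later revision number down.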
Assuming this works -- again, I've never done this! -- I like this idea
because it avoids doing delicate surgery on dumpfiles and the like, and
because you would retain history and not lose information, apart from the
revisions that you've never been able to check out anyway.
One issue I see is that the newer "tonyduckles" version says it requires
Subversion 1.6 at minimum.
Maybe try to contact the author(s) of svn2svn and ask some questions?
Received on 2019-08-29 22:37:34 CEST