
Re: fs dump/restore proposal

From: Greg Stein <gstein_at_lyra.org>
Date: 2002-04-24 02:33:28 CEST

On Tue, Apr 23, 2002 at 07:03:33PM -0400, Mark Benedetto King wrote:
> On Tue, Apr 23, 2002 at 04:43:08PM -0500, Ben Collins-Sussman wrote:
> >
> > For this reason, we're thinking some kind of simple binary format.
>
> I agree.
>
> Is it worth trying to add some amount of metadata to the
> format in an effort to provide for forward-compatibility?

Absolutely. I think all the items should be tagged in some way; I would
suggest RFC822-style headers. In fact, you could also use the "extra newline
separates the header from the body" thing, where the body stores the
fulltext. And note the length of that using a Content-Length header... :-)
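The record layout sketched here, tagged headers, a blank line, then exactly Content-Length bytes of raw fulltext, can be illustrated with a few lines of Python. This is a hypothetical sketch of the proposal, not the actual dump format; the header name "Node-path" is invented for the example.

```python
import io

def write_record(stream, headers, body):
    """Write one dump record: tagged headers, a blank separator line,
    then the raw body bytes, with Content-Length recording their count."""
    for name, value in headers:
        stream.write(f"{name}: {value}\n".encode())
    stream.write(f"Content-Length: {len(body)}\n\n".encode())
    stream.write(body)

def read_record(stream):
    """Read one record back; Content-Length says exactly how many
    body bytes to consume, so the body can be arbitrary binary."""
    headers = {}
    while True:
        line = stream.readline().decode()
        if line in ("\n", ""):
            break
        name, _, value = line.rstrip("\n").partition(": ")
        headers[name] = value
    body = stream.read(int(headers["Content-Length"]))
    return headers, body

# Round-trip a record whose body contains a NUL byte.
buf = io.BytesIO()
write_record(buf, [("Node-path", "trunk/foo.c")], b"binary\x00fulltext")
buf.seek(0)
hdrs, body = read_record(buf)
```

Because the body length is declared up front, the body never needs escaping or encoding, which is exactly what makes this friendlier than a purely line-oriented format.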

Re: binary format. Branko suggested using base64. I'd suggest that we not
bother with the import/export overhead of that, and stick to a simple
length-defined binary format. An external tool can always convert to base64
if that is important. (or convert to XML or whatever...)
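A trivial external filter along those lines (hypothetical, not an svnadmin feature) might be:

```python
import base64

def encode_dump(raw: bytes) -> bytes:
    """Base64-encode a raw length-defined binary dump, so the dump
    tool itself never has to know anything about base64."""
    return base64.b64encode(raw)

raw = b"\x00\x01 raw dump bytes \xff"
encoded = encode_dump(raw)
# The conversion is lossless: decoding recovers the original binary.
assert base64.b64decode(encoded) == raw
```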

Re: avoid diffy. Yes. Fulltexts are nicer than diffs, and I like Zack's
suggestion of using a compressor. But the compression should be relegated to
an external tool; I don't see a need for SVN to actually produce compressed
data. It ought to be something like:

$ svnadmin dump ~/repos/test | gzip -9 > svn-repos-test.gz

Re: multiple files (e.g. one metadata plus one or more content). Bleck. That
makes it really hard to deal with the thing. For example, the above command
line just piped it all through gzip. If you had multiple files, then you
couldn't do that.

Cheers,
-g

-- 
Greg Stein, http://www.lyra.org/
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Wed Apr 24 02:34:12 2002

This is an archived mail posted to the Subversion Dev mailing list.
