On Tue, Apr 23, 2002 at 11:12:03PM -0500, Kirby C. Bohling wrote:
>Karl Fogel wrote:
>>Greg Stein wrote:
>...
>>>Re: multiple files (e.g. one metadata plus one or more content). Bleck. That
>>>makes it really hard to deal with the thing. For example, the above command
>>>line just piped it all through gzip. If you had multiple files, then you
>>>couldn't do that.
>>
>> I don't see anything wrong with multiple files, as long as they are
>> packaged up in one blob for transport, and unpack to a single tree.
>> (At least, the objections above don't seem to apply to that kind of
>> tar file.)
Historically, multiple files for a single target have always been a problem. I
recall, back in my early Mac days, having to get special software to bundle up
the data and resource forks just so I could FTP something around. Today, we're
seeing similar problems with Mac bundles.
The simplest statement is that many tools work best with a single file
rather than groups of files.
Kirby demonstrates this extremely well with some of his examples:
> If you don't deal with multiple files please ensure
>
> svnadmin dump ~/repos/test | ssh mirror "svnadmin restore ~/repos/test -"
>
> and
>
> svnadmin dump ~/repos/test | split -b2000000 - "testDump." ;\
> cat testDump.* | svnadmin restore ~/repos/test -
>
> work. Otherwise people with larger repositories will have problems on
> 32-bit filesystems. As a specific example, Oracle's export/import tool
> *can't* read from stdin, because of the file format, I presume. I have
> run across other tools from time to time that don't deal well with the
> 2GB limit.
Right. By producing a single (stdout) stream, we can run it through split,
pipe it over the network, compress it, and so on. There are a zillion
pipelines that can be set up.
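To make that concrete, here is a rough sketch of the kinds of pipelines a
single stream makes possible (the host "mirror", the paths, and the
"svnadmin restore" spelling are just placeholders carried over from Kirby's
examples):

    # compress the dump on the fly, without ever writing an uncompressed file
    svnadmin dump ~/repos/test | gzip -c > test.dump.gz

    # mirror a repository over the network with no intermediate file at all
    svnadmin dump ~/repos/test | gzip -c | \
        ssh mirror "gzip -dc | svnadmin restore ~/repos/test -"

    # chop the stream into pieces that fit on a 32-bit filesystem
    svnadmin dump ~/repos/test | split -b2000000 - testDump.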
Multiple files? Nope. Gotta wait for the program to finish before you can go
in and see what files were dropped. Oh, and if some of those could be larger
than 2GB, then I guess we'd need to add controls so that the file sizes could
be limited, so actually some of the files would be broken down into further
bits, and oh... I don't have the disk space, so I have to use compression, so
I guess we link in zlib and add an option to compress on writing, but wait,
my next program then needs to decompress, and hey... I didn't even want it on
disk since I'm just spooling it off to my synced backup, but that portion
will have to wait for completion, and ...
I hope I've made my point :-)
> That might be a constraint on the style and management of a single-file
> format. Given that svn does lots of stuff in a streamy way, I would assume
> it would do this too. But just to put in the explicit request.
We can definitely be streamy. And note that a postprocess program could
easily consume the stream and break it down into multiple, structured files
should somebody desire that.
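For example, a tiny post-processing filter could split the stream into one
file per revision. This is only a sketch, and it assumes each revision record
in the dump stream is introduced by a "Revision-number:" header line; the
real format may end up looking different:

    # hypothetical splitter: writes rev-<N>.dump for each revision record,
    # keying off an assumed "Revision-number: N" line
    svnadmin dump ~/repos/test | awk '
        /^Revision-number: / { out = "rev-" $2 ".dump" }
        out != ""            { print > out }
    '

Anybody who really wants a tree of files can build exactly the layout they
want that way, without the dump tool itself having to know anything about it.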
Cheers,
-g
--
Greg Stein, http://www.lyra.org/