Olivier Sannier wrote:
...
> The bad thing is that it's a hard drive killer, because source files are
> usually small and under the cluster size (FAT or NTFS file system; I'm
> on Windows).
...
> One big file with everything in it, instead of a whole lot of separate
> files. Basically, the compression would be "hooked" into the middle of
> the process, remaining somewhat transparent to the other APIs.
> Sure, there is some additional load and delay induced by this, but I
> think it would be hardly noticeable in everyday use.
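To get a feel for how much of that is really cluster slack rather than
data, here is a rough Python sketch you can run over a working copy.
The 4096-byte cluster size is just an assumption (check what your
volume actually uses), and it ignores that NTFS keeps very small files
resident in the MFT:

    import os
    import sys

    CLUSTER = 4096  # assumed cluster size; FAT clusters are often larger

    def slack(root):
        data = disk = 0
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                try:
                    size = os.path.getsize(os.path.join(dirpath, name))
                except OSError:
                    continue  # file vanished or is unreadable; skip it
                data += size
                # each file occupies a whole number of clusters on disk
                disk += -(-size // CLUSTER) * CLUSTER
        return data, disk

    data, disk = slack(sys.argv[1] if len(sys.argv) > 1 else ".")
    print("file data: %d bytes" % data)
    print("allocated: %d bytes" % disk)
    print("slack:     %d bytes" % (disk - data))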
I believe efficient storage of data is the job of the file system,
not the job of each and every application.
ReiserFS, for example, does this extremely well: it packs the tails of
small files together so each one does not waste a whole cluster.
Is a good file system like ReiserFS really not available under Windows?
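To illustrate what tail packing buys, a toy comparison with
hypothetical file sizes, again assuming 4096-byte clusters:

    CLUSTER = 4096
    sizes = [120, 800, 1500, 2600, 5000]  # hypothetical small source files, in bytes

    # FAT/NTFS style: every file rounds up to whole clusters
    cluster_granular = sum(-(-s // CLUSTER) * CLUSTER for s in sizes)
    # idealised ReiserFS-style tail packing: essentially no slack
    tail_packed = sum(sizes)

    print("cluster-granular: %d bytes" % cluster_granular)  # 24576
    print("tail-packed:      %d bytes" % tail_packed)       # 10020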
Even if Subversion compressed the "duplicate" copies in .svn/text-base,
you would still have the normal files in your working copy, and those
would cause exactly the same problem on a poor file system. That again
indicates that the problem should be solved at the file system level,
not at the application level.
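And even in the best case, packing .svn/text-base into one compressed
file removes only one of the two copies of each file; the working files
still pay the full per-file overhead. A back-of-the-envelope bound, all
figures hypothetical:

    CLUSTER = 4096
    N_FILES = 10000   # hypothetical number of versioned files
    AVG_SIZE = 1500   # hypothetical average source file size, in bytes

    per_file = -(-AVG_SIZE // CLUSTER) * CLUSTER  # disk space for one copy
    now = 2 * per_file * N_FILES   # working file + text-base copy
    best = per_file * N_FILES      # text-base packed away entirely (best case)

    print("today:     %.1f MB" % (now / 2.0 ** 20))   # ~78.1 MB
    print("best case: %.1f MB" % (best / 2.0 ** 20))  # ~39.1 MB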
Also, a 160GB disk costs 62 Euros here in Europe, i.e. about 0.39 Euros
per GB. At that price, I find it hard to justify a performance penalty
just to save a few GB.
Carsten.