On Fri, 2011-05-20, Winston Smith wrote:
> Sorry if this is the wrong list, but I'm curious about one thing:
> Are the SVN developers aware of any quirks in SVN in regards
> to storing a large number (say, 1000) very large binary files
> (say, 1GB each)? So, the entire repository would be 1TB of size,
> but my concern is not space, but rather whether SVN would have
> difficulties (either on the server side or the client side) handling
> such repositories/workspaces. Thanks for your replies.
I can't give a very definite answer, but, assuming your server and
network and client hardware are adequately sized for the task, I would
expect:
* No problem handling many thousands of files in total.
* Putting 1000 or more files all in the same directory can give poor
performance in time and/or space, on both client and server, so avoid
that by spreading the files across a tree of subdirectories.
* Subversion can handle "binary" files as large as 1 GB or even many
GB. There used to be a 2 GB limit when using old versions of Subversion
in certain configurations, but that limit is long gone. I have heard
that the server can be very slow when a new version of a very large file
has a completely different bit pattern from the previous version,
because computing the delta is expensive. If your files don't change
much, or if you only add new files and delete old ones instead of
committing modifications to existing files, that won't be an issue.
* Subversion is designed to process large amounts of data "streamily"
without trying to read it all into RAM at once, so you shouldn't need
excessive amounts of RAM.
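On the second point, here is a minimal sketch of how you might lay the
files out in shards before importing them, so that no single directory
ends up holding 1000+ entries. The function name, paths, and the shard
size of 100 are my own assumptions for illustration, not anything
Subversion requires:

```shell
# shard_files SRC DEST [SHARD_SIZE]
# Copies the files from SRC into DEST/shard-000, DEST/shard-001, ...
# with at most SHARD_SIZE files per shard directory.
shard_files() {
  src=$1; dest=$2; shard_size=${3:-100}
  i=0
  for f in "$src"/*; do
    [ -f "$f" ] || continue                        # skip non-files / empty glob
    dir="$dest/shard-$(printf '%03d' $(( i / shard_size )))"
    mkdir -p "$dir"
    cp "$f" "$dir/"
    i=$(( i + 1 ))
  done
  echo "placed $i files"
}

# Then add and commit the sharded tree as usual, e.g.:
#   shard_files /data/blobs my-working-copy 100
#   svn add --force my-working-copy/shard-*
#   svn commit -m "Import large files in sharded layout" my-working-copy
```

The exact shard size doesn't matter much; the point is just to keep each
directory's entry count well below the thousands.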
I hope that helps. Please let us know what results you get. If you do
run into any problem we'd like to know about it and try to fix it.
Received on 2011-05-23 13:37:19 CEST