I'm deploying Subversion in an environment where all work will be done
within a single repository and nearly all files will be binary files
10s of MB in size (graphics files mostly).
I'm wondering if there are any potential problems with having a very
large repository, say tens or possibly even hundreds of GB. I'm
thinking of things like performance and database corruption. The OS
should handle files up to 16TB. I've noticed that Berkeley DB doesn't
split out the database files even when they get very large. Is it
possible that things will slow down over time, or that the database
files will become "fragmented", requiring a lot of seeking?
Is there any problem I might be overlooking in using Subversion to
control binary-only data? Obviously, I don't plan to do any merging of
files. Rather, I just want to be able to version project folders and
generally have more structure, collaboration, and control.
This will all be on a Windows 2003 Server machine (no choice!). Given
that I'm using the latest stable versions of everything (svn,
apache2+mod_auth_sspi), does anyone foresee problems using this OS in
a production environment?
Received on Tue Mar 22 15:46:11 2005