
how to deal with HUGE repositories?

From: Subversion Newbie <subversionnewbie_at_yahoo.com>
Date: 2004-11-16 00:16:42 CET

We have a website that currently contains about 94000
files (in 7200 directories), for a total of about 1.6
GB. Our plan is to move all of this into an SVN-based
repository scheme.

Our proposed scheme is to have a "work" area (from
which users check stuff in and out via TortoiseSVN)
and another "web" area which is essentially Apache's
DocumentRoot. Users will check files in and out of the
repository from "work", and a script will periodically
run "svn update" from the same repository into the
"web" area so that the updated files are visible to
Apache.
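The periodic update step described above might look something like the sketch below. The path and log file are hypothetical placeholders, not anything from the original setup; it defaults to a dry run that only prints the command, so it can be exercised without a Subversion install.

```shell
#!/bin/sh
# Sketch of the periodic deploy script: update the "web" working copy
# (Apache's DocumentRoot) from the repository.
# WEB_WC is a hypothetical path; adjust to the real DocumentRoot checkout.
WEB_WC="${WEB_WC:-/var/www/docroot}"
DRY_RUN="${DRY_RUN:-1}"   # defaults to dry run for illustration

deploy_update() {
    if [ "$DRY_RUN" = "1" ]; then
        # Print the command instead of running it.
        echo "svn update --quiet $1"
    else
        svn update --quiet "$1"
    fi
}

deploy_update "$WEB_WC"
```

A script like this would typically be driven from cron, e.g. `*/5 * * * * /usr/local/bin/deploy-update.sh` to run every five minutes.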

Currently, the "svn update" into the "web" area takes
far too long, for the obvious reason: the client has
to walk the entire working copy of 94000 files. Our
goal is for "svn update" to take no more than two or
three seconds, so through experimentation we've broken
up the big tree of 94000 files / 7200 directories into
a number of subtrees, and created a separate
repository for each one. This works in the sense that
the "svn update" for each subtree is fast enough, but
it's getting unwieldy as we sub-divide more and more
of the directories that contain too many files and
subdirectories.
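With one repository per subtree, the deploy script has to update each per-subtree working copy in turn. A minimal sketch, assuming the checkouts live side by side under a hypothetical /var/www/docroot, is to loop over them and run the updates in parallel so the total wall-clock time is closer to the slowest subtree rather than the sum of all of them:

```shell
#!/bin/sh
# Sketch: update every per-subtree working copy under a common root.
# DOCROOT and its children are hypothetical paths for illustration.
DOCROOT="${DOCROOT:-/var/www/docroot}"
DRY_RUN="${DRY_RUN:-1}"   # defaults to dry run for illustration

update_subtree() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "svn update --quiet $1"
    else
        svn update --quiet "$1"
    fi
}

# Run the per-subtree updates concurrently, then wait for all of them.
for wc in "$DOCROOT"/*/; do
    update_subtree "$wc" &
done
wait
```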

My question is, does this scheme make any sense? Is
there a better way to implement what appear to be
"sub-repositories" of one big collection of files in
such a way that the response time for "svn update" is
as fast as possible? Thanks for your opinions.

To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Tue Nov 16 00:17:09 2004

This is an archived mail posted to the Subversion Users mailing list.

This site is subject to the Apache Privacy Policy and the Apache Public Forum Archive Policy.