
Re: Best Practices: Performance on Large Repositories?

From: Eric Gillespie <epg_at_pretzelnet.org>
Date: 2005-03-16 01:28:49 CET

Steve Seremeth <subversion@seremeth.com> writes:

> We have a repository of about 11000 files totalling about 1200 MB.

The repository is perfectly happy with data sets far larger than
that. The issue is with large working copies. Crawling a
working copy is very expensive.

> A commit or a status done at the trunk dir level takes as much
> as 30 minutes whether using the svn command line (Windows XP or
> AIX)

We have working copies of 70,000 files, and crawling them "only"
takes 10-15 minutes. Maybe the difference comes down to operating
systems (Linux, NetBSD, and FreeBSD here).

> What is the best way to improve performance?

Don't let svn crawl your working copy; tell it exactly what to
commit ('svn commit foo/bar/baz.c' instead of a bare 'svn commit').
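For example (the file and directory names below are made up; substitute
paths from your own tree):

```shell
# A bare 'svn commit' or 'svn status' walks the entire working copy
# looking for modifications, which is slow on tens of thousands of files.
# Naming the changed files, or a small subtree, skips that crawl:
svn commit -m "Fix null check" foo/bar/baz.c

# The same idea applies to status: restrict it to the subtree you
# actually care about rather than running it from the trunk root.
svn status foo/bar
```

You can pass several targets to one commit if the change spans files;
anything you don't name is simply not examined.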

--
Eric Gillespie <*> epg@pretzelnet.org

Received on Wed Mar 16 01:31:14 2005

This is an archived mail posted to the Subversion Users mailing list.
