
RE: how large data will affect performance?

From: Cooke, Mark <mark.cooke_at_siemens.com>
Date: Tue, 9 Oct 2012 10:21:15 +0100

> -----Original Message-----
> From: Thorsten Schöning [mailto:tschoening_at_am-soft.de]
> Sent: 09 October 2012 10:17
> To: 'users_at_subversion.apache.org'
> Subject: Re: how large data will affect performance?
>
> Guten Tag wang.yu,
> am Dienstag, 9. Oktober 2012 um 03:35 schrieben Sie:
>
> > I have an SVN server on Windows 2003.
> > Now the developers want to check in about 10 GB of data, and over
> > the next few months they will check in about 100 GB in total.
> > Will that much data affect the server's performance?
>
> This depends heavily on what the data is used for after the commit. If
> it is committed once and never read or updated again, the commit itself
> will consume resources while it is being processed, but afterwards the
> revision only occupies the space it needs on your hard drive and may
> never be accessed again. Maintenance work on the repository itself,
> such as dump and load cycles, is of course affected.
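(For reference, the dump/load maintenance mentioned above is done with svnadmin; a minimal sketch, with placeholder repository paths:)

```shell
# Dump the entire repository history to a portable file
# (/var/svn/myrepo is a placeholder path)
svnadmin dump /var/svn/myrepo > myrepo.dump

# Recreate the repository from that dump; with very large repositories
# both steps take time roughly proportional to the total history size
svnadmin create /var/svn/myrepo-new
svnadmin load /var/svn/myrepo-new < myrepo.dump
```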

Just a thought, but... depending on what the data is, you could consider creating another repository for that data (and use an svn:external to pull whatever is required into a working copy). That might make maintenance of your existing repository (or repositories) easier in the future.
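(As a sketch of that setup; the repository URLs and paths below are hypothetical:)

```shell
# Create a separate repository just for the bulky data
svnadmin create /var/svn/bigdata
svn import ./large-assets file:///var/svn/bigdata/trunk \
    -m "Initial import of large data"

# In a working copy of the main project, pull the data in
# via an svn:externals definition (local dir, then source URL)
svn propset svn:externals "bigdata http://svn.example.com/bigdata/trunk" .
svn commit -m "Reference large-data repository via svn:externals"
svn update   # fetches the external into the working copy
```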

~ mark c
Received on 2012-10-09 11:21:55 CEST

This is an archived mail posted to the Subversion Users mailing list.
