> -----Original Message-----
> From: Thorsten Schöning [mailto:tschoening_at_am-soft.de]
> Sent: 09 October 2012 10:17
> To: 'users_at_subversion.apache.org'
> Subject: Re: how large data will affect performance?
>
> Good day wang.yu,
> on Tuesday, 9 October 2012 at 03:35 you wrote:
>
> > I have an SVN server on Windows 2003.
> > Now the developers want to check in about 10 GB of data to it, and
> > over the next few months they will check in about 100 GB of data.
> > Will that much data affect the server's performance?
>
> This depends heavily on what the data is used for after the commit. If
> it is committed once and never read or updated again, the commit itself
> will consume resources while it is being processed, but afterwards the
> revision only occupies the space it needs on your hard drive and may
> never be accessed again. Maintenance work on the repository itself,
> such as dump and load cycles, is of course affected by the total size.
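
For illustration, a dump/load cycle has to stream every revision in the
repository, so its cost grows with the total history size rather than with
the size of any single commit. A minimal sketch, with hypothetical
repository paths on the Windows server:

    REM Dump the full history of the existing repository to a file.
    svnadmin dump C:\repos\bigrepo > bigrepo.dump

    REM Create a fresh repository and load the dump into it.
    svnadmin create C:\repos\bigrepo-new
    svnadmin load C:\repos\bigrepo-new < bigrepo.dump

With 100+ GB of history, both the dump file and the load time will be
substantial, even if day-to-day reads stay cheap.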
Just a thought, but depending on what the data is, you could consider creating a separate repository for it (and use an svn:externals definition to pull whatever is required into a working copy). That might make maintenance of your existing repository (or repositories) easier in the future.
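
A minimal sketch of that setup, assuming a hypothetical second repository
at http://server/svn/bigdata and a working copy of the main project:

    REM On the server: create a separate repository for the large data.
    svnadmin create C:\repos\bigdata

    REM In a working copy of the main project: pull the data in through
    REM an svn:externals definition (URL and directory name are examples).
    svn propset svn:externals "http://server/svn/bigdata/trunk bigdata" .
    svn commit -m "Reference the large-data repository as an external"
    svn update

After that, "svn update" in the project working copy also fetches the
external, while backups and dump/load cycles of the main repository stay
small.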
~ mark c