
Re: Complex tag creation performance

From: Robbie Gibson <robbiexgibson_at_gmail.com>
Date: 2006-06-27 16:47:18 CEST

On 27/06/06, Robbie Gibson <robbiexgibson@gmail.com> wrote:
> Hi,
> We have a simple SVN setup where everybody works in the trunk. When we
> want to put our work into the production system we do a selective update of
> the files concerned - that is, we work on a file-by-file basis and hardly
> ever do a global "svn up" on the production systems. To keep track of
> production versions we create a complex tag with svn cp WC URL. This works
> fine, except that after a few months we started noticing a slowdown in the
> tag creation.
> Investigations showed that the "svn cp" was checking the version of every
> file to find the oldest revision and then using that as the basis for the
> tag, plus changes since. As we never do a global update and some files
> seldom change, the oldest revision could be 6 months ago - even if every
> file is completely up to date!
> To get over the problem I have written a small script which "updates"
> files with no pending changes to the latest version (i.e. my script
> updates the metadata for up-to-date files and directories). It's a bit slow
> but it seems to help. I have three questions:
> 1) Is this behaviour by design, or is it a bug?
> 2) Does anybody have a better suggestion for how to resolve my problem?
> Thanks,
> R
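The refresh script itself isn't shown in the thread; a minimal sketch of the idea, assuming a POSIX shell and that `svn status -v` prints a blank first column for items with no local modifications (the working-copy path and variable names are illustrative, not from the original mail):

```shell
#!/bin/sh
# Hypothetical sketch of the metadata-refresh script described above.
# Find versioned items with no pending changes (first status column is
# a space) and "svn update" each one individually, so its recorded
# revision advances to HEAD without touching its contents.
WC="."   # illustrative working-copy path
svn status -v "$WC" \
  | awk 'substr($0, 1, 1) == " " { print $NF }' \
  | while read -r path; do
      svn update -N "$path"   # -N: non-recursive (see question 3 below)
    done
```

Note this per-item loop is inherently slow, which matches the "bit slow" behaviour reported, and the `read`-based loop would mishandle paths containing spaces.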

Forgot the third question!
3) In order for my script to have any effect, it needs to update the
meta-data for directories as well as files. But updating a directory always
seems to update the files within it, even if the -N option is specified. Is
there any way round this?
Received on Tue Jun 27 16:49:53 2006

This is an archived mail posted to the Subversion Users mailing list.
