
Re: Assertion failed and crash with 1.7.1

From: Daniel Shahaf <d.s_at_daniel.shahaf.name>
Date: Thu, 17 Nov 2011 21:02:30 +0200

Johan Corveleyn wrote on Thu, Nov 17, 2011 at 13:55:51 +0100:
> On Thu, Nov 17, 2011 at 1:02 PM, Philip Martin
> <philip.martin_at_wandisco.com> wrote:
> > Attila Nagy <bra_at_fsn.hu> writes:
> >
> >> On 11/16/11 18:40, Philip Martin wrote:
> >>> Attila Nagy <bra_at_fsn.hu> writes:
> >>>
> >>>> I use pysvn for this and basically the code looks like this (in python):
> >>>> def update_perms():
> >>>>      for path in propchg:
> >>>>          proplist = svn.propget('file:permissions', path)
> >>>>          if not os.path.islink(path) and proplist.has_key(path):
> >>>>              set_perms(path, proplist[path])
> >>>> svn.update(walkroot)
> >>>> update_perms()
> >>>>
> >>>> The svn update collects the changed entries (propchg) and update_perms
> >>>> iterates on them and gets their file:permissions property and sets it
> >>>> in the file system.
> >>>>
> >>>> And this is what takes ages (literally), compared to 1.6.
> >>>> Any ideas about what could be done in this topic?
> >>>
> >>> It might be faster to run a recursive propget, which is a single
> >>> transaction, and discard the output if it doesn't match one of the
> >>> changed paths.
> >>>
> >> I will try this. Should this be true even for 10+ million files?
> >
> > It depends on the ratio of changed files to total files.  If there is
> > only one changed file then the single propget will be faster.  If most
> > of the files are changed then the recursive propget will be faster.
>
> Yes, you'll need to test that a bit.
>
> I do something similar here with a script that fetches all the log
> entries for merged revisions (cherrypick merges, which I look up with
> 'svn mergeinfo --show-revs merged'): if the number of merged revisions
> is low relative to their range, I perform a series of individual 'svn
> log' requests. Otherwise, I do a single 'svn log -r<min>:<max>'
> request, parse the output and discard the entries that are not
> relevant. This made my script much faster in most cases.
>
> This has nothing to do with recursive propget (or even with 1.7, I'm
> using this in a 1.5 environment), but I'm just noting the similarity
> of the problem here.

svn log -r N -r M -r P

was implemented a few years ago...
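
For what it's worth, a rough sketch of the two-strategy approach Johan
describes (nothing below is code from the thread: the 1-in-10 cutoff, the
subprocess invocations and the log parsing are illustrative assumptions):

    import re
    import subprocess

    def merged_revisions(source, target):
        # Cherry-picked revisions, as reported by 'svn mergeinfo'.
        out = subprocess.check_output(
            ['svn', 'mergeinfo', '--show-revs', 'merged', source, target])
        return sorted(int(tok.lstrip('r')) for tok in out.split())

    def fetch_logs(url, revs):
        if not revs:
            return []
        lo, hi = min(revs), max(revs)
        # Sparse: one 'svn log' request per merged revision.
        if len(revs) < (hi - lo + 1) // 10:   # arbitrary cutoff; measure first
            return [subprocess.check_output(['svn', 'log', '-r', str(r), url])
                    for r in revs]
        # Dense: fetch the whole range once and discard irrelevant entries.
        out = subprocess.check_output(
            ['svn', 'log', '-r', '%d:%d' % (lo, hi), url])
        wanted, keep = set(revs), []
        # 'svn log' separates entries with lines of 72 dashes; each entry
        # starts with "rNNN | author | date | ...".
        for entry in re.split(r'(?m)^-{72}\n', out):
            m = re.match(r'r(\d+) \|', entry)
            if m and int(m.group(1)) in wanted:
                keep.append(entry)
        return keep

Which strategy wins depends, as Johan says, on how dense the merged
revisions are within their range.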

> --
> Johan
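
And a minimal sketch of Philip's recursive-propget idea, reusing the names
from Attila's script (svn, walkroot, propchg, set_perms); the recurse=True
keyword and the dict-shaped return value match how pysvn's Client.propget
is documented, but treat the details as assumptions:

    import os

    def update_perms_bulk(svn, walkroot, propchg):
        # One recursive propget over the whole tree -- a single query --
        # instead of one propget per changed path.
        all_perms = svn.propget('file:permissions', walkroot, recurse=True)
        changed = set(propchg)
        for path, perms in all_perms.items():
            # Only touch the paths the update actually reported as changed.
            if path in changed and not os.path.islink(path):
                set_perms(path, perms)  # helper from the original script

    update_perms_bulk(svn, walkroot, propchg)

Per Philip's note, whether this beats per-path propgets depends on the
ratio of changed files to the total; with 10+ million files it needs
measuring.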
Received on 2011-11-17 20:03:17 CET
