On 8/1/06, Philip Martin <philip@codematters.co.uk> wrote:
[snip]
> >
> > I'd be happy to handle it better in the log running code, though, if
> > we can. Or maybe update should just run cleanup automatically or
> > something?
>
> The whole point of cleanup is that it's not supposed to be run
> automatically, something has gone wrong that requires user
> intervention. In this case running cleanup automatically would surely
> hit the same file locked problem?
Well, presumably the update would fail the first time, giving them the
clue to shut down the offending application. Then they would update
again and cleanup would be run automatically, without the user having
to do it by hand. But I agree that's not a great solution -- just
something I threw out. (I think I saw it proposed by someone else a
while ago, actually, but I'm not sure.)
>
> Your solution might be the best way to do it, although if you care
> about the behaviour you should write a regression test as it's easy to
> break corner cases like this.
Yes, I will do that.
>
> Alternatively perhaps you could extend the log file processing so that
> it does all the file handling while writing the log file, then it will
> fail before running the log file.
>
Could you elaborate on this a bit more? Do you mean open and hang
onto the files in add_or_open_file (or around there), and then process
them in close_directory/run_log using the already-open handles rather
than re-opening them? Wouldn't that cause unacceptable memory usage
(or open-file-limit) problems with large directories? It would be
perfect for solving this type of problem, though...
Or were you thinking of something else?
DJ
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Wed Aug 2 04:22:56 2006