Greg Stein wrote:
>>Yes. The way we currently lose revisions during conversion is bad (plus
>>it breaks my tagging/branching code).
>
>
> I don't doubt it :-) In any case, that problem has been fixed. A stupid
> error on my part a while back.
I am not sure that it has been fixed. Commits to the same file by the
same person within COMMIT_THRESHOLD still look like they are getting
combined.
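
To illustrate what I mean, here is a minimal sketch of the behaviour I
am seeing (the class and function names are hypothetical, not the
actual cvs2svn code): because commits are keyed only on the author/log
digest, a second revision of the same file by the same author within
the threshold lands in the same commit object.

    COMMIT_THRESHOLD = 5 * 60   # five minutes

    class Commit:
        def __init__(self):
            self.t_min = None
            self.t_max = None
            self.files = []

        def has_file(self, fname):
            return fname in self.files

        def add(self, timestamp, fname):
            # widen the commit's time window and record the file
            if self.t_min is None or timestamp < self.t_min:
                self.t_min = timestamp
            if self.t_max is None or timestamp > self.t_max:
                self.t_max = timestamp
            self.files.append(fname)

    commits = {}

    def add_change(digest, timestamp, fname):
        # keyed only on the digest, so a second revision of the same
        # file by the same author within the threshold is folded into
        # the same Commit -- that is the combining I am describing
        c = commits.setdefault(digest, Commit())
        c.add(timestamp, fname)

    add_change("mark|fix typo", 1000, "foo.c")
    add_change("mark|fix typo", 1030, "foo.c")  # 30s later, same file
    # commits now holds a single Commit containing foo.c twice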
>>>even though that may not reflect what *really* happened. I would state that
>>>we should issue a warning so that a human can investigate. Ideally, we could
>>>provide controls so that a human could separate an interleaved commit from
>>>the same user/message.
>>
>>For now, I will remove the last condition in this part of the patch:
>>
>> >> # scan for commits to process
>> >> process = [ ]
>> >> - for id, c in commits.items():
>> >> - if c.t_max + COMMIT_THRESHOLD < timestamp:
>> >> + for xid, c in commits.items():
>> >> + if c.t_max + COMMIT_THRESHOLD < timestamp or \
>> >> + c.has_file(fname) or \
>> >> + xid != id:
>
>
> Euh... didn't we just get done stating that c.has_file() is (also) not an
> appropriate condition for this test?
The has_file() condition is an attempt to address the problem described
above.
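
To make the intent of the extra conditions concrete, here is roughly
the flush logic I am after (a sketch only; "digest" stands for the key
of the change currently being added, and the names follow the patch
rather than the real code):

    def scan_for_flush(commits, digest, timestamp, fname):
        process = []
        # iterate over a copy so entries can be deleted while scanning
        for xid, c in list(commits.items()):
            if c.t_max + COMMIT_THRESHOLD < timestamp \
               or c.has_file(fname) \
               or xid != digest:
                # the commit is stale, already touches this file, or
                # belongs to a different author/log -- flush it before
                # the new change is added
                process.append((c.t_max, c))
                del commits[xid]
        return process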
Mark