
Re: [Subclipse-users] 0.9.106: File lost after update

From: Mark Phippard <markp_at_softlanding.com>
Date: 2006-02-25 21:21:10 CET

Mark Phippard <markp@softlanding.com> wrote on 02/25/2006 12:29:10 PM:

> > well, I am *quite* confident that no one deleted the file in between like
> > you describe. We are working in a team of 3, and at the time in question
> > there were only 2.
>
> In my previous reply I said that I doubted that is what happened, just that
> the behavior would be consistent with that scenario. You can waste time
> being right, or you can provide some information to help. I still need to
> see the Console output from when the original update occurred and the
> output of Show in Resource History for the parent folder would also be
> relevant.

I think I have figured this out. I do not have the greatest setup at home
right now, so I may not be able to confirm it until Monday. I think this
problem can happen quite easily when doing an update from the Synchronize
view. The catch is that any fix is going to involve some other tradeoffs.

Here is the scenario: when you do Update from the Synch view, we try to
update to the revision that is shown in the view, as opposed to HEAD. We
could just use HEAD, but then in theory you could spend a lot of time
working out conflicts only to do an Update and get a different set of
changes than what you were looking at.

The second issue was performance. At one point we were doing a series of
individual updates. This is slower than just doing a recursive update of
a folder, so we found a way to do the latter. The other issue with
individual updates was that the folders themselves never get updated, which
causes problems later if you want to refactor code or set a folder
property.
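
As a rough illustration (the file names here are made up), updating
individual files moves those files forward but leaves the parent
directory's recorded revision where it was, which is what later gets in
the way of renames or folder properties:

svn update Foo.java Bar.java   # the two files move to the newer revision
svn info .                     # ...but the directory itself is still recorded at
                               # the old revision, so a later rename or property
                               # change on it can fail as out of date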

So what I think is happening is that you do a Synchronize and it has a
mixture of Incoming and Outgoing changes. Let's say the highest incoming
revision is 100. You then do a commit first, which updates the committed
files in your WC to revision 101. You then select a root folder and do an
update. Our code sees revision 100 as the highest revision, so we do:

update -r 100 /SomeFolder

The problem is that this command will update everything to revision 100,
so the files that were just committed are put BACK to revision 100.
Personally, I think it is incorrect for update to do this; it ought to
skip over the revision 101 files and force you to use the switch command
if you really want to put everything at revision 100. Regardless, there
is nothing we can do about the fact that it works this way.

This should be easy to test/verify if someone wants to, but I am fairly
certain that this is what is happening.
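
Something along these lines should reproduce it from the command line (the
working copy path and file name are just examples):

cd /path/to/wc                 # working copy entirely at revision 100
echo "test" > NewFile.java
svn add NewFile.java
svn commit -m "add NewFile"    # creates revision 101; NewFile.java is now at 101 in the WC
svn update -r 100 .            # pulls everything back to revision 100...
ls NewFile.java                # ...and the just-committed file is gone from the WC
                               # (it still exists in the repository at revision 101)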

At the moment, I cannot think of a good way to approach fixing this. I
will probably put the code back to updating to the HEAD revision, as I
think that has the fewest negative side effects. Or maybe I will do
something like this: if you select a folder and do Update, I will use
HEAD; if you select a specific file, I will use the revision of the file,
or the highest revision in the set of selected files.
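
In the same shorthand as above, that would mean roughly (folder and file
names are hypothetical):

# folder selected in the Synchronize view -> update to HEAD
update /SomeFolder

# file selected -> update to the revision shown for it (or the highest
# revision among the selected files)
update -r 100 /SomeFolder/Foo.java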

Thanks

Mark

