
Re: revert --force

From: Daniel Becroft <djcbecroft_at_gmail.com>
Date: Fri, 11 Dec 2009 07:18:08 +1000

On Fri, Dec 11, 2009 at 4:17 AM, Tobias Hahn <tobias.hahn_at_ableton.com> wrote:

> Am 10.12.2009 um 18:48 schrieb David Weintraub:
>
> > On Thu, Dec 10, 2009 at 10:52 AM, Tobias Hahn <tobias.hahn_at_ableton.com>
> wrote:
> >> I am frequently unnerved by the fact that revert does not remove
> unversioned files and directories. I understand that protecting them from
> accidental deletion is often desirable, but not always. My suggestion is
> therefore to add a --force option to revert which does a true revert: It
> reverts the working copy without trying to be smart and just erases
> everything that's not in the repository.
> >>
> >> Here's my use case:
> >> - I'm working on a branch.
> >> - I have a path in my working copy which is modified by some automatic
> scripts.
> >> - About once a week, I merge trunk. To do so, I need a clean working
> copy. To get it clean, I have to
> >> - manually delete all these files (slow and error prone) or
> >> - delete the path and checkout again (very slow)
> >> - just revert --force it (wishful thinking).
> >>
> >
> > You do not want your version control system removing files
> > willy-nilly. Imagine if someone has private files that contain data
> > used for testing or configuration. Doing a clean revert would get rid
> > of these files too. And, since they're not versioned, they'll be
> > permanently lost.
>
> I do want to and I know I am not the only one :) I agree it would not be a
> good default, but I truly really honestly do want that behavior :)
>
> > * Have a "clean" target in your build system that gets rid of all
> > build artifacts. I configure my build systems to put all artifacts
> > under a "target" directory. Cleaning up builds is simply a matter of
> > removing this directory.
>
> Well, it's test artifacts and not build artifacts, but like I mentioned
> before, that would need to be planned, tested, etc., and would not be trivial
> to implement.

Meh, that's just a technicality. If they are results produced by a build
script (or test script), are they named such that they can be easily
identified as such, e.g. by timestamp, naming convention, or file type?
It's generally good practice for scripts to be able to clean up after
themselves (i.e. 'ant build' followed by 'ant clean' should leave a WC
identical to the starting WC).
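
Something like the following untested sketch, for example (the file patterns
are made up -- adjust them to however your test outputs are actually named):

    #!/bin/sh
    # clean-test-artifacts.sh -- hypothetical example; edit the patterns to
    # match your own test outputs. Versioned files are left untouched.
    find . -type f \( -name '*.testresult' -o -name 'test-output-*' \) \
        -exec rm -f {} +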

Are these caching files, or are they continually recreated by the testing
script? If they're created once, then reused, what is the cost penalty for
doing that as opposed to creating a fresh working copy?

> > * Use two working copies and have both of them around at all times.
> > Use one for merging and one for working. The merging working copy
> > would always be clean. You simply have to do a "svn update" and then
> > "svn merge".
>
> Yes, I guess that's possible. But I would prefer having a way to tell
> subversion to revert my working copy to a clean state.

And Subversion can do that by only modifying (reverting) the files that are
under its control.
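
For what it's worth, the unversioned (and ignored) leftovers can already be
swept out with a little scripting around 'svn status' -- roughly this
untested sketch (the 'cut' offset assumes 1.6-style status output; drop
--no-ignore if you want ignored files kept):

    svn status --no-ignore | grep '^[?I]' | cut -c9- | \
        while IFS= read -r f; do rm -rf "$f"; done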

> OK, you can shoot yourself in the foot with such an option, but do your
> users also come and complain that rm -rf / just erased all of their work?
> Again, I'm not arguing that it's a bug in subversion or that it should
> always behave that way, but rather that I and maybe some more users would
> appreciate this as a feature. After all, it is a best practice to only merge
> clean working copies,

Agreed, that is the best practice ....

> and so IMHO it is not too absurd if svn supported getting there.
>

Do not agree. The best practice is to merge using a clean working copy, and
the generally accepted way to get one is to keep a separate, clean working
copy. The time taken to create that second working copy is a one-off hit;
after that it's simply an 'update, merge, commit' on the clean WC, followed
by an 'update' on the 'in-use' one.
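
In command terms it would look roughly like this (the paths are placeholders,
and the '^/trunk' shorthand assumes Subversion 1.6):

    cd ~/branch-merge-wc    # the clean, merge-only checkout of the branch
    svn update
    svn merge ^/trunk .     # sync merge from trunk (merge tracking, 1.5+)
    # ... build, test, resolve any conflicts ...
    svn commit -m "Merge trunk into branch"

    cd ~/branch-work-wc     # the day-to-day working copy
    svn update              # pick up the merge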

> > * Use a real operating system and not Windows. Okay, Windows is a real
> > operating system, but it can be slow when copying files to and from a
> > network. Much of this is due to anti-virus programs that have to check
> > each and every file that gets created. Subversion makes a lot of files
> > whenever you do a checkout. It creates a file to track properties, and
> > files to track changes made from the base revision, so each Subversion
> > working directory contains 3 times the number of files you're actually
> > working on. Checkout 100 files, and Subversion creates 300 files. If
> > your anti-virus system is busy checking each of those files, it will
> > go through all 300 of those.
>
> I'm not sure I understand your point here. I am using Mac OSX and we're not
> talking 300 files, but 60000.
>
> Tobias
>
> Ableton AG, Sitz Berlin, Amtsgericht Berlin-Charlottenburg, HRB 72838
> Vorstand: Gerhard Behles, Jan Bohl, Bernd Roggendorf
> Vorsitzender des Aufsichtsrats: Uwe Struck
>

------------------------------------------------------
http://subversion.tigris.org/ds/viewMessage.do?dsForumId=1065&dsMessageId=2429593

Please start new threads on the <users_at_subversion.apache.org> mailing list.
To subscribe to the new list, send an empty e-mail to <users-subscribe_at_subversion.apache.org>.
Received on 2009-12-10 22:19:49 CET
