
Re: revert --force

From: David Weintraub <qazwart_at_gmail.com>
Date: Thu, 10 Dec 2009 12:48:31 -0500

On Thu, Dec 10, 2009 at 10:52 AM, Tobias Hahn <tobias.hahn_at_ableton.com> wrote:
> I am frequently unnerved by the fact that revert does not remove unversioned files and directories. I understand that protecting them from accidental deletion is often desirable, but not always. My suggestion is therefore to add a --force option to revert which does a true revert: It reverts the working copy without trying to be smart and just erases everything that's not in the repository.
>
> Here's my use case:
> - I'm working on a branch.
> - I have a path in my working copy which is modified by some automatic scripts.
> - About once a week, I merge trunk. To do so, I need a clean working copy. To get it clean, I have to
> - manually delete all these files (slow and error prone) or
> - delete the path and checkout again (very slow)
> - just revert --force it (wishful thinking).
>

You do not want your version control system removing files
willy-nilly. Imagine if someone has private files that contain data
used for testing or configuration. Doing a clean revert would get rid
of these files too. And, since they're not versioned, they'll be
permanently lost.

I can imagine dozens of developers coming to my desk wanting to know
how they can get back these files that "Subversion deleted". Since
they're not versioned and not stored in Subversion, I'd have to tell
them that they are lost forever unless they backed them up somewhere
else.

Then I have to listen to them tell me that Subversion is no good, and
that their old version control system (ClearCase, Git, SCCS) was so much
better. It doesn't matter that, to get rid of all of these files, they
had to type:

$ svn revert --get-rid-of-all-nonversioned-files-even-those-I-might-want-to-keep

to do a complete clean. People will run it anyway, and then blame their
mistakes on Subversion, and on me, since I am the Subversion admin.

There are several ways you can handle this:

* Have a "clean" target in your build system that gets rid of all
build artifacts. I configure my build systems to put all artifacts
under a "target" directory. Cleaning up builds is simply a matter of
removing this directory.
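For example, if everything generated really does land under "target" (the
directory name is just my own convention, nothing Subversion-specific), the
pre-merge cleanup boils down to:

$ make clean        # or "ant clean" / "mvn clean", whatever your build provides
$ rm -rf target     # same effect by hand, since all artifacts live under target/
$ svn status        # should now show only genuine local modifications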

* Use two working copies and have both of them around at all times.
Use one for merging and one for working. The merging working copy
would always be clean. You simply have to do a "svn update" and then
"svn merge".

* Use a real operating system and not Windows. Okay, Windows is a real
operating system, but it can be slow when copying files to and from a
network. Much of this is due to anti-virus programs that have to check
each and every file that gets created. Subversion makes a lot of files
whenever you do a checkout. It creates files to track properties, and it
keeps a pristine base copy of each file to diff against, so each
Subversion working copy can contain roughly three times the number of
files you're actually working on. Check out 100 files, and Subversion
creates around 300. If your anti-virus software has to inspect each of
those files, it will churn through all 300 of them.
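If you want to see that overhead for yourself, here is a quick count you
can run in any working copy on a Unix-like shell (it only tallies files,
it changes nothing):

$ find . -type f -not -path '*/.svn/*' | wc -l   # files you actually work on
$ find . -type f -path '*/.svn/*' | wc -l        # bookkeeping files Subversion added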

-- 
David Weintraub
qazwart_at_gmail.com
------------------------------------------------------
http://subversion.tigris.org/ds/viewMessage.do?dsForumId=1065&dsMessageId=2429267
Please start new threads on the <users_at_subversion.apache.org> mailing list.
To subscribe to the new list, send an empty e-mail to <users-subscribe_at_subversion.apache.org>.
Received on 2009-12-10 18:49:29 CET
