
Re: how can I use svn for backups of complete directory hierarchies?

From: Jack Repenning <jrepenning_at_collab.net>
Date: 2003-09-08 19:51:36 CEST

At 12:40 AM +0200 9/7/03, R. Welz wrote:

>Yesterday I experimented with svn because I am searching for a
>way to make versioned backups of my home directory (well,
>under Mac OS X). And here is the problem: svn add complains about
>umlauts, and decided to leave a .svn directory, well, just
>everywhere, even inside every application (inside the bundle, which
>technically is a directory).

Sadly, SVN does not deal with Mac OS idiosyncrasies like bundles
or the resource forks of an HFS+ file.

As you've noticed for the bundles, it spatters .svn/ directories
throughout them. This might actually be OK; sometimes the magic
works, sometimes it doesn't. In particular, if you are a developer
who often generates new bundles, you're gonna lose. Exactly how you
lose depends on the tools you use; some choke on that .svn directory,
while others wipe the whole bundle (including the .svn directories)
and create a new bundle (without the .svn directories).

For the resource forks ("file/rsrc"), it's even worse: SVN simply
will not find or version these. In this, it's like most other
Unix tools in the OS X world -- tar or cp will also miss them.

>To make it short, I am just searching for a way to delete all
>these (masses) :)

That's easy: open a Terminal window and type this, or some variation
you prefer (if the command is unfamiliar to you, read "man find",
"man xargs", and "man rm" to understand and approve what it does
before running it):
        find . -type d -name .svn -print0 | xargs -0t rm -rf
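If you want to be cautious, a two-step variant of the same command
lets you inspect the hit list before anything is deleted (these are
standard find/xargs flags, nothing Subversion-specific):

```shell
# First, preview which .svn directories would be removed (dry run):
find . -type d -name .svn -print

# When the list looks right, delete them.  -print0 and -0 keep
# filenames containing spaces (common on Mac OS) from being split:
find . -type d -name .svn -print0 | xargs -0 rm -rf
```

The null-delimited -print0/-0 pairing matters on a Mac, where paths
like "My App.app" are everywhere.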

>So what can I do to achieve my aim? I want to commit each file
>type, files and dirs with a prefixed dot (and would be glad to even
>archive resource forks).

Well, if you have resource forks in the data, you darned well *must*
include them in the archive!

>I am experienced in Perl and I could write a script for a cron job
>committing every changed document (I am quite sure that svn does
>this part for me) every 15 minutes.

In order to get the Macintosh-specific bits right (bundles, resource
forks, desktop database flags), you'll need to use something
MacOS-aware, and SVN just plain isn't. Nor is Perl, actually.

At the same level as Perl scripting, there's AppleScript: you could
make a "Login Item" AppleScript that copies the whole tree
periodically. Or you could use a dedicated backup program; I don't
know of any freeware ones, but there are several good commercial ones.
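If you do go the periodic-copy route, here is one sketch (untested,
and check "man ditto" on your OS release, since its fork handling
has varied): a crontab entry driving Apple's ditto, which, unlike cp
or tar, can carry resource forks along. The schedule and destination
path below are illustrative, not prescriptive.

```shell
# Hypothetical crontab entry: every 15 minutes, mirror the home
# directory to a backup volume.  "ditto -rsrc" asks ditto to
# preserve resource forks; /Volumes/Backup is a placeholder path.
*/15 * * * * /usr/bin/ditto -rsrc "$HOME" /Volumes/Backup/home-mirror
```

Note this is a plain mirror, not the versioned history you asked
for, so it only covers the "frequent copies" half of your plan.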

Or, you could use one of those "undelete" utilities that let you
to find files after they've been deleted, such as Norton Utilities
(or Norton SystemWorks, which includes Norton Utilities and
Retrospect Express Backup, as well as AntiVirus and Spring Cleaning).
Undelete would be lower overhead for you, which would be important
since you want quite frequent copies made. The flip side is that
undelete is quite invasive; if you upgrade the OS on the computer,
you need to check whether an upgrade is also needed for the
undeleter. Since Macs now come with automatic Software Update for
the OS, this could result in surprises.

Disclaimer: I do not presently use any of these tools or techniques.
I have in the past used Norton Utilities extensively; the only
problems I've had were compatibility problems between a given OS and
a given NU. But it has often been the case that I was one of the
many who discovered the incompatibility by upgrading the OS,
encountering problems, and waiting a long time until Norton/Symantec
made their matching update available.

Jack Repenning
CollabNet, Inc.
8000 Marina Boulevard, Suite 600
Brisbane, California 94005
o: 650.228.2562
c: 408.835.8090
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Mon Sep 8 19:52:41 2003

This is an archived mail posted to the Subversion Users mailing list.
