I have a system with ~30 users, each with an svn working copy of
roughly 8 GB. This is too much data to back up on a poor little LTO-2 drive
shared with many other systems. (The repository is of course backed up;
I'm talking about vast numbers of working copies.) If it were all
useful data, then we'd get bigger tape drives or something else. If it
were all just copies, we'd not back it up. But sometimes, somewhere in
that 8G is 50KB of real work that hasn't been committed yet.
Does anyone know of a way to integrate svn and backups (GNU tar because
this system is GNU/Linux, but many of my others are NetBSD with dump) so
that modified files are backed up, and unmodified files are not? I am
thinking of running a cron job that does 'svn status' and generates an
exclude list (or sets chflags nodump) based on what is modified, or
really, on what isn't known to match the repo and isn't set to ignored.
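A minimal sketch of that cron job, taking the inverse approach: since 'svn
status' prints exactly the items that differ from the repository, it is
simpler to feed tar an include list than to compute an exclude list. The
paths and file names below are illustrative, not a tested setup, and the
awk parsing assumes paths without embedded spaces.

```shell
#!/bin/sh
# filter_dirty: read 'svn status' output on stdin and print the paths
# of items that don't match the repository. Column 1 of each line is
# the item's state (M=modified, A=added, ?=unversioned, I=ignored, ...);
# we keep everything except explicitly ignored items.
# Caveat: $NF breaks on paths containing spaces.
filter_dirty() {
    awk 'NF >= 2 && $1 != "I" { print $NF }'
}

# Example cron usage (paths are hypothetical):
#   svn status /home/alice/wc | filter_dirty > /tmp/alice-dirty.txt
#   tar -czf /backup/alice-dirty.tar.gz -T /tmp/alice-dirty.txt
```

On the NetBSD/dump systems, the same filtered list could instead drive
'chflags nodump' on everything *not* printed, though that requires walking
the whole working copy.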
It would be cool to have svn modify the exclude list on the fly, perhaps
storing it in a database. This seems tricky because modifying a file
for the first time is not necessarily an svn operation, so I think the
cron job approach is probably best.
------------------------------------------------------
http://subversion.tigris.org/ds/viewMessage.do?dsForumId=1065&dsMessageId=2376637
Received on 2009-07-29 17:16:04 CEST