
My Backup Script

From: Andy Canfield <andy.canfield_at_pimco.mobi>
Date: Tue, 26 Jul 2011 13:33:09 +0700

For your information, this is my backup script. It produces a zip file
that can be transported to another computer. The zip file unpacks into
a repository collection containing, for each repository, a hotcopy and
a dump of that repository. The hotcopy can be reloaded on a computer
with the same characteristics as the original server; the dumps can be
loaded onto a different computer. Comments are welcome.
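
To illustrate the difference, loading one of the dumps onto another
computer would look roughly like this; the repository name "myrepo" and
the target path are examples only, not something the script knows about:

# create an empty repository, then load the dumped history into it
svnadmin create /data/svn/myrepo
svnadmin load /data/svn/myrepo < myrepo.dump

The hotcopy needs no load step; it is restored by copying the directory
back into the SVNParent directory of a matching system.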

#! /bin/bash

# requires root access
if [ "$(whoami)" != root ]
then
     sudo "$0" "$@"
     exit
fi

# controlling parameters
SRCE=/data/svn
ls -ld $SRCE
DEST=/data/svnbackup
APACHE_USER=www-data
APACHE_GROUP=www-data

# Construct a new empty SVNParent repository collection
rm -rf $DEST
mkdir $DEST
chown $APACHE_USER $DEST
chgrp $APACHE_GROUP $DEST
chmod 0700 $DEST
ls -ld $DEST

# The SVNParent directory contains one subdirectory per repository
# (it may also contain entries that are not repositories)
cd $SRCE

# Process each repository
for REPO in *
do
     # some things are not repositories; ignore them
     if [ -d "$SRCE/$REPO" ]
     then
         # back up this repository
         echo "Backing up $REPO"
         # use hotcopy to get an exact copy
         # that can be reloaded onto the same system
         svnadmin hotcopy "$SRCE/$REPO" "$DEST/$REPO"
         # use dump to get an inexact copy
         # that can be reloaded anywhere
         svnadmin dump "$SRCE/$REPO" >"$DEST/$REPO.dump"
     fi
done

# Show the contents
echo "Contents of the backup:"
ls -ld $DEST/*

# zip up the result
cd $DEST
zip -r -q -y $DEST.zip .

# Talk to the user
echo "Backup is in file $DEST.zip:"
ls -ld $DEST.zip

# The file $DEST.zip can now be transported to another computer.
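
As a rough check on the receiving computer, the archive can be unpacked
and one repository verified like this; the destination directory and
the repository name "myrepo" are again examples only:

# unpack the backup and read back every revision of one repository
unzip -q svnbackup.zip -d /data/svnrestore
svnadmin verify /data/svnrestore/myrepo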
