
RE: Mirroring a Subversion Server

From: Gary Thomas <gary_at_mlbassoc.com>
Date: 2005-09-09 15:34:32 CEST

On Fri, 2005-09-09 at 09:16 -0400, Rowell, Geoff wrote:
> Igor Zevaka [mailto:igor.zevaka@cognethos.com] wrote:
> > Our company is changing from SourceSafe to SubVersion. The problem is
> > the subversion server is located somewhere else and we need to access
> > it using a slow Internet link. Is it possible to set up a caching
> > SubVersion server that we can work with locally and get it to update
> > the slow server only if a change has been made. Alternatively we need
> > a mechanism that will write out the new files with history
> > periodically.
>
> I do something similar by maintaining a "warm" backup of my company's
> code repository. I use two scheduled jobs (that run every five minutes)
> to create, and consume, revision dumps. The job scripts are built around
> the "svnlook", "svnadmin dump" and "svnadmin load" commands.
>
> The create job runs on the local repository server and produces one dump
> at a time - when the youngest revision is greater than the last dumped
> revision. I record the last dumped revision number in a dump directory
> file.
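
For illustration, a create job along those lines might look roughly like
the sketch below; the repository path, dump directory, and "last_dumped"
tracking file are assumed names, not taken from Geoff's actual scripts.

    #!/bin/sh
    # Create job: dump every revision newer than the last one dumped.
    # REPOS, DUMPDIR and the last_dumped tracking file are illustrative.
    REPOS=/var/svn/repos
    DUMPDIR=/var/svn/dumps

    YOUNGEST=`svnlook youngest "$REPOS"`
    LAST=`cat "$DUMPDIR/last_dumped" 2>/dev/null || echo 0`

    REV=`expr $LAST + 1`
    while [ "$REV" -le "$YOUNGEST" ]; do
        # --incremental writes only the changes made in this revision;
        # --deltas keeps the dump files small for the slow link.
        svnadmin dump "$REPOS" --incremental --deltas -r "$REV" \
            > "$DUMPDIR/$REV.dump"
        echo "$REV" > "$DUMPDIR/last_dumped"
        REV=`expr $REV + 1`
    done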

Why not just do this in the post-commit hook? That's how I've solved
this problem.
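
A minimal sketch of that approach, assuming the standard post-commit
calling convention (repository path and revision number as arguments)
and an illustrative dump directory:

    #!/bin/sh
    # post-commit hook sketch: Subversion invokes this with the
    # repository path and the committed revision as its two arguments.
    # DUMPDIR is an illustrative path.
    REPOS="$1"
    REV="$2"
    DUMPDIR=/var/svn/dumps

    # Run the dump in the background so the committing client does not
    # have to wait for it to finish.
    svnadmin dump "$REPOS" --incremental --deltas -r "$REV" \
        > "$DUMPDIR/$REV.dump" &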

>
> The consume job runs on the remote repository server and loads all
> dump files beyond its youngest revision.
>
> I use a network share to transfer the dump files, but I see no reason
> why some other transfer method couldn't be added to the create script.
>
> Just to be really anal, I archive all the dump files.
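
On the consume side, assuming one dump file per revision named by its
revision number on the share, the load loop could be roughly:

    #!/bin/sh
    # Consume job: load every dump file beyond the mirror's youngest
    # revision. MIRROR and DUMPDIR (the transfer share) are illustrative,
    # as is the one-file-per-revision naming scheme.
    MIRROR=/var/svn/mirror
    DUMPDIR=/mnt/dumpshare

    YOUNGEST=`svnlook youngest "$MIRROR"`
    REV=`expr $YOUNGEST + 1`

    while [ -f "$DUMPDIR/$REV.dump" ]; do
        svnadmin load "$MIRROR" < "$DUMPDIR/$REV.dump"
        REV=`expr $REV + 1`
    done

Swapping the network share for scp or rsync would only change where
DUMPDIR points or how the files arrive there.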

A very sensible strategy, indeed.

