how to repeatedly checkout sparse directories in a cron job

From: Mojca Miklavec <mojca.miklavec.lists_at_gmail.com>
Date: Tue, 20 Dec 2011 13:08:50 +0100

Hello,

I'm writing a script (also run as a cron job on the server) that
should work whether or not a checkout has already been done. I would
like to do a "repeated sparse checkout" as explained below, but I'm
not sure how to do it properly. (I have a workaround for this
problem, but I would like to know whether there is a more elegant
solution.)

The repository has the following structure:
- project/trunk (I would like to have it checked out)
- project/branches/xxx (I don't want these)
- project/tags/xxx (I would only like the latest one, but it doesn't
bother me too much if older ones are not deleted)

The first time, I can do the following:
    # determine the newest beta tag
    latest=`svn list $URL/tags | grep beta | tail -1 | tr -d '/'`
    # check out only the top-level directory, without any contents
    svn co --depth=empty $URL
    svn up project/trunk
    # temporary workaround for a bug in svn
    svn up --set-depth empty project/tags
    svn up project/tags/$latest

From that point on I could simply use "svn up", but since the script
should also work when nothing has been checked out yet, I would like
to keep "svn co" in it.
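
For reference, a run where the working copy already exists would only
need to repeat the update steps, something like this (reusing $URL and
the layout from above, and re-resolving the newest tag each time):

    latest=`svn list $URL/tags | grep beta | tail -1 | tr -d '/'`
    svn up project/trunk
    svn up project/tags/$latest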

The problem is that "svn co --depth=empty $URL" will delete all the
existing contents the next time I call it. Is there any way, using
only command-line arguments, to prevent the existing contents from
being deleted?
A workaround is to use something like
    if [ ! -d "project" ]; then
        svn co --depth=empty $URL
    fi
but I would be really happy if there were some command like:

    "please check out $URL, but there is no need to fetch any files
yet; in particular, don't fetch 'branches'; on the other hand, please
don't delete 'trunk' and 'tags' if they are already present"
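
Just to make the intent concrete, here is roughly what the whole job
looks like with the directory-test workaround folded in (assuming
$URL is set and that its last path component is 'project', so the
checkout creates a directory of that name):

    #!/bin/sh
    # URL must point at the repository root, e.g. exported in the crontab
    : "${URL:?please set URL to the repository root}"

    # initial sparse checkout, only if the working copy is not there yet
    if [ ! -d "project" ]; then
        svn co --depth=empty $URL
    fi

    # fetch/update trunk
    svn up project/trunk

    # temporary workaround for a bug in svn
    svn up --set-depth empty project/tags

    # fetch/update only the newest beta tag
    latest=`svn list $URL/tags | grep beta | tail -1 | tr -d '/'`
    svn up project/tags/$latest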

Thank you very much,
    Mojca
Received on 2011-12-20 13:09:26 CET
