On Mon, Jul 11, 2011 at 10:21 PM, Daniel Shahaf <d.s_at_daniel.shahaf.name> wrote:
> Geoff: you cannot point a single working copy item at more than one URL.
> (well, unless you create two externals with the same target file. Don't
> do that.)
>
> Nico: explain /exactly/ what you have been doing (best: a script(1)
> transcript). I don't know if you are complaining about nested working
> copies, or about running svn co $URL $dir where $dir is a subdir or root
> of a working copy, or something else altogether.
Oh, it's not *me* doing it. Someone, as part of their software build
environment, is doing the moral equivalent in their setup scripts of
this:
svn checkout $URL $targetdir
make -C $targetdir install
Then, for testing, they do this:
cd $targetdir
# Edit local files, not necessarily submitted to the upstream repository
Then, a day or a week later, they re-run the script:
svn checkout $URL $targetdir
make -C $targetdir
cd $targetdir
Notice that the edited files in $targetdir may not be committed, and
that other issues can occur if the $URL is altered or if generated
contents inside the old working copy conflict with the fresh
Subversion checkout. It's why I'm looking at it and thinking "do
this instead":
cd "$(dirname "$targetdir")" && \
rm -rf "$(basename "$targetdir")" && \
[ ! -e "$targetdir" ] && \
svn checkout "$URL" "$(basename "$targetdir")" && \
make -C "$targetdir"
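Incidentally, the rm-and-verify half of that chain can be exercised without
Subversion at all. A quick sketch, using a deliberately awkward (and entirely
hypothetical) directory name with spaces in it:

```shell
# Sketch: the removal-and-verify steps from the chain above, run against
# a hypothetical directory whose name contains spaces.
targetdir="/tmp/odd ball wc"
mkdir -p "$targetdir"
touch "$targetdir/stale edit"

cd "$(dirname "$targetdir")" && \
rm -rf "$(basename "$targetdir")" && \
[ ! -e "$targetdir" ] && \
echo "gone"
```

Note the "$(...)" form of command substitution nests quotes much more
predictably than backticks do, which matters once the paths carry spaces.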
You may also notice that I'm completely anal about checking that the
target is actually removed, and that directory names with oddball
characters such as spaces in them get handled correctly. It gets
trickier if you don't have write access to `dirname "$targetdir"`.
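For that trickier case, one workaround sketch (all paths hypothetical): if the
parent of $targetdir isn't writable, you can't rm -rf the directory itself, but
you can empty it in place, which still clears the stale .svn metadata and any
generated files before a fresh checkout into it:

```shell
# Sketch for the no-write-access-to-parent case: empty $targetdir in
# place rather than removing it. Paths here are hypothetical.
targetdir="/tmp/unwritable-parent/wc"
mkdir -p "$targetdir"
touch "$targetdir/generated.o" "$targetdir/.svn-placeholder"

# Remove everything inside $targetdir, dotfiles included, but keep the
# directory itself (so no write access to its parent is needed):
find "$targetdir" -mindepth 1 -maxdepth 1 -exec rm -rf {} +

# The directory survives, empty, ready for: svn checkout $URL "$targetdir"
[ -d "$targetdir" ] && [ -z "$(ls -A "$targetdir")" ] && echo "emptied"
```

(-mindepth/-maxdepth are GNU/BSD find extensions, not strict POSIX, but they
are available pretty much everywhere these days.)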
Received on 2011-07-12 05:43:06 CEST