
Re: Sparse checkouts suggestion

From: Johan Corveleyn <jcorvel_at_gmail.com>
Date: Fri, 6 Oct 2017 00:24:30 +0200

On Thu, Sep 21, 2017 at 12:16 PM, Paul Hammant <paul_at_hammant.org> wrote:
>>
>>
>> 2. Previously I forked a medium-sized Monorepo on Github, and did the
>> complete expand/contract work for it -
>> https://github.com/paul-hammant-fork/jooby-monorepo-experiment - in Python.
>>
>
> LOGIBALL's Tim Krüger has just written about a deployment of an
> expanding/contracting monorepo for Git and Maven based on the above proof of
> concept: https://timkrueger.me/a-maven-git-monorepo/ (I'd helped him with
> the article).

FWIW, at my company we're actually working on something similar for
SVN + Ivy (we have a monorepo too, in svn, with all of our projects
together under /trunk, and sparse working copies for developers and on
our build servers).

We want to have automatically expanding/contracting sparse working
copies, based on the ivy dependencies between modules (starting from a
given "main module"). So far I've written a written a hackish python
script that finds all "ivy.xml" files in our tree as quickly as
possible (up to a certain directory level), 'svn cat's them, and
invokes an ant-ivy task to determine a list of the needed modules for
the given main module (in the form of relative paths in svn where
those ivy.xml's reside).
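
To give an idea of the shape of that script, here is a minimal sketch
of just the discovery / 'svn cat' part (not our actual code; the repo
URL, the depth limit and the output handling are placeholders, and the
ant-ivy resolve step that computes the real dependency closure is left
out):

import subprocess

REPO_URL = "https://svn.example.com/repo/trunk"   # placeholder URL
MAX_LEVEL = 3                                     # only look this many dirs deep

def find_ivy_files():
    # 'svn list -R' prints paths relative to REPO_URL, one per line
    out = subprocess.run(["svn", "list", "-R", REPO_URL],
                         capture_output=True, text=True, check=True).stdout
    return [p for p in out.splitlines()
            if p.endswith("ivy.xml") and p.count("/") <= MAX_LEVEL]

def cat_ivy_file(rel_path):
    # fetch the file straight from the repository, no working copy needed
    return subprocess.run(["svn", "cat", REPO_URL + "/" + rel_path],
                          capture_output=True, text=True, check=True).stdout

if __name__ == "__main__":
    for path in find_ivy_files():
        ivy_xml = cat_ivy_file(path)
        # ...hand these off to the ant-ivy resolve step to get the module list...
        print(path, len(ivy_xml), "bytes")

Using 'svn cat' against the repository URL means the script doesn't
need a (full) working copy at all to compute the module list.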

What's still left is "applying" that list of subdirs to a given
working copy (or checking out a new one with these subdirs). So here
too, I'd like to have a view(spec) feature that can handle the output
from our "ivy dependency tree" script as input, and adapt the wc
sparsity to it (or check out a new one with the given sparsity).
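
Roughly, the "applying" step I have in mind would look like this
(again just a sketch, with made-up URL, paths and module names): check
out /trunk once at depth 'empty', then for each module path coming out
of the ivy script pull in its parent directories at depth 'empty' and
the module itself at depth 'infinity':

import os
import subprocess

REPO_URL = "https://svn.example.com/repo/trunk"   # placeholder URL
WC_ROOT = "trunk-sparse"                          # placeholder working copy dir

def run(*cmd):
    subprocess.run(cmd, check=True)

def ensure_sparse_wc():
    # one-time: an empty checkout of the monorepo root
    if not os.path.isdir(os.path.join(WC_ROOT, ".svn")):
        run("svn", "checkout", "--depth", "empty", REPO_URL, WC_ROOT)

def pull_in(module_rel_path):
    # bring in intermediate dirs at depth 'empty', then the module fully
    parts = module_rel_path.strip("/").split("/")
    cur = WC_ROOT
    for part in parts[:-1]:
        cur = os.path.join(cur, part)
        run("svn", "update", "--set-depth", "empty", cur)
    run("svn", "update", "--set-depth", "infinity",
        os.path.join(WC_ROOT, *parts))

if __name__ == "__main__":
    ensure_sparse_wc()
    # module paths as produced by the ivy dependency script (made-up examples)
    for module in ["libs/core", "libs/util", "apps/mainmodule"]:
        pull_in(module)

The "contracting" side would be the mirror image: modules that drop
out of the computed list get excluded again with 'svn update
--set-depth exclude'. A proper viewspec feature in svn itself could
replace most of this plumbing.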

For now I've had to put my work on that auto-scoping functionality on
hold (other priorities), but I hope to get back to it someday.

-- 
Johan
Received on 2017-10-06 00:24:55 CEST
