
Re: How to change paths on an external file without a full update --depth infinity?

From: <dlellis_at_rockwellcollins.com>
Date: Thu, 15 Aug 2013 10:39:05 -0700

> > The challenge I then see on this is one of finding all instances of foo.c.
> > If you have foo.c copied/forked fifty times to different projects, each of
> > which has branched a couple of times, how do you programmatically find all
> > different instances of foo.c (to let a developer choose which may be most
> > appropriate?) If you have good ideas, I'm very open to listening.
>
> There is no difference in that question than finding where the
> 'future' copies of a pegged external target went. You can only do
> either if you have a convention for a canonical path.

True (I believe).
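
For what it's worth, the closest we have to that today is searching the
verbose repository log for copyfrom records against the canonical path,
roughly like this (paths made up, and it's a slow crawl over the whole
history):

  svn log -v -q ^/ | grep "(from /datastore/drivers/foo.c"

Every copy made from the canonical file shows up as an added path with that
copyfrom source.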

>
> > Also if you have two projects that both want foo.c and both have valid
> > changes to make to the file, how does that get managed when they are copies?
> > It's a trivial implementation when it is implemented as a file external.
>
> How so? I assume you also have to handle cases either way: where both
> projects want the same change and where both projects need different
> changes - where typical svn users would have branches/tags to
> distinguish them. But regardless of how you identify the target
> file, there shouldn't be any effective difference between copying a
> version into your directory or using a file external as long as you
> don't modify it in place and commit it back - something your external
> tool could track.

We do want to modify in place. Copying back creates an additional step
that is already managed quite well by SVN with externals. I don't want to
duplicate something that already exists (I'll mess it up). There was some
discussion on another thread about advancing a peg revision of an external
when an external is committed. This would be a neat feature, though I
completely understand why it would not be incorporated. We do this behind
the scenes right now with very little work (post-commit script in TSVN
gives us knowledge of what a user committed).
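
To make that concrete, advancing the peg is just a property rewrite on the
directory that hosts the external, something like this (paths and revision
numbers invented for the example):

  # external as currently pegged on the project directory:
  svn propget svn:externals project/src
    ^/datastore/drivers/foo.c@1234 foo.c

  # after the commit to the datastore copy lands as r1250,
  # rewrite the peg and commit the property change:
  svn propset svn:externals "^/datastore/drivers/foo.c@1250 foo.c" project/src
  svn commit -m "Advance foo.c external to r1250" project/src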

>
> > We also have instances where we purposely want multiple copies of the same
> > exact file within the same project. We can effectively manage this through
> > file externals to a structured "datastore" (AKA a set of folders within a
> > repo). Regardless of where and how a team decides to structure their
> > project, all files are neatly organized in this one section of the repo
> > (that is considered taboo to directly interact with). The ability to have a
> > specific file having many "copies" of itself and not care about its position
> > within the repository is a powerful feature. I understand this may diverge
> > a bit from SVN's core thoughts on CM, but if SVN can support odd variations
> > to its use, it becomes an even more indispensable building block. Diversity
> > in approaches is good.
>
> Again, you get the history in a copy. You can tell if they are the
> same. Or, on unix-like systems you can use symlinks to a canonical
> copy within the project.

We're not on a unix-like system, but that is exactly what would work great
(with the exception that you can't revision-control symlinks, right?).
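
To make the datastore pattern concrete, the externals definitions end up
looking something like this on a project directory (names invented for the
example):

  svn propget svn:externals project/comm
    ^/datastore/comm/uart_driver.c@2001 uart_driver.c
    ^/datastore/comm/crc16.c@1870 crc16.c

The project tree can place (and duplicate) those files wherever it likes,
while every copy still points back at the single canonical file in the
datastore.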

>
> > From a feature perspective, externals are a very appropriate method to
> > accomplish this (really a CM implementation of symlinks). If we're saying
> > that externals from an implementation standpoint are not quite appropriate
> > at this time, I get that argument. What is the general consensus as to
> > where externals are on the roadmap?
>
> I agree that externals are very useful, but most projects would use
> them at subdirectory levels for component libraries where they work
> nicely, not for thousands of individual file targets. Is there
> really no natural grouping - perhaps even of sets of combinations that
> have been tested together that you could usefully group in
> release-tagged directories?

What we discovered is that most of our reuse occurred in unplanned ways.
(I'd imagine if you took two Linux distros and compared which files changed
and which didn't, you'd find a huge collection of random files that aren't
easily abstracted out.) You might be able to do it once, but as each new
distribution branches out, the commonality between them becomes impossible
to form groupings around.
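
If that grouping did exist, I agree a single directory external to a
release-tagged component would be the clean answer, something along the
lines of (made-up path):

  ^/components/libcomm/tags/1.2 libcomm

In practice, though, the files that stay common across our projects don't
fall out into a tidy directory like that.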

> ...'just work' without special considerations. I'm not against better
> performance, of course, but it makes sense to me to make pragmatic
> design decisions for the same reasons you might avoid throwing
> millions of files in one flat directory even in a non-versioned
> scenario. Theoretically, you should be able to do that, but in
> practice it isn't going to perform as well as something with better
> structure.

I'm not sure what a reasonable number of external files per folder is, but
I'd think it'd be similar to whatever a reasonable number of regular files
would be. Two million is nuts, but 50 seems reasonable. The issue is that
I'm currently forced to deal with not just the current directory, but the
recursion into all nested directories (--depth infinity). If, as the
subject of this thread requests, we could perform work on just the
directory at hand and not the full checkout, we're golden!
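
In other words, what we're hoping for is that something roughly like the
first command below could be enough to pick up a changed file external,
instead of the second (and I don't believe the first is sufficient today):

  # what we'd like to be enough - touch only the directory at hand:
  svn update --depth files project/src

  # what we actually have to run now:
  svn update --depth infinity project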

I do appreciate this discussion.

Thanks
Dan
Received on 2013-08-15 19:39:50 CEST
