
Re: 'stamping' a la RCS '$Header$'?

From: Karl Fogel <kfogel_at_newton.ch.collab.net>
Date: 2002-05-03 05:41:13 CEST

cmpilato@collab.net writes:
> Yeah, see the problem here is that you set your format string property
> on trunk. But what if I only check out some sub-tree under trunk?
> This scenario will inevitably lead to setting your format string
> property on every directory (since you don't know what portion of the
> tree folks will check out). But if you're going to be setting the
> same property so many times, why not instead set it only on the
> files that want to expand it?

But maybe the format strings could be configured in some text file in
the repository (I mean "in the repository" in the sense that the hook
scripts are in the repository), and the client would fetch that file.
That way, the custom keywords would be global to that repository,
which matches the administrative boundaries that exist in real life.
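
To make that concrete, such a file might look something like this
(purely a sketch -- the location, keyword names, and format-string
syntax here are invented, nothing we've settled on):

    # conf/keywords  (hypothetical file, living alongside the hook scripts)
    # <keyword>  =  <expansion template applied by the client>
    Id          = %r %d %a
    LastChanged = revision %r, %d, by %a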

The client and server would have to checksum and/or datestamp the
file, so the client re-fetches it only when it changes. Or just
refetch it every time.
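
The checksum route might look roughly like this on the client side (a
Python sketch just to show the shape of it; config_checksum() and
fetch_keyword_config() are made-up calls, not anything in the RA
layer today):

    import hashlib
    import os

    CACHE = os.path.expanduser('~/.subversion/keyword-config')

    def cached_checksum():
        # Checksum of the copy we already have, or None if we have none.
        try:
            with open(CACHE, 'rb') as f:
                return hashlib.md5(f.read()).hexdigest()
        except FileNotFoundError:
            return None

    def get_keyword_config(server):
        # server.config_checksum() and server.fetch_keyword_config()
        # are hypothetical calls: ask only for the checksum first, and
        # re-fetch the whole file only when it differs from our cache.
        if server.config_checksum() != cached_checksum():
            os.makedirs(os.path.dirname(CACHE), exist_ok=True)
            with open(CACHE, 'wb') as f:
                f.write(server.fetch_keyword_config())
        with open(CACHE, 'rb') as f:
            return f.read()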

This seems like a lot of work. On the other hand, a lot of people
were asking for support for customizable keyword expansion. On the
*other* tentacle, it's a slippery slope into becoming a generic text
transformation tool as well as a revision control system.

I think this would be our best course:

Implement "$Id$" for 1.0. It will be rev, author, and date.

If anyone submits a patch for customizable keywords, one that looks
like it'll work robustly, we should use it. If not, we can still do
it (perhaps using the scheme described above), but post-1.0.

-Karl

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Fri May 3 05:40:29 2002

This is an archived mail posted to the Subversion Dev mailing list.
