> From: Steve Greenland [mailto:steveg@lsli.com]
>
> So you want to commit binary targets so that you don't have
> to rebuild them, but you want obliterate so that you can
> completely delete the binary targets from the SCM.
But only for the oldest revisions, as disk space is consumed by the
repository.
> You mentioned that you have a wrapper that filters out the
> targets for those who don't want them.
It filters out any directory structure that isn't wanted. For instance,
not everyone where I work is a programmer. Why should the non-programmers
need the source code? They can simply filter it out of their working copy.
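The wrapper itself is nothing exotic. Here's a minimal sketch of the idea
in Python, using the old non-recursive-checkout trick; the repository URL,
the directory names, and the checkout() helper are illustrative, not our
actual tool:

    # Hypothetical filtering-checkout wrapper (illustrative, not our tool).
    import subprocess

    REPO = "http://svn.example.com/repos/Project/trunk"  # made-up URL
    WC = "Project"

    def run(*args):
        subprocess.check_call(args)

    def checkout(wanted_dirs):
        # A non-recursive checkout gives just the top level...
        run("svn", "checkout", "-N", REPO, WC)
        # ...then each wanted subtree is pulled in explicitly.
        for d in wanted_dirs:
            run("svn", "update", "%s/%s" % (WC, d))

    # A non-programmer could skip the Source tree entirely:
    checkout(["Targets", "Docs"])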
> Instead of shoving all
> the targets into subversion, why not a wrapper that copies the
> binaries to another location, marked with the corresponding
> subversion revision number?
So, for fun, let's run some numbers. There are nearly 100 commits a day on
the Project tree we are talking about. The Final Target assets currently
consume 2.5 gigabytes (and will grow). That means I would need 100 revision
directories on a server at 2.5 gigabytes apiece: 250 gigabytes a day. The
thing is, not all 2.5 gigabytes of the target content changes in a given
day. Some revisions don't touch the target content at all; others change
only a few megabytes.
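To spell the arithmetic out (the average-change figure below is purely an
assumption for illustration):

    # Back-of-the-envelope: full snapshots vs. storing only the deltas.
    commits_per_day = 100
    snapshot_gb = 2.5           # full Final Target tree per revision
    changed_mb_per_commit = 50  # ASSUMED average actual change

    full_copies_gb = commits_per_day * snapshot_gb  # one directory per commit
    delta_only_gb = commits_per_day * changed_mb_per_commit / 1024.0

    print("Revision directories: %.0f GB/day" % full_copies_gb)  # 250
    print("Deltas only:          %.1f GB/day" % delta_only_gb)   # ~4.9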
Further, having this content in Subversion allows an 'svn update' to pull
down just the changes, quickly. Adds and deletes are handled automatically
during the run of the command. The working copy stays pristine, the targets
can be built in place, and everything is committed easily.
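So the daily cycle is just update, build, commit. A rough sketch of what
the wrapper drives (the paths, the scons invocation, and the commit message
are all illustrative):

    # Rough sketch of the daily update/build/commit cycle.
    import subprocess

    def run(*args):
        subprocess.check_call(args)

    run("svn", "update", "Project")   # pulls only the changed files
    run("scons", "-C", "Project")     # build the targets in place
    run("svn", "commit", "Project", "-m", "Rebuilt Final Target assets")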
> It sounds like what you really
> want is a cache of built targets, and it might be easier to
> implement it that way, rather than overloading the
> *source* control system. You can manage the cache by hand
Nah... I'm pretty sure I want exactly what I described... ;)
Subversion handles the gigabytes of content with ease. Other commercial
tools, such as Perforce and Alien Brain, also handle it with ease. In
fact, it's a very common practice to do what we're doing: letting the
underlying SCM snapshot each and every build. Managing an external cache
for something the SCM does naturally just creates unneeded headaches. The
issue here is not Subversion's ability to handle the content, but rather
Subversion's ability to REMOVE old, stale, unnecessary (insert term here)
content.
In any case, your idea does bring up a system we once tried to build with
SCons. SCons, as you might be aware, has a fantastic network cache system:
if a target already exists in the cache, it can be retrieved by signature
without even running the tools that build it. In practice it seemed to work
well, until we found that disk space was being consumed orders of magnitude
faster than the SCM consumed it. Why? Simple: for each change to a source
asset, a target was generated for testing in the application. That target
wasn't necessarily the one committed to the repository; it was an
"intermediate" build, if you will. All of those intermediate builds were
ending up in the cache, and there was no way to tell which were
intermediate and which were final. I still like SCons, but the cache has to
be treated as an extremely volatile entity, capable of being destroyed at
any point. And once the cache is gone, stepping back through past revision
history yields source assets without their accompanying target assets.
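For reference, enabling that cache is a one-liner in the SConstruct (the
share path below is illustrative). Every derived file, intermediate or
final, lands in the same cache, which is exactly why it outgrew the
repository:

    # Minimal SConstruct sketch of the SCons network cache.
    CacheDir('/net/buildshare/scons-cache')  # every derived file is copied here

    env = Environment()
    env.Program('app', ['main.c'])  # fetched from the cache when a match exists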
Thanks for your help.
Josh