
Re: Why rewrite the Subversion working copy?

From: John Peacock <john.peacock_at_havurah-software.org>
Date: Wed, 23 Apr 2008 12:56:01 -0400

David Glasser wrote:
> For the lowest layer, I've started implementing a prototype in Python
> of a low-level Subversion metadata store, with full unit tests and all
> that jazz. Currently I've designed the API for a refcounted blobstore
> and am working on tree and treestore abstractions. I'll put it
> somewhere (http://svn.collab.net/repos/svn/experimental/svnws? gvn
> repository? whatever) sometime soon, once I've got a little more
> implemented. The key goal here is to create a *non-brittle* working
> copy, where we don't have to be scared that typing "svn switch" with
> the wrong URL will corrupt the working copy irretrievably. Efficiency
> is nice too. (Hopefully, while the code itself will probably need to
> be backported to C, the tests might end up being executable against
> the "real" code.)

I've actually been wondering for a while why Subversion couldn't just
use the existing server repo code for the client meta-data: a mini
local repo of configurable depth, up to and including a full mirror.
Updates would use svnsync-like operations to bring the local repo
current before updating the working copy itself. Revs outside the
configured window would get purged automatically (or a config option
could let them build up over time). A zero-depth local repo would
require server access for all operations and would keep just the
properties and checksums locally (equivalent to forcing all file
contents to zero size), so enormous checkouts would no longer take
twice their size on disk.
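
Concretely, I'm picturing something no fancier than driving the
existing tools, roughly like this (the paths and window size are
made up, and the purge step is hand-waved, since svnadmin has no
in-place purge today; you'd dump the tail and reload it):

import subprocess

LOCAL_REPO = "/home/john/.svn-cache/project"  # made-up location
REMOTE_URL = "http://svn.example.com/repos/project"  # made-up URL
WINDOW = 10  # keep only the last N revs locally

def run(*cmd):
    subprocess.check_call(cmd)

def init_local_mirror():
    """Create the mini local repo and point svnsync at the server.
    (svnsync also needs a pre-revprop-change hook on the local repo.)"""
    run("svnadmin", "create", LOCAL_REPO)
    run("svnsync", "init", "file://" + LOCAL_REPO, REMOTE_URL)

def refresh_local_mirror():
    """The svnsync-like step run before touching the working copy."""
    run("svnsync", "sync", "file://" + LOCAL_REPO)

def purge_old_revs(head):
    """Hand-waved: keep the last WINDOW revs by dumping the tail of
    the history, to be loaded into a replacement repo."""
    low = max(0, head - WINDOW)
    dump = subprocess.check_output(
        ["svnadmin", "dump", LOCAL_REPO,
         "-r", "%d:%d" % (low, head), "--incremental"])
    # ...then 'svnadmin load' the dump into a fresh repo and swap it in.
    return dump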

All operations would be performed against the local "repo" when
possible (e.g. if you are disconnected, commits go against the local
repo and are resolved when you reconnect) and against the server when
required (if you only have the last 10 revs locally, asking for the
full history requires a server roundtrip, just as it does now).
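
In pseudo-Python, the dispatch I have in mind is nothing smarter
than this (every helper name here is invented for illustration):

def log(path, start_rev, end_rev):
    """Answer 'svn log' from the local repo when the requested range
    is cached, otherwise fall back to a server round-trip."""
    if start_rev >= oldest_local_rev():          # invented helper
        return local_repo_log(path, start_rev, end_rev)
    return server_log(path, start_rev, end_rev)  # network round-trip

def commit(txn):
    """Commit to the server when connected; otherwise park the change
    as a local revision to be resolved when we reconnect."""
    if server_reachable():                       # invented helper
        return server_commit(txn)
    return local_repo_commit(txn)                # offline commit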

Each local metadata repo would effectively be a branch of the main
repo (so multiple remote repos would map to multiple local
branches/mirrors), and the existing merge-tracking code could be
reused to resolve the conflicts that offline operations create.
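
Again hand-waving, but the reconnect step could then look like a
perfectly ordinary merge (the helpers below are invented; the point
is only that the mergeinfo machinery already knows how to answer
"which of my local revs has the server not seen yet?"):

def reconcile_on_reconnect():
    """Replay offline commits as a merge from the local 'branch',
    letting merge tracking pick the eligible revs and surface any
    real conflicts through the usual conflict-resolution path."""
    for rev in eligible_revs(local_branch_url(), server_url()):
        result = merge_single_rev(server_url(), rev)   # invented
        if result.conflicted:
            resolve_interactively(result)  # same conflict UI as today
        else:
            record_merge(server_url(), rev)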

I've got some handwritten notes at home about some of the other
implications of this, but my home machine is between operating
systems at the moment, so I've been very much out of the loop for a
while. Is this something I should try to flesh out into a more robust
proposal?

John

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe_at_subversion.tigris.org
For additional commands, e-mail: dev-help_at_subversion.tigris.org
