On Tue, Dec 2, 2008 at 13:14, Les Mikesell <lesmikesell@gmail.com> wrote:
> Andreas.Otto@versit.de wrote:
>>
>>
>> * I have a 7G repository, and creating ~5 tags a day (each including
>> the whole repository tree) produces ~10G of *NEW* dump data per day
>> (after optimization)
>> * dumping the whole repository stalls after about a day and eats all
>> the free disk space
>> * my workaround was to delete all tags, then dump only the latest
>> releases and import that dump again
>> -> but this breaks every working copy already checked out
>>
>> -> for me, the effective unavailability of tags is definitely a
>> blocker for using Subversion
>>
>>
>> How I work:
>>
>> I support life-insurance software with a development / test /
>> production lifecycle.
>> Every time I create a software image, I create a tag first.
>>
>
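(To make the setup concrete: a tag in Subversion is just a server-side
copy, so the workflow above boils down to something like the following
sketch. The repository location and tag naming are hypothetical.)

    # Hypothetical repository; the daily-tag step described above.
    REPOS=file:///var/svn/insurance
    svn copy "$REPOS/trunk" "$REPOS/tags/image-$(date +%Y%m%d)" \
        -m "tag before building today's software image"

    # Full dump, then compare its size against the repository on disk.
    svnadmin dump /var/svn/insurance > /backup/insurance.dump
    du -sh /var/svn/insurance /backup/insurance.dump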
> I don't understand how this makes the dump a lot bigger than the repository.
> In some trivial testing, it looks like a copy to a tag from
> already-committed revisions just adds a new node entry with not much more
> than a Node-copyfrom-path: noting where the original is. And if you are
> copying the tag from your working copy with a lot of new material
> included, I'd expect that to add the same amount to the repository as
> to the dump.
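A sketch of the distinction Les is drawing (repository URL hypothetical):

    REPOS=file:///var/svn/insurance
    # Cheap: a URL-to-URL copy of an already-committed tree. The dump
    # records little more than a Node-copyfrom-path/-rev pointer.
    svn copy "$REPOS/trunk" "$REPOS/tags/rel-1.0" -m "server-side tag"

    # Not cheap: copying from a working copy commits the working copy's
    # state, including any uncommitted changes, as part of the tag.
    svn copy ./trunk-wc "$REPOS/tags/rel-1.1" -m "tag taken from a working copy"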
Andreas hasn't mentioned whether he's in the scenario that Erik
Hemdal outlined earlier today [1]. It sounds like cheap copies can
become expensive in the dumpfile.
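One concrete way that can happen, if I'm reading that thread right: a
ranged dump without --incremental writes its first revision as a
complete tree, so every tag existing at that revision is expanded to
full text instead of staying a copyfrom pointer. A sketch (revision
numbers and paths hypothetical):

    # Full dump: tags remain cheap Node-copyfrom entries.
    svnadmin dump /var/svn/repo > full.dump

    # Ranged dump: r1000 is emitted as a self-contained snapshot, so
    # every tag present at r1000 is written out in full.
    svnadmin dump -r 1000:2000 /var/svn/repo > range.dump

    # --incremental emits r1000 as a diff against r999, and --deltas
    # stores file contents as deltas; both shrink the stream.
    svnadmin dump -r 1000:2000 --incremental --deltas /var/svn/repo > small.dump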
[1]: http://www.nabble.com/RE%3A-How-big-can-a-repository-get--p20792982.html