> > I'm having trouble seeing the "double the disk space" as a
> > significant problem.
> If what has been said before doesn't show you the problem, I am not
> sure there is anything I can say that will convince you there is
> one. I will just point out that YOUR usage is significantly
> different to MY usage. I don't think a tool should force your
> usage model on me, any more than it should force my usage model
> on you. SVN is trying to penetrate a market that is dominated
> by a lightweight tool. It's light on everything except time,
> as some operations take CVS a long time to perform. As things
> currently stand, it seems as if SVN is much "heavier" than
> CVS, INCLUDING on time, except in the case of cheap copies. I get
> the sense that because those are fast, people assume that the
> rest of it is. It's not.
> > Are you in an environment in which you develop over
> > the LAN but the extra disk space is a significant expense?
> I wouldn't say a "significant" expense, but it is certainly
> an avoidable one. In today's economic climate, where a lot of
> companies are surviving by the skin of their teeth, it is hard
> to justify buying $600 72G SCSI drives when we already have
> perfectly good workstations that can cope. Moving to a tool
> that would require us to upgrade every developer's machine,
> just because someone thought the ability to do local diffs
> was a justification for double disk usage, is really not on
> in the real world.
> > are? It sounds to me like this is a hypothetical situation, not one
> Not at all. Most of our current developers have 18G SCSI hard
> drives. With those, you have just enough room to do a full get,
> build and PI run. That's if we just build OSR5. If we include
> the Java, UnixWare and other open source builds, as SOME of us
> do, then a 36G drive just fits, with about 4G to spare.
> I must be honest, I find it quite hard to believe that people
> can defend a system that imposes a 100% increase in disk
> usage, one which in many cases will NEVER be used (refer back to
> my original post about build machines). That's what the current
> text-base penalty is ... it is a *100 percent* increase in
> resource load.
I question the 100% penalty, although I see some increase. How much
of these sizes is due to the sources (for which there is a 100%
penalty), and how much is due to binaries (which presumably
wouldn't be checked in)? Or are the binaries checked in for some reason?
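One way to answer that split empirically: measure the pristine copies directly. A rough sketch, assuming the classic Subversion working-copy layout where every versioned directory carries a private `.svn/text-base` holding a pristine copy of each file (`/path/to/wc` is a placeholder path, not from the thread):

```shell
#!/bin/sh
# Rough measurement of the text-base penalty in a Subversion
# working copy (assumes the per-directory .svn/text-base layout).
WC=${1:-/path/to/wc}

# Total size of the working copy, administrative areas included:
du -sk "$WC"

# Size of the pristine text-base copies alone. On a tree that is
# mostly checked-in sources this should come out near half the
# total, i.e. the ~100% penalty; unversioned build output
# (binaries, objects) has no pristine copy and dilutes the ratio.
find "$WC" -type d -name text-base -exec du -sk {} + |
    awk '{t += $1} END {print t " KB in text-base"}'
```

Comparing the two numbers shows how much of an 18G or 36G working area is the duplicated sources versus untracked build products.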
- Scott Lenser
Received on Tue Dec 17 07:20:23 2002