jrepenning@collab.net:
> If variance adjusting really turns an ugly delta into an
> unsignaled lie, that would be a major offense; this is, I
> think, a restatement of Tom's concern.
Heh. We speak similar languages. In a very general sense, a common
design failure of programs is to try to hide what's going on at the
low level, and make it look like something subtly (but deeply)
different is going on at the user interface level. That's a design
that "lies" and it's always a bad idea (but sometimes sells for a
while). Users eventually bump into the edge cases of the lie, and
then they get into trouble.
The alternative to lying is to design user interfaces that _explain_
and help manage the implications of the low level.
> My question would be, "how often do these conflicting fixes
> in the same area arise?" If often, we should worry; if
> rarely, perhaps not.
The in-range rules are particularly problematic and I would be
prepared to bet my last $110 that they will cause problems often.
The function call examples in my last mail illustrate. They
effectively say "Discard or ignore your changes in this code -- where
the same lines haven't changed in the changeset, ignore your changes;
where the same lines have changed, discard your changes." That's
sometimes the right thing, but often it isn't.
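To make that concrete, here's a minimal Python sketch of the behavior the in-range rules seem to imply within one affected region. The function name, the line-list representation, and the equal-length assumption are all mine for illustration, not from any actual implementation; the point is just that both branches of the rule lose your local edits:

```python
def in_range_merge(base, yours, changeset):
    """Apply the described in-range rule to one affected region,
    given as equal-length lists of lines (an illustrative simplification)."""
    merged = []
    for b, y, c in zip(base, yours, changeset):
        if c != b:
            # The changeset touched this line: your change is *discarded*.
            merged.append(c)
        else:
            # The changeset left this line alone: your change is *ignored*
            # and the base version wins.
            merged.append(b)
    return merged

base      = ["f(x)", "g(x)", "h(x)"]
yours     = ["f(x2)", "g(x)", "h(x2)"]   # you edited lines 1 and 3
changeset = ["f(x)", "g(y)", "h(y)"]     # changeset edited lines 2 and 3

print(in_range_merge(base, yours, changeset))
```

Note that your edits to lines 1 and 3 vanish either way, which is exactly the complaint: nothing in the result signals that local changes were dropped.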
> But this is exactly and solely the question that has divided
> "change-based" advocates from "version-based" advocates for
> fifty years;
?!? I don't think it's "exactly and solely" any such thing ... or
even that that question really dominates much literature ... but
you've made me curious what you're looking at from 1953....
-t
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Thu Apr 10 03:07:10 2003