Jim Blandy <email@example.com> writes:
> It used to be that compilers would spit out amazing amounts of
> information as they worked --- their versions, target architectures,
> the files being compiled, the sizes and addresses of the functions and
> link modules generated, and so on.
> The Unix folks did a *wonderful* thing when they decided that `cc'
> should produce no output when the compilation went smoothly. We have
> failed to extend this principle to our build processes today because
> 1) we can't tell what directory things are happening in without make
> jibbering constantly at us while it hops from one directory to the
> next, filling our compilation logs with information which is 90%
> useless, and 2) we're worried our compilations will hang. These are
> wimpy reasons, and we all suffer daily because of them.
I don't think anyone is failing to adhere to this "build principle",
actually. It's just a matter of *scale*: what are you trying to
debug, and at what level?
If I want to compile a simple C program, you're right: I certainly
don't want to know what complex things are happening deep inside the
compiler. But if I'm *writing* the compiler, then I definitely do!
Similarly, a big project like Subversion has a big build system that
needs debugging in and of itself. So we want to see all the gory
details of what make/cc/ld are doing at all times. This is not noise;
it's important information.
However, I was once in charge of building the same working copy
on 7 different flavors of Unix -- simultaneously. I wrote a big
build system to coordinate them all, and in that case I most
certainly *didn't* want to see the details. Every machine piped its
build output into a log, and only notified me of failures.
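The idea can be sketched as a tiny shell wrapper -- this is a
hypothetical reconstruction, not the actual build system described
above: run each build step silently, capture everything in a log,
and speak up only when something fails.

```shell
#!/bin/sh
# Minimal sketch: run a command, send all its output to a log file,
# and print a message only on failure. The function name run_quiet
# and the log-file naming are illustrative assumptions.
run_quiet() {
    log="$1"; shift
    if "$@" > "$log" 2>&1; then
        :   # success: stay silent, per the Unix convention
    else
        echo "BUILD FAILED -- see $log" >&2
        return 1
    fi
}

# Usage: only the failing step reaches the terminal.
run_quiet build-host1.log true
run_quiet build-host2.log sh -c 'echo "cc: error" ; exit 1'
```

In a multi-machine setup, each host would run something like this
locally and report only its failures back to the coordinator.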
So again, it all depends on what level you're working at.
Received on Sat Oct 21 14:36:12 2006