Re: testing packages at build
On 09-Oct-03, 13:00 (CDT), Branden Robinson <firstname.lastname@example.org> wrote:
> On Thu, Oct 09, 2003 at 08:24:43AM -0500, Steve Greenland wrote:
> > No. While they certainly do exist, >99% of the time, if code works at
> > -O0 but not at -O2, then the code is broken.
> I find this difficult to swallow given my own experiences with XFree86
> when it first met GCC 3.3.
Well, that's a pretty extreme case - an ancient, giant codebase meeting
a new compiler release with new optimization features. And if XFree86
really still requires --traditional for the pre-processor, it's likely
to violate (or, perhaps more accurately, exceed) the standard in a
variety of ways. (But maybe not; I haven't really looked at it...)
And in any case, do you think it would be a good idea to just
automatically recompile XFree86 with '-O0' if the binaries failed tests?
(Rhetorical question, I presume.)
Over many years, with a large variety of languages, OSs, and compilers,
I found that most "optimization bugs" turned out to be coding errors.
GCC may, in fact, be more likely to have optimization bugs than, say,
the old DEC Fortran compiler. But when something breaks when the
optimizer is turned on, I look at the code first: it's certainly the
way to bet. (Well, there were certainly a few compilers whose optimizer
*was* broken, in which case we soon learned to not turn it on except
for code we could test carefully.)
-- 
The irony is that Bill Gates claims to be making a stable operating
system and Linus Torvalds claims to be trying to take over the
world. -- seen on the net