
Re: What about a non-free compiler? (Re: new port: debian-win32. when ?)



Eray Ozkural <erayo@cs.bilkent.edu.tr> wrote:
>Colin Watson wrote:
>> We shouldn't be distributing package binaries with Debian that are
>> compiled with some commercial tool not available to others. If both I
>> and the package distributor have current versions of the relevant
>> development packages, I must be able to compile the sources myself and
>> arrive at exactly the same result. Anything else makes it difficult,
>> sometimes impossible, to adequately test fixes that I may want to
>> contribute back to the package, and I think this is unacceptable.
>
>I guess you're referring to subtleties caused by the compiler
>implementation, slightly different binary file format handling,
>and the mysterious troubles that might be caused.

Kind of, well, at least for the first bit. gcc and your hypothetical
compiler generate different code, by definition, and there's more than
enough room for differences in behaviour. "Subtleties" is certainly the
right word, though.

>For instance, gcc might output A for a, while the proprietary compiler
>outputs A', where A and A' must carry the same semantics. A doesn't
>compile due to an error in library linkage, but A' works perfectly. The
>gcc user has to figure out why it didn't work. But I was suggesting
>that we assume the proprietary compiler is a drop-in replacement for
>gcc to produce Linux ELF binaries, etc.

I think you're assuming an ideal world, but this is the nature of
hypothetical arguments, I guess. :)

I wasn't really thinking of library incompatibilities so much as of
code optimizations accidentally breaking the actual compilation. As an
example, how about the way in which the egcs branch of gcc can't be used
to compile 2.0 kernels, because gcc-2.7.2 accepted some slightly
non-standard-conformant code which egcs didn't? The problems caused
across completely different compilers are likely to be greater again
than across different versions of the same compiler, I think. Such
problems will be few, granted, but when they do occur they'll be
extremely frustrating, particularly for newbies.

I guess what I'm trying to say is that, in all but a minority of cases,
we should be optimizing for compatibility and reproducibility of
problems, rather than for speed. High-performance scientific computing
could perhaps be an exception to this, as you say below (I'm
inexperienced in this field, but in a Beowulf cluster, for example,
wouldn't a smallish constant-factor improvement in the compiled code
matter less than good parallelized algorithms and fast networking?).

If you're asserting that the proprietary compiler is exactly as correct
as gcc, then I start to run out of pragmatic arguments and have to fall
back on ideological ones, I suppose. :) I think Debian, which values so
highly its commitment to free software, would lose a certain amount of
its reputation in this regard if packages in main started to be built
with non-free compilers. We might look a bit silly, perhaps ...

>And compilation with gcc is also tested, and the new compiled binary is
>optional. That is, primary packages are built with gcc, but optimized
>binaries are created with the proprietary compiler. After all, I was
>only fantasizing; I'm trying to find out whether this makes sense.

Ah - are we talking about something like a separate 'i386-optimized'
architecture here? I'd find that a lot more acceptable, although it
would increase the load on maintainers a bit.

>The case where another compiler can build your package all right, but
>gcc cannot: I also think this would be unacceptable. But again, I'd
>like to remind you that many packages are portable themselves and are
>being developed on compilers other than gcc.

Yup, though this has been a maintenance headache in the past (and still
is, sometimes, though tools like autoconf alleviate much of the stress
involved).

>I would like to make my software into Debian packages, and I think the
>right way would be to offer the binaries with best performance. (Yes, I
>will put them under GPL)

I think that, if you can't build the binaries adequately with a free
compiler, then the package should go into contrib; then you can build it
with whatever compiler you like. I feel that compiling binary packages
with a non-free compiler, even if they can be compiled fine with gcc or
whatever, at least violates the requirements for main in spirit:

  In addition, the packages in "main" ... must not require a package
  outside of "main" for compilation or execution ...

    <http://www.debian.org/doc/debian-policy/ch2.html#s2.1.2>

contrib is defined, in part, as:

  Examples of packages which would be included in "contrib" are ... free
  packages which require "contrib", "non-free", or "non-US" packages or
  packages which are not in our archive at all for compilation or
  execution ...

    <http://www.debian.org/doc/debian-policy/ch2.html#s2.1.3>

I think this is really closer to what you want to do, and if you as the
package maintainer were convinced that the compiler you were using was
reliable then I'd have no objections to the package going into contrib.

-- 
Colin Watson                                           [cjw44@cam.ac.uk]

