
Re: Architecture independent binaries and building from source

Shaun Jackman wrote:

The build system can function much like automake does. Makefile.in is
not usually regenerated from Makefile.am. If Makefile.in is removed it
will be regenerated. Likewise, the build system could typically
redistribute upstream's derivative form. If the security team finds it
necessary to patch the source, simply removing upstream's binary will
cause it to be rebuilt. This allows both redistribution of a pristine
upstream binary as well as potential modification by the security team.

With Java, upstream binaries often enough just plain suck. I've seen many jar files that happily violated the VM specification because they were created with broken compilers, and only got through on Sun's JVM due to bugs in Sun's implementation. We had to turn off some of the warnings about broken jar files in kaffe, since normal users found them too confusing.

Then there is the class file format versioning issue: the default class file format version of the generated bytecode changes depending on which version of which Java compiler you use. In the C world, it would be like having undocumented, random changes to the ELF format once a year, whenever Red Hat releases the next version of RHEL. Older Sun VMs will not run code compiled with newer Sun compilers, for example.
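The version stamp sits right at the front of every class file, so it is easy to inspect. A minimal sketch (class name mine) that prints the major version that the running VM's own core classes were compiled to:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersion {
    /** Reads the class file major version of java.lang.Object from the running VM. */
    static int readMajor() throws IOException {
        // .class resources stay readable even under JDK 9+ module encapsulation.
        try (InputStream raw = Object.class.getResourceAsStream("Object.class");
             DataInputStream in = new DataInputStream(raw)) {
            if (in.readInt() != 0xCAFEBABE)  // magic number of every valid class file
                throw new IOException("not a class file");
            in.readUnsignedShort();          // minor version
            return in.readUnsignedShort();   // major version: 45 = JDK 1.0/1.1, 49 = Java 5
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("java.lang.Object class file major version: " + readMajor());
    }
}
```

A VM that only understands major version N will refuse any class file stamped N+1, which is exactly the "newer compiler, older VM" failure described above.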

Then there is the tendency of some projects to encode class path dependencies in their jar files, which blows up as soon as you move to another platform and the directory layout assumptions no longer hold.
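Concretely, this is the Class-Path attribute in META-INF/MANIFEST.MF. A hedged sketch (the manifest text and file names are made up for illustration) of how such a baked-in dependency list can be extracted:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.jar.Attributes;
import java.util.jar.Manifest;

public class ClassPathAttr {
    // Hypothetical manifest, modelled on what some upstream jars ship.
    static final String SAMPLE =
        "Manifest-Version: 1.0\n"
      + "Class-Path: lib/xerces.jar ../shared/log4j.jar\n";

    /** Extracts the Class-Path attribute from a manifest's main section. */
    static String classPathOf(String manifestText) throws IOException {
        Manifest mf = new Manifest(
            new ByteArrayInputStream(manifestText.getBytes(StandardCharsets.UTF_8)));
        return mf.getMainAttributes().getValue(Attributes.Name.CLASS_PATH);
    }

    public static void main(String[] args) throws IOException {
        // The VM resolves these entries relative to the jar's own directory,
        // so moving the jar or its neighbours silently breaks the class path.
        System.out.println("Class-Path: " + classPathOf(SAMPLE));
    }
}
```

Since the entries are resolved relative to the jar's own location, repackaging the jar into a distribution's directory layout silently invalidates them.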

Then there is the tendency of some Java projects to ship with a bazillion third-party binary blobs, without any indication of which versions those blobs are supposed to represent. And don't get me started on the funny people who smack some random, unversioned, no-one-knows-how-compiled jar from some random CVS checkout into their set of third-party blobs.

Then there are those upstream Java packages that are impossible to build on anything but some specific release of Sun's VM, because some 'developers' decided to use sun.* classes instead of a documented API.
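A classic offender in this category is sun.misc.BASE64Encoder, which was never a supported API and is gone from modern JDKs, so code using it no longer even compiles there. As a present-day illustration (java.util.Base64 only exists since Java 8), the documented route keeps the code runnable on any conforming VM:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PortableBase64 {
    /** Base64-encodes a string using the documented java.util.Base64 API. */
    static String encode(String s) {
        return Base64.getEncoder().encodeToString(s.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // Works identically on every spec-conforming runtime, unlike sun.* classes.
        System.out.println(encode("kaffe"));  // prints "a2FmZmU="
    }
}
```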

And so on. See http://java.debian.net/index.php/What%20is%20required%20from%20upstream for a lot of nice remarks on what constitutes a good Java upstream release, from a packager's point of view.

From my experience getting lots of broken Java code to run on free runtimes, I can only recommend building from source on free runtimes, to weed out the unmaintainable[1] stuff before it pollutes debian-main. It can't go into main without being buildable with DFSG-free tools anyway, so there is little point in not ensuring that it really builds with DFSG-free tools, and that we will not be stuck fixing gcj, kaffe or SableVM first when we need a security fix quickly.

On the discussion of configure.in vs. configure: I don't think the 'we don't do it there either' argument has a lot of merit. It would be cool if we did, judging by README.debian in autotools-dev:

"We do not want whatever possibly broken crap upstream has for gettext, autoconf and friends screwing up Debian automated builds. The gettext, libtool and autoconf Debian maintainers take great pain to make sure their packages are up-to-date, bug-free and work sanely on all Debian-supported architectures. All that work goes down the drain if you leave some old config.sub from an unknown variant of some weird distribution of 3 years ago just because your upstream happens to like it."

Equally well, we should not want 'whatever possibly broken crap upstream has' for java, javac, ant, xerces and friends screwing up the jars. If jars are getting screwed, they should get screwed in a predictable, repeatable fashion on the buildds so that we can have a chance to fix them.

dalibor topic

[1] For example when you need to fix security bugs
