
Re: RFC: Rules for distro-friendly packages

* Jesús M. Navarro <jesus.navarro@undominio.net> wrote:

> Think of the most probable environment where somebody goes through the hassle 
> of "compiling a new package onto old RHEL 2".  Do you think such a chore is 
> done for fun?  Or is it an environment where an overworked sysadmin in 
> charge of a lot of disparate machines is put into that need because of 
> constraints beyond his reach? 

Sysadmins should run their builds through a dedicated build system
which generates packages for their target(s). I'm currently working
on providing a generic platform for that, based on Briegel [1], but
that's another story.

BTW: if you want to build new software directly on old systems, you'll
sooner or later run into many other kinds of problems. For example,
think of bugs in older libcs or compilers which would then have to be
worked around in newer packages. That's nasty; nobody really likes that.
The answer is clear: use a dedicated build system (e.g. via a sysroot'ed
cross-compiler) which produces the right code/packages for those old
systems. That way you don't have to care about broken or missing
libc functions on the target - a properly adapted toolchain takes
care of this.
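A sysroot'ed cross build as described above could look roughly like this sketch. The target triple and the sysroot path are made-up examples, not real paths:

```shell
# Sketch of a sysroot'ed cross build for an old target system.
TARGET=i686-unknown-linux-gnu          # hypothetical cross-toolchain prefix
SYSROOT="$HOME/sysroots/rhel2"         # hypothetical copy of the old system's /lib, /usr

configure_for_target() {
    # The toolchain resolves headers and libraries inside $SYSROOT, so the
    # *build host's* libc never leaks into the produced binaries - bugs or
    # missing functions in the old target libc are handled by the toolchain.
    ./configure --host="$TARGET" \
        CC="$TARGET-gcc" \
        CFLAGS="--sysroot=$SYSROOT" \
        LDFLAGS="--sysroot=$SYSROOT"
}
# Usage (not run here): configure_for_target && make
```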

> > On the other hand, there are often cases where you *NEED* to rebuild
> > the whole configure stuff.
> Truly.  As there are cases when I have to go through the Makefile or
> even the source code to patch them so they run on my systems. But,
> please, do not make those cases more usual than *strictly* needed.

Over the last decade I've encountered so many such cases that I decided
to always regenerate these files and catch problems at the earliest
point, at the source. So, e.g., if regeneration fails, I first fix that.
Yes, this might take a bit more time to get some particular package
building and running, but in the long run it has always paid off.
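The "always regenerate" habit boils down to something like the following sketch (autoreconf is the standard autotools driver; everything else here is an assumption about your tree):

```shell
# Sketch: discard the shipped autotools output and rebuild it from the real
# sources (configure.ac / Makefile.am), failing early if regeneration breaks.
regen_and_build() {
    rm -f configure aclocal.m4             # possibly stale generated files
    autoreconf --force --install || {      # reruns aclocal/autoconf/automake
        echo "regeneration failed - fix configure.ac first" >&2
        return 1
    }
    ./configure "$@" && make
}
# Usage (not run here): regen_and_build --prefix=/usr/local
```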

> Because as much as any other software, autotools tend not to be forwards 
> compatible: as long as you use a feature from x+7.y.z it will probably fail 
> when running older x.y.z.

Gentoo has an interesting way of handling that: they support packages
requesting specific autoconf versions. I normally fix it at the source
instead, patching packages to work with recent autoconf.
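As far as I know, Gentoo does this by making the autoconf/automake commands wrapper scripts that select a slotted version via environment variables. The version numbers below are only examples:

```shell
# On Gentoo, the autotools wrappers honor WANT_AUTOCONF / WANT_AUTOMAKE.
WANT_AUTOCONF=2.1   # example: an old package that only builds with legacy autoconf
WANT_AUTOMAKE=1.4
export WANT_AUTOCONF WANT_AUTOMAKE
# autoreconf --force --install   # (not run here) would use the requested slots
echo "requested autoconf slot: $WANT_AUTOCONF"
```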

> You seem to forget that in the context of this discussion "arbitrary users" 
> are sysadmins on their duty.  They are perfectly expected to be recompiling 
> software on stable/production systems.  Heck, it's even there, on the FHS:

In this case, the admins are also put into the role of package
maintainer (of their own one-system distro) and QM engineer.
They should have the necessary skills to do that, or leave it alone.

> > As soon as you're attempting that, 
> > you're stepping into the package maintainer or developer role, and
> > then you should *know* what you're doing (or at least learn it).
> Ah... those youngsters... package maintainers are a very convenient and most 
> praised *proxies* for the work of the sysadmin.

They're much more than that: they're also QM engineers who make sure
individual packages play well in the ecosystem of their distro
(or at least they should).

> Of course "proper packages" are "better" but in many situations a make
> install onto /usr/local is good enough and "better" may change its meaning
> when dealing with half a dozen different unix-like systems.

It might even be dangerous: adding libraries to the standard search paths
can massively change the behaviour of already-installed packages.

> > That's one of the fundamental conceptional flaws.
> It might be the case, but then it's a "fundamental conceptional flaw" 
> from the very developers of the tool. 

Indeed, it is. This comes from autoconf's long history of hacked-up
macros, which has grown over many years. Essentially it's a
large collection of hacks.

A really clean concept would look very different: it would use declarative
approaches that allow central, target-specific configuration, would
be written in easily understandable code, etc.
If you'd like to work on such a system, be my guest - let's talk about that.

> > Users should use the finished packages provided by their distro.
> As long as possible.

Always. If some package is missing, add it to the distro (maybe in a
local repo), using its packaging toolchain. For Debian, e.g., that
means: create Debian packages, do the proper QM work, and then install
the resulting .deb.
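For Debian that flow could be sketched like this. dh_make and dpkg-buildpackage are the standard tools; the package name, version and the QM steps in the comments are assumptions:

```shell
# Sketch: turn an upstream tarball into a local .deb instead of "make install".
package_for_debian() {
    pkg=$1; ver=$2
    tar xf "${pkg}_${ver}.orig.tar.gz" && cd "${pkg}-${ver}" || return 1
    dh_make --single --yes      # generate an initial debian/ skeleton
    dpkg-buildpackage -us -uc   # build an unsigned binary package
    # QM step: run lintian, test-install in a clean chroot, then publish
    # the .deb into a local apt repo rather than installing straight into /.
}
# Usage (not run here): package_for_debian mytool 1.0
```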

> > > As upstream, I  care about Debian and cross-compiling  but I also should
> > > care about  people wanting to  compile my software  on old RHEL 2  or on
> > > Debian Etch (an  old enough platform to require some  runtime test in my
> > > case). None of those platforms have a recent enough autotools to rebuild
> > > configure, BTW.
> >
> > Then they should be updated.
> Yeah, well... that's not so constructive. 

No, it is very constructive, since it's the clean way. Just package recent
autotools for RHEL 2. That shouldn't be too hard, and it solves the problem
earlier in the chain (and more generically, for more than just one package).
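Packaging recent autotools for the old box could be as simple as building them into a private prefix. The versions and the prefix path below are examples, not a recommendation:

```shell
# Sketch: install modern autotools into a private prefix on an old system,
# so builds there can regenerate configure. m4 must come first; autoconf,
# automake and libtool build on top of it.
install_autotools() {
    prefix=/opt/autotools                  # hypothetical private prefix
    for tool in m4-1.4.17 autoconf-2.69 automake-1.15 libtool-2.4; do
        tar xf "$tool.tar.gz" && ( cd "$tool" &&
            ./configure --prefix="$prefix" && make && make install ) || return 1
    done
    echo "now put $prefix/bin in front of PATH for builds"
}
# Usage (not run here): install_autotools
```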

> You can bet no sysadmin maintains an ages-old system out of laziness.  That 
> those systems *should* be updated won't magically mean that they *can* be 
> updated.

I didn't say that the production system should be updated - I'm
talking about the build environment.

 Enrico Weigelt, metux IT service -- http://www.metux.de/

 phone:  +49 36207 519931  email: weigelt@metux.de
 mobile: +49 151 27565287  icq:   210169427         skype: nekrad666
 Embedded-Linux / Portierung / Opensource-QM / Verteilte Systeme
