Building against testing [was Re: "Recompile with libc6 from testing" <-- doesn't help i think!]
Erich Schubert wrote:
The privoxy changelog says:
* Recompile with libc6 from testing (instead of unstable).
I guess this was to get it into testing faster.
The buildds will not build for other archs with the old libc6, I think.
So this will not help.
Please don't do "Recompile with libc6 from testing"...
Testing isn't too useful while we are not heading for a freeze, so IMHO
it's "stay at stable or dare unstable" time...
Sorry for arriving at the party late, but I take extreme exception to
this. Please compile *all* packages against testing if possible. The
whole idea behind the three tier architecture - stable, testing,
unstable - is to make freezes go faster and let more adventurous users
get the latest software. If the latest software is stuck in unstable
because of a bug in a non-required dependency (i.e., a dependency is
updated to fix a bug that does not affect the package), nothing has
been gained.
Ideally, it seems to me, developers would be running unstable but
compiling against testing. Testing is, after all, where we want
development to be, because it's the next stable. We want unstable to
shield our users from the odd massive breakage.
We've already heard complaints that the 3-tier system is failing because
so many people are running testing instead of unstable, as in the
past. It seems to me that the reason behind that is that so many
packages in unstable get queued up on some silly dependency and then the
whole thing floods into testing en masse.
Testing is not stable by definition, and all good practice will tell
you that when you are testing, you change one thing at a time. Nobody
is trying to make testing stable (until a freeze, of course), and the more
up to date testing is, the less pressure there is when a freeze does come
around. The long freezes, and the releases with already old software,
can only be sped up if testing is kept as up to date as possible, and
compiling against testing is the means to that end.
Now, I'm not a dd and I'm sure I'll get a slew of responses saying that
I'm speaking out of my rear end because the problem was boot floppies or
security or something else -- all of which are true -- but the problems
of long stabilization freezes came from the huge rift between unstable
cum frozen and stable, and arbitrarily forcing the divide by not
compiling against what is in testing doesn't help anything.
Maybe the problem lies more in the buildd system (which I am also not so
familiar with). Debian uploads source packages, and these get rebuilt
by the buildds, correct? If so, perhaps it would be better to make them
more intelligent -- if they can build cleanly against testing, they
should do that first. If there are problems, they try unstable. Either
way, the package goes into unstable, but if it's compiled against
testing, there are obviously no dependencies needed from unstable so
there's no need to be held up by a dependency on a new library rev
(which may be incremented weekly, keeping the otherwise working package
in unstable in perpetuity).
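To make the proposal concrete, here is a minimal sketch of the fallback
heuristic described above, in Python. Everything here is hypothetical:
`build_package` stands in for an actual buildd invocation (e.g. sbuild run
against a chroot of the given suite), and `build_with_fallback` is just an
illustration of the "try testing first, fall back to unstable" ordering, not
real buildd code.

```python
def build_with_fallback(build_package, suites=("testing", "unstable")):
    """Return the first suite the package builds cleanly against.

    build_package(suite) -> bool, True on a clean build.
    Suites are tried in order, so a package that can be built against
    testing never picks up build-dependencies from unstable.
    """
    for suite in suites:
        if build_package(suite):
            return suite
    raise RuntimeError("package failed to build in every suite")


# Toy stand-in for a package whose build-dependencies are only
# satisfiable in unstable (e.g. it needs a brand-new library rev).
def needs_unstable(suite):
    return suite == "unstable"


# Toy stand-in for a package that builds fine against testing.
def builds_anywhere(suite):
    return True


print(build_with_fallback(needs_unstable))   # unstable
print(build_with_fallback(builds_anywhere))  # testing
```

Either way the result still gets uploaded to unstable; the point is only
that the recorded build environment is testing whenever testing suffices.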