
Re: Another "testing" vs "unstable" question



On 2004-06-23, John Summerfield penned:
>
> I have been to www.apt-get.org and I got Mozilla from here, pine from
> there, KDE from somewhere else, Xfree from another... Do you get the
> picture?

Well, just to be pedantic, you wouldn't find pine anywhere in Debian
because of its licensing terms.

> A coordinated, official system of official backports would be a fine
> thing, and the workforce to do it is already there - they're the
> people making these unofficial backports.

Yes, but there's no way to test those backports thoroughly enough to
match the amount of testing that went into stable in the first place.
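
Not that the lack of testing stops anyone.  The usual recipe for mixing
one of those unofficial archives into a stable box is a sources.list
entry plus a pin, along these lines (the backports host and suite names
below are invented, and the details vary from archive to archive):

  # /etc/apt/sources.list -- stable plus one unofficial backport source
  deb http://ftp.debian.org/debian woody main
  deb http://backports.example.org/debian woody-backports main

  # /etc/apt/preferences -- pin the backports low, so they are only
  # pulled in when asked for explicitly, e.g.
  #   apt-get -t woody-backports install mozilla
  Package: *
  Pin: release a=woody-backports
  Pin-Priority: 200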

> Until Red Hat Linux 8.0, Red Hat had two cycles of releases:
>
> Major numbers, 5.x, 6.x, 7.x maintained binary compatibility. Those
> came out with about the same frequencies as Debian releases.

And the dot-oh releases were well known to be buggy piles of crap.
There was always some nasty gotcha lurking in the system.  I don't know
why that was the case, but it definitely held true from at least 4 to 6,
maybe 7.  Somewhere in there I stopped having to care because I switched
to Debian.

> Then there were the minor releases, x.[0-3] coming out at about
> six-monthly intervals. One could take a package from x.2 and install
> it with minimal bother on x.0 or x.1, with every expectation of not
> breaking anything.
>
> It's a model Debian would do well to look at and see how it can adapt
> or adopt it. Using this model, Sarge would be 4.0, not 3.1, because it
> breaks binary compatibility (new gcc, new glibc).
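
(For what it's worth, the binary compatibility being broken there is
mostly shared-library sonames and the C++ ABI.  A rough way to see what
a library promises, ABI-wise, is its soname:

  $ objdump -p /lib/libc.so.6 | grep SONAME
    SONAME               libc.so.6

As long as that stays libc.so.6 and symbols are only added, old
binaries keep running.  A new gcc changes the C++ ABI, though, so C++
packages built with it won't necessarily run on the old release.)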

It sounds like a lot more work for the developers.  Red Hat had
commercial customers to support its developers.  How would you suggest
Debian manage this?

-- 
monique


