
Re: User-contrib, up-to-date stable



Quoting Manoj Srivastava (srivasta@datasync.com):
> 	The way it is set up, people may just download sources from
>  hamm, if they are sure the packages would live happily on bo
>  as well, and try ./debian/rules binary. If that works (producing a
>  deb file), and they just feel lucky ;-), they can upgrade the package
>  piecemeal.

What's the advantage of getting and compiling a hamm package over just
grabbing the original tarball and compiling? Most of the stuff I see
nowadays has a ./configure and a relatively simple make procedure...
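(For concreteness, the two routes being compared look roughly like this;
"foo" and the version numbers are placeholders, not any particular package
from this thread:

    # rebuilding the hamm source package on a bo system:
    dpkg-source -x foo_1.2-3.dsc
    cd foo-1.2
    ./debian/rules binary          # if it works, produces ../foo_1.2-3_i386.deb
    dpkg -i ../foo_1.2-3_i386.deb  # the "piecemeal" upgrade

    # versus building straight from the upstream tarball:
    tar xzf foo-1.2.tar.gz
    cd foo-1.2
    ./configure && make && make install

The first route at least gives you a .deb that dpkg knows about; the second
bypasses the packaging system entirely.)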

Quoting Hamish Moffatt (hmoffatt@mail.com):
> On Tue, Oct 21, 1997 at 08:00:46PM -0400, Dirk Eddelbuettel wrote:
> > That is nonsense, and not very polite to boot. For well over two years, I
> > 'risked', among other things, my PhD dissertation by using 'unstable' as my
> > sole computing platform --- which proved to be an extremely stable, reliable
> > and productive computing environment to get not only that dissertation
> > done. There is nothing wrong with using "unstable" if you want to be cutting
> > edge. You risk what you called 'hazzles' and 'annoyances' by using *cutting
> > edge upstream releases* anyway. 
>
> Indeed. Why is it so unacceptable to be running old upstream releases
> anyway? When packages go stable in debian, it means that the people
> who tested the release have verified, as best they can, that the software
> works correctly, by not filing bug reports. Is there really any
> functionality added in the latest upstream releases that is absolutely
> required on a production system, especially when (as Dirk points out)
> the upstream release might not be stable? Paul cites the example of

It's one thing to risk your dissertation. It's quite another to risk the
dissertations of everyone on a campus network (for example). The rules
for personal machines are different from the rules for multiuser
systems. And yes, there's some risk in using "cutting edge upstream
releases," but at some point you have to upgrade, either for features or
for security. Some examples: I stopped using the debian apache package for
our web server back around the release of 1.2. I needed some of the new
features in 1.2, and there were a couple of releases that fixed some
security problems. The debian packages just didn't come fast enough. A
similar situation exists for samba (which was mentioned earlier):
there's just too much of a time lag between the time the upstream
package is released and the time the debian package comes out. The
definition of "too much lag" depends on what you're using the system
for. I'm pretty sure, though, that there are more than a few sites using
samba that would like to get 17p4 without having to move to hamm. It's a
relatively minor patch that fixes a _very big_ problem, not a cosmetic
upgrade. 
 
> In short, can you really have a current system labelled "stable"? It may
> be stable, but it needs time to prove itself ...

How much testing decides whether a package is stable? If there's a large
amount of upstream testing, and it will run on the debian system, how
much of a delay is justified? In other words, where's the break-even
point between the benefit of fixing _known_ bugs and the risk of
introducing _possible_ bugs?

-- 
Michael Stone, Sysadmin, ITRI     PGP: key 1024/76556F95 from mit keyserver,
mstone@itri.loyola.edu            finger, or email with "Subject: get pgp key" 


--
TO UNSUBSCRIBE FROM THIS MAILING LIST: e-mail the word "unsubscribe" to
debian-devel-request@lists.debian.org . 
Trouble?  e-mail to templin@bucknell.edu .

