Are the stable/testing/unstable branches not a solution?
I have to administer a machine on which several (~10) users run a graphical environment over remote X logins. Since one of the goals of this machine is to promote GNU/Linux in my engineering school, I need top-quality and recent software. Of course, I'm running Debian.
I had to choose between three solutions:
- Running Woody with some backports. However, Woody is getting quite old, and only major applications get backported. Furthermore, some backports don't get the QA work they would need.
- Running Testing. I think very few people run testing, because bugs take a long time to get fixed once detected, since the fix has to go through unstable first. Of course, I could backport packages from unstable when needed, but only after discovering the bug.
- Running Unstable. Of course, I'll run into problems from time to time, but they can be fixed most of the time. However, I still fear I'll run into That Big Problem I can't afford on a production machine.
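For what it's worth, the second and third options can also be combined with APT pinning: track testing by default while keeping unstable available for individual packages. A minimal sketch, assuming both testing and unstable appear in sources.list (the priority values are illustrative, not recommendations):

    # /etc/apt/preferences -- prefer testing, keep unstable installable
    Package: *
    Pin: release a=testing
    Pin-Priority: 900

    Package: *
    Pin: release a=unstable
    Pin-Priority: 300

With this in place, "apt-get -t unstable install foo" pulls a single package from unstable while routine upgrades keep following testing.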
Currently, I'm running Unstable.
However, I remember that Raphaël Hertzog said something very interesting in his platform for the 2002 DPL election. He introduced the concept of a "working" branch, into which packages would move at the maintainer's request, once they had been tested enough.
I personally don't think Debian should ever make real new releases. Snapshots of a working branch containing reasonably up-to-date software would be a much better solution. Most potential Debian users have broadband internet access now, and don't care about a 7-CD set. Of course, my opinion doesn't carry much weight, since I'm not even a DD, but I would be very interested in your opinions.