Q to all candidates: what is the long-term role of traditional Linux distributions?
Debian prides itself on shipping large quantities of free software with
a strong level of stability within a release. A huge number of users
around the world rely on Debian as a solid base for their infrastructure
and derivative works, and our packaging policy makes it easier for us to
ensure that security updates reach all users rather than just a subset.
But upstream development is increasingly diverging from our approach.
Many new software ecosystems are based on external code repositories
rather than relying on the distribution, and in several languages it's
expected that a project directly include its dependencies rather than
relying on external availability. A world in which users care more
about immediate functionality than about long-term interface stability
means there's an increasing amount of free software that's
somewhere between difficult and impossible to ship in Debian. Efforts
like Snap and Flatpak are even making this the case for desktop
applications, providing an alternative approach for users to obtain
auto-updated software without relying on Debian.
Given these upstream shifts, is attempting to package as much software
as possible something that actually benefits Debian and our users, or is
it something that mostly duplicates effort? If we spent time building
tooling to automatically identify (say) installed Go applications that
contain dependencies with known security vulnerabilities and alert
users, would that be time better spent than independently packaging and
maintaining those dependencies ourselves? Are our current
priorities the best way to serve the free software community over the
next 10 years? Would we be better off focusing Debian as a high-quality
base for users who then largely consume software from other sources?
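
For concreteness, here is a minimal sketch of what the discovery half of
such tooling might look like, assuming we only want to enumerate the
module dependencies embedded in Go binaries on a system. The hard-coded
search paths and the omitted step of matching results against a
vulnerability feed are placeholders for illustration, not a proposal for
a specific design:

// embedded-deps.go: enumerate the module dependencies baked into Go
// binaries found on the system. Real tooling would also match these
// against a vulnerability feed; that step is omitted here.
package main

import (
    "debug/buildinfo"
    "fmt"
    "os"
    "path/filepath"
)

func main() {
    // Hard-coded search paths purely for illustration; a real tool
    // would more likely walk the dpkg database or the filesystem.
    for _, dir := range []string{"/usr/bin", "/usr/local/bin"} {
        entries, err := os.ReadDir(dir)
        if err != nil {
            continue
        }
        for _, entry := range entries {
            if entry.IsDir() {
                continue
            }
            path := filepath.Join(dir, entry.Name())
            // ReadFile only succeeds for Go binaries built with
            // module support; everything else is skipped.
            info, err := buildinfo.ReadFile(path)
            if err != nil {
                continue
            }
            fmt.Printf("%s (built with %s)\n", path, info.GoVersion)
            for _, dep := range info.Deps {
                mod := dep
                if dep.Replace != nil {
                    mod = dep.Replace
                }
                fmt.Printf("  %s %s\n", mod.Path, mod.Version)
            }
        }
    }
}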
--
Matthew Garrett | mjg59@srcf.ucam.org