Re: Lots and lots of tiny node.js packages
Ian Jackson <email@example.com> writes:
> Our systems are not really set up for so many packages. They were
> designed with the assumption that a package would represent a
> substantial amount of upstream work, so that the Debian overhead is
> modest by comparison.
> Can you explain why you don't aggregate these into bigger packages,
> for use in Debian ?
There have been various efforts to aggregate tiny packages together into
larger packages in the past. I'm familiar with some of those efforts on
the Perl team. My impression is that, in every case where upstream was
not doing the same aggregation, the result has been somewhere between
uncomfortably awkward and a disaster.
While it's true that our infrastructure doesn't scale ideally with lots of
small packages, it's also true that it is HORRIBLE at handling packages
that correspond to multiple independent upstream releases. There is a bit
of multi-tarball support, but it's mostly useful for a single package whose
upstream, for some bizarre reason of its own, splits releases into separate
tarballs that still move mostly in lockstep. It doesn't deal at all well with lots
of entirely independent packages with their own version numbers and their
own release schedules.
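For concreteness, the multi-tarball support I mean is dpkg-source's
"3.0 (quilt)" component tarballs, where every extra tarball must share the
single upstream version of the main one -- which is exactly why independent
release schedules don't fit. (The package name and component below are
hypothetical, just to show the naming convention:)

```shell
pkg=node-foo   # hypothetical source package name
ver=1.2.3      # the ONE upstream version every tarball must share

# Main upstream tarball expected by dpkg-source "3.0 (quilt)":
echo "${pkg}_${ver}.orig.tar.gz"        # -> node-foo_1.2.3.orig.tar.gz

# A component tarball, unpacked into the docs/ subdirectory; note it
# cannot carry its own independent version number:
echo "${pkg}_${ver}.orig-docs.tar.gz"   # -> node-foo_1.2.3.orig-docs.tar.gz
```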
Furthermore, this is horribly confusing for our users, who can't easily find
the Debian package that corresponds to the thing they want, get confused by
the mapping, can't find the version number of the thing they care about, and
otherwise get lost in the aggregation. Understandably so, since it's weird
and unusual.
If upstream itself aggregates, then this works well. (See, for instance,
TeX Live, which is basically an upstream aggregation of
independently-released packages.) The aggregate gets its own version number
and its own unique existence, someone else does the integration and release
management upstream of us, and we can reuse some of their work.
But if we were going to do that, I think we would almost have to run a
separate TeX-Live-style project as an artificial upstream for Debian
packaging and do all the work that the TeX Live folks do to assemble that
distribution. And that's even *more* work than the Node packagers are
already putting in, and work of somewhat dubious benefit (since the only
gain would be a reduction in package metadata).
While it's true there are worries about scaling the archive to lots more
packages, I personally think the right path forward would be to fix whatever
breaks in the archive so that it can cope, rather than to attempt artificial
bundling. I think people seriously underestimate just how hard artificial
bundling is, both technically and in all of the user interface issues it
creates for our users.
Russ Allbery (firstname.lastname@example.org) <http://www.eyrie.org/~eagle/>