
Re: Overall bitrot, package reviews and fast(er) unmaintained package removals



2016-04-06 20:19 Wookey:
+++ Ondřej Surý [2016-04-06 00:18 +0200]:
Hey,

while doing some work on PHP transitions, saving courier-imap, and
finally packaging seafile since they finally stopped violating the GPL,
I found quite a lot of bitrot in some (mostly leaf) packages: packages
untouched for years after the initial upload, packages with unreachable
maintainers, etc.[1]

As a porter I've seen a lot of this too.

Ditto.


What I don't know is whether anyone in the world actually uses this
software or cares about it, and the relative benefits of updating it or
removing it. Am I completely wasting my time fixing up some old package
that doesn't build on arm64?

<diverting a bit from the original discussion>

I think that it would be useful to lower the threshold of packages that
should be available in ports, to relieve pressure on porters who have
to chase packages that are not in the critical path for anything else
and not useful in their own right.

As a package maintainer, I maintain packages that are not used by anyone
on some of the more fringe architectures (e.g. 3D OpenGL rendering
libraries), so neither maintainers nor porters should spend time on
those.  (I don't want to artificially restrict the available arches,
though; I don't know exactly what's useful on every arch.)

But reaching 95~98% of coverage for a port is hard unless many
not-very-useful packages are made to work on all arches, and that
increases the pressure on porters to work on obsolete things when they
don't even know whether those things are useful, and the maintainers are
MIA or unresponsive.

</>

* Some automated check that would mark the package as outdated.

I would certainly find this useful as some kind of metric.  I'm not
sure I agree with all your scoring items in detail, but they are clearly
indicative.

I think that a useful metric would be "there has to be an upload by the
maintainers within this release cycle", so at least once in about 2
years.

Even if the software from the package didn't need to change or be
recompiled, there are other things in the ever-changing Debian processes
(policy standards versions, URLs of Vcs, dh compat levels, default
buildflags, hardening, reproducible builds, distro-wide changes like
/usr/share/doc) that need changing from one stable to the next.  Even if
not, an upload every 2 years is not that much overhead.

(Some of this is metadata that could be elsewhere, but at the moment
almost everything goes in the source package).
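As a hypothetical sketch of how the "upload within this release cycle"
metric could be checked automatically (the script, the two-year
threshold, and the helper names are all illustrative assumptions, not
any existing Debian QA tool), one could parse the date of the newest
debian/changelog entry and flag packages whose last upload is older:

```python
#!/usr/bin/env python3
# Illustrative staleness check: flag a package whose newest
# debian/changelog entry is older than one release cycle (~2 years).
import re
import sys
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

MAX_AGE = timedelta(days=2 * 365)  # assumed threshold, not policy

def last_upload_date(changelog_text):
    """Return the date of the newest changelog entry, taken from its
    trailer line, e.g. ' -- Name <mail>  Mon, 22 Mar 2010 00:37:31 +0100'."""
    for line in changelog_text.splitlines():
        m = re.match(r" -- .*>  (.+)$", line)
        if m:
            # Changelog trailer dates use RFC 2822 format.
            return parsedate_to_datetime(m.group(1))
    raise ValueError("no changelog trailer line found")

def is_stale(changelog_text, now=None):
    now = now or datetime.now(timezone.utc)
    return now - last_upload_date(changelog_text) > MAX_AGE

if __name__ == "__main__":
    text = open(sys.argv[1]).read()
    print("stale" if is_stale(text) else "fresh")
```

A real implementation would more likely query UDD for upload history
than parse changelogs locally, but the idea of the metric is the same.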


Also, I would add the requirement that at least the packages built with
GCC have to be recompiled with the most recent toolchain present in the
next stable before they are considered for that release (this can be
done through binNMUs, if they just work fine).


.. perhaps be more aggressive in
removing software that's no longer useful and just lies in the archive
dormant.

The fact that Debian has a lot of software is a genuine benefit. Just
because stuff is old, does not mean it is no longer useful.

Yes, I am always careful with these issues to not ignore the "long tail
effect" [1], which for me is an important quality of Debian (not
necessarily in stable releases supported for 5 years, but e.g. at least
to have in unstable and ready to use/rescue with a low barrier of
effort).

[1] http://joeyh.name/blog/entry/the_popcon_problem/


The
problem is that we don't really know how to distinguish between
old-and-just-cruft and old-and-still-handy.

I do agree that we could remove more than we currently do, probably with very
little real fallout, and a corresponding increase in overall quality.

Agreed.  I think that at the moment we're leaning too much towards not
pruning the cruft for a long time, and some automatic ways to detect it
(like the many measures proposed in this thread) would be a nice thing
to have.


Cheers.
--
Manuel A. Fernandez Montecelo <manuel.montezelo@gmail.com>

