
Re: Overall bitrot, package reviews and fast(er) unmaintained package removals



On Wed, 06 Apr 2016 17:16:18 +0100, Neil Williams wrote:

> On Wed, 6 Apr 2016 15:27:48 +0000 (UTC)
> Felipe Sateler <fsateler@debian.org> wrote:
> 
>> On Wed, 06 Apr 2016 00:18:10 +0200, Ondřej Surý wrote:
>> 
>> >  - no upload in a long time
>> 
>> s/upload/maintainer upload/
> 
> One key part of the metric would be >2 NMUs without maintainer upload.

I think this is a good idea.

> 
> No maintainer upload alone is insufficient - uploading every package
> once a year "just because" does not help anyone. It's another reason why
> simply having an outdated Standards-Version is also insufficient.

Also agree.

>> >  - other indicators
>> 
>> - Is maintained by the QA group (for longer than X time?)
>> - Is orphaned (for longer than X time?)
>> - Is RFA (for longer than X time? Or maybe it should auto-move to
>>   orphaned)
>> 
>> Essentially, if nobody steps up to maintain the packages, then they
>> should go.
>> 
>> - Maintainer does not respond to bug reports in a timely manner (eg,
>> within 1.5 months, calculated per package).
>> 
>> I think that maintainer responsiveness should be the key metric, not
>> up-to-dateness (ie, the maintainer may be holding back for good
>> reasons, but those reasons should be explained).
> 
> That could lead to a lot of ping messages in bug reports which might not
> be that useful. It could also lead to maintainers closing bugs which may
> have previously been left open as wontfix or wishlist. The severity of
> the bug may need to be considered.

As always, the devil is in the details. I agree that severity should be 
considered. But I'm mostly thinking about new bugs. Bug reports without 
any maintainer response at all are way more common than they should be.

> How do we assess responsiveness on those packages which have 0 bugs?

Good question. But let's not make the perfect be the enemy of the good.

>
> This does need to be about the package quality, not the maintainer. If
> there is a stack of bugs with no response, it is very different to a
> package with a couple of wishlist issues.

As agreed above, severity should be part of the metric.

> So more than just
> responsiveness, it needs to take account of the number and severity of
> the bugs to which there has not been a response.

The number should be relative to the total number of bugs the package 
has. A package whose single bug report is unanswered should get a bad 
score; a package with a handful of unanswered reports out of hundreds, 
a better one.
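As a rough sketch of the ratio idea above (names, thresholds and the 
zero-bug handling are my own illustrative assumptions, not anything 
the QA tools actually implement):

```python
def responsiveness_score(unanswered: int, total: int) -> float:
    """Score from 0.0 (worst) to 1.0 (best) based on the share of
    bug reports that never got any maintainer response.

    Hypothetical metric sketch; real scoring would also need to
    weight by severity and age, as discussed in this thread.
    """
    if total == 0:
        # No bugs at all: nothing to respond to, so we cannot
        # penalise -- this is exactly the "0 bugs" caveat above.
        return 1.0
    return 1.0 - unanswered / total

# One bug, unanswered: score 0.0 (bad).
print(responsiveness_score(1, 1))
# Five unanswered out of two hundred: score 0.975 (much better).
print(responsiveness_score(5, 200))
```

The point of the ratio is only that absolute counts mislead; any real 
algorithm would need tuning, which is the hard part.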

> There may also need to
> be some protection from the implications of severity-ping-pong. Overall,
> I think this is an unreliable metric and should not be used.

The failure mode, as you describe it, is that the metric would be too 
lax and easy to game. I somewhat agree, but I don't find that an 
argument against it. After all, the idea is to help discover places 
that need attention, not to make Debian fit on a single CD again ;)

> 
>> This should also help detecting teams that have effectively become
>> empty.
> 
> That is not the same as low quality packages.

Unmaintained packages are likely to become low quality as time passes. No 
need to wait for that to happen.

> 
> Packages with NMUs not resolved by the maintainer is a much better
> metric. The bugs are closed, so responsiveness would not be counted, but
> the package is still low quality.

I'm not sure I get this. Do you mean maintainer uploads that discard the 
NMU part? That should be a red flag as well.

>  
>> > * Package marked as "outdated" would:
>> >  a) not be able to enter "stable"
>> >  b) not be able to enter "testing"
>> >  c) would be removed from "unstable"
>> 
>> Adding to the testing autoremoval queue would be a great start.
> 
> That also ensures that dependencies are considered.
> 
> The full list of identified packages will need some form of marker
> because then tracker could indicate this in the same way as it does for
> "your package depends on a package which needs a new maintainer" for
> orphaned packages. (Maybe the first step for this process *is* to
> forcibly orphan the package?)
> 
> The individual metrics need to be aggregated to a score but fine tuning
> that score algorithm is more work than most people want to do on
> packages which are already uninteresting.
> 
> What has happened in the past is that a BSP close to a release has had a
> reason to look at a particular set of packages and removed the whole lot
> in one operation.

This is a slightly different topic from the one I originally replied 
about: changes that affect a wide range of packages usually take 
forever and involve large numbers of NMUs.


> It's a scatter-gun approach but getting agreement on
> the algorithm could take forever.

Unfortunately, this is true.

> 
> There needs to be something which makes these uninteresting packages
> relevant to something important - beyond them simply being low quality.

I'm not sure what you mean by 'need'. In my ideal world, this process 
would be largely automatic, so that human effort could be spent in 
more productive areas (like fixing bugs). In other words, low-quality 
and unmaintained packages should cease to be a burden on others.

-- 
Saludos,
Felipe Sateler

