
The NEW queue process (Was: Re: Bug#964983: New Upstream Version)



Hi!

Barak A. Pearlmutter <barak@pearlmutter.net> writes:

> Sometimes I wonder if Debian needs some serious process analysis and
> restructuring. Should a new library version that happens to cross a major
> version boundary really go through the same extra vetting queue that a new
> browser goes through?
>
> tldr: What have we wrought???

I'm sorry for picking up on the discussion on this list, but I do not
feel that I've been a DD for long enough to bring a naive question like
this to d-devel on my own – and since you do bring it up I'm tempted to
ask here.

I've long wondered about the apparent discrepancy in how the NEW queue
works. On one hand, a new source package can spend months in the NEW
queue for very good reasons, such as checking the legality of
redistributing the package and keeping the namespace sane. Yet once that
hurdle is cleared, the same source package can change radically in its
next upload, which essentially renders the legality checking
pointless. But at the same time, a package that merely bumps a soname or
splits a binary package into two (without even changing the contents)
can spend months in NEW all over again. This seems completely arbitrary
to me, and sounds like a way to overwork the ftpmasters for little real
(legal) gain.

I have a hard time understanding this discrepancy, but maybe someone can
shed some light on it? Almost every time I've found a process in Debian
tedious or slow, I've learned to appreciate the gains that process
brings that I had previously overlooked. With the NEW queue, I have yet
to understand the rationale. Am I missing something?


 Best,
 Gard
 