Quoting Gard Spreemann (2020-07-14 18:04:13)
> Hi!
>
> Barak A. Pearlmutter <barak@pearlmutter.net> writes:
>
> > Sometimes I wonder if Debian needs some serious process analysis and
> > restructuring. Should a new library version that happens to cross a
> > major version boundary really go through the same extra vetting queue
> > that a new browser goes through?
> >
> > tldr: What have we wrought???
>
> I'm sorry for picking up on the discussion on this list, but I do not
> feel that I've been a DD for long enough to bring a naive question like
> this to d-devel on my own – and since you do bring it up I'm tempted to
> ask here.
>
> I've long wondered about the apparent discrepancy in how the NEW queue
> works. On one hand, a new source package can spend months in the NEW
> queue for very good reasons, such as checking the legality of
> redistributing the package and keeping the namespace sane. Yet once that
> hurdle is cleared, the same source package can change radically in its
> next upload, which essentially renders the legality checking
> pointless. But at the same time, a package that merely bumps a soname or
> splits a binary package into two (without even changing the contents)
> can spend months in NEW all over again. This seems completely arbitrary
> to me, and sounds like a way to overwork the ftpmasters for little real
> (legal) gain.
>
> I have a hard time understanding this discrepancy, but maybe someone can
> shed some light on it? Almost every time I've found a process in Debian
> tedious or slow, I've learned to appreciate the gains that process
> brings that I had previously overlooked. With the NEW queue, I am yet to
> understand the rationale. Am I missing something?

NEW processing is a volunteer task, which a) requires special knowledge
and therefore involves a training process, b) is not as visibly credited
as some other contributions, and c) carries a real risk of grumpy
reactions when rejections are needed.
Those features probably discourage quite a few people from volunteering
to get involved in that particular task in Debian.

Occasionally, threads in debian-devel discuss ideas to "fix" what is
perceived from the outside as technical flaws in the NEW processing, but
most of us (me included) can only _guess_ whether they really are flaws,
because we haven't dived in and learned the details of what NEW
processing really involves.

Hope that helps (not to speed up NEW processing, but to understand why
it is a hard thing to improve on). I suggest locating and reading some
of the previous discussions in debian-devel, and thinking hard (as it
sounds like you already do - thanks!) before starting/rehashing yet
again.

 - Jonas

-- 
 * Jonas Smedegaard - idealist & Internet-arkitekt
 * Tlf.: +45 40843136  Website: http://dr.jones.dk/

 [x] quote me freely  [ ] ask before reusing  [ ] keep private