
Re: FTP Team -- call for volunteers



On 2020-03-15 18:25, Theodore Y. Ts'o wrote:
> The bigger thing which we might be able to do is to require minimal
> review if the source package is already in the distribution, but the
> main reason why it is in the ftp-master tar pit is that upstream
> has bumped the major version number of a shared library, and so there
> is a new binary package triggering what appears to be de novo review
> by the ftp-master team.  I understand there is a super-abundance of
> caution which seems to drive all ftp-master team decisions, but
> perhaps this could be eased, in the interest of reducing a wait time
> that is, in some cases, greater than a year?

It also drives technical decisions. From a deployment perspective it would be much cleaner to version kernel packages (and, another pet peeve, nvidia packages) with every upload. That way updates and rollbacks can be managed more cleanly, e.g. the old kernel remaining in the boot menu, just as Ubuntu bumps the package name with every upload these days.
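To make concrete what I mean by versioning with every upload, here is a rough Python sketch of the naming idea; the function, fields and values are made up for illustration and are not how src:linux or Ubuntu's kernel packaging actually works:

    # Embedding an upload counter in the binary package name makes every upload
    # a distinct, co-installable package: the previous kernel stays installed
    # and in the boot menu, and a rollback is simply booting it again.
    def kernel_image_name(base_version: str, upload: int, flavour: str) -> str:
        # e.g. "linux-image-5.4.0-4-amd64"
        return f"linux-image-{base_version}-{upload}-{flavour}"

    previous = kernel_image_name("5.4.0", 4, "amd64")  # still installed
    current = kernel_image_name("5.4.0", 5, "amd64")   # new upload, new binary package

The flip side is that every such rename is a new binary package and hence another trip through NEW, which is exactly why the review latency ends up steering these decisions.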

Now we could also fix that using a whitelist approach; a rough sketch of what I mean is below. But I have not seen much openness to tackling this part of NEW review, and I am unsure why. Judging from the public NEW tooling (I don't know dak's side), it pretty clearly does not look like a de novo review, as the diff against the archive is highlighted. Another option would be to split the queue using a weighting function. But I am not aware of any public documentation on how the review process is currently organized. Is there any?
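To make the whitelist idea a bit more concrete, here is a rough Python sketch only; the helper names, the signature, and the regex are invented for illustration and are not dak code:

    import re

    # Heuristic: a new binary package that is just a SONAME bump of a library
    # the archive already ships, e.g. libfoo1 -> libfoo2 (or libfoo2-dev).
    SONAME_RE = re.compile(r"^(lib[a-z0-9.+-]+?)(\d+)(-dev)?$")

    def fast_track_candidate(source_in_archive, new_binaries, existing_binaries):
        """True if every new binary looks like a SONAME bump of a known library."""
        if not source_in_archive:
            return False
        known_stems = {m.group(1) for b in existing_binaries
                       if (m := SONAME_RE.match(b))}
        for pkg in new_binaries:
            m = SONAME_RE.match(pkg)
            if not m or m.group(1) not in known_stems:
                return False  # a genuinely new binary, keep it in the normal queue
        return True

    # fast_track_candidate(True, {"libfoo2"}, {"libfoo1", "libfoo-dev"}) -> True

A weighting function to split the queue could start from the same kind of signal: uploads where such a predicate holds go into a short queue, everything else stays with the full review.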

(I'm happy to look at potential whitelisting code, but I think the last time someone tried, a big refactoring and the introduction of tests were required of them before the contribution could go in - which is a high bar after first getting dak to run properly for development purposes.)

Kind regards
Philipp Kern

