
Re: Bits from the FTPMaster meeting



Steve Langasek <vorlon@debian.org> writes:

> On Mon, Nov 16, 2009 at 08:38:15AM +0100, Goswin von Brederlow wrote:
>> > I'm not asserting that this problem is *not* significant, I simply don't
>> > know - and am interested in knowing if anyone has more data on this beyond
>> > some four-year-old anecdotes.  Certainly, Debian with its wider range of
>> > ports is more likely to run into problems because of this than Ubuntu, and
>> > so will need to be fairly cautious.
>
>> I don't think the number of ports will have any meaning here. If the
>> package is too broken to build/work on the maintainers architecture it
>> will most likely be broken on all archs. On the other hand if it works
>> on the maintainers architecture then testing or no testing makes no
>> difference to the other ports.
>
>> It seems to me the only port that MIGHT suffer quality issues is the
>> one the maintainer uses. Meaning i386 or amd64 usually, and Ubuntu
>> already has experience there.
>
> On Mon, Nov 16, 2009 at 06:24:42PM +1100, Robert Collins wrote:
>> On Sun, 2009-11-15 at 19:29 -0600, Steve Langasek wrote:
>
>> > I'm not asserting that this problem is *not* significant, I simply don't
>> > know - and am interested in knowing if anyone has more data on this beyond
>> > some four-year-old anecdotes.  Certainly, Debian with its wider range of
>> > ports is more likely to run into problems because of this than Ubuntu, and
>> > so will need to be fairly cautious.
>
>> I'd have assumed that ports will have no effect on this: Debian only
>> uploads one binary arch (from the maintainer) anyway :- only builds on
>> that arch will be directly affected except in the case of a build
>> failure that the maintainer could have caught locally.
>
> I thought the nature of the problem was clear, but to be explicit:
> requiring binary uploads ensures that the package has been build-tested
> *somewhere* prior to upload, and avoids clogging up the buildds with
> preventable failures (some of which will happen only at the end of the
> build, which may tie up the buildd for quite a long time).  The larger
> number of ports compared to Ubuntu has the effect that the ports with the
> lowest capacity are /more likely/ to run into problems as a result of such
> waste, and as Debian only advances as fast as the slowest supported port,
> this holds up the entire distribution.

Which assumes the slower ports are neither idle nor backlogged but
have just the right amount of load that they will actually build the
buggy source before the maintainer uploads the next version.

All that shows is that the current static build order of packages (A
is always built before B, no matter how new A and how old B are) is
the real problem here. Factor in the time a source was uploaded and
buggy sources won't starve other sources.
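To make the idea concrete, here is a rough sketch (hypothetical names,
not the actual wanna-build interface) of ordering the queue by upload
time instead of a static package order, so an old buggy source cannot
starve newer uploads:

```python
from datetime import datetime

def build_order(queue):
    """queue: list of (source_name, upload_time) tuples.

    Returns source names oldest-upload-first, instead of a fixed
    alphabetical or list order that ignores how long a source waited.
    """
    return [name for name, uploaded in sorted(queue, key=lambda e: e[1])]

# Illustrative queue: the buggy package waited longest, so it builds
# first once, but a re-upload lands at the back of the queue again.
queue = [
    ("buggy-old", datetime(2009, 11, 1)),
    ("fresh-fix", datetime(2009, 11, 16)),
    ("mid-age", datetime(2009, 11, 10)),
]
print(build_order(queue))  # oldest upload first
```

The point being that each failing upload only costs the buildd one
attempt before everything newer gets its turn.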

And if buggy sources are still a problem after that, implement a karma
system. Every time a source fails to build the package gets a malus,
every time it succeeds it gets a bonus, and you factor that into the
priority. That way careless maintainers will see their packages built
less readily and will have to wait for idle times.
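A minimal sketch of that karma scheme (weights and names are
illustrative, not anything dak or the buildds actually implement):

```python
class Karma:
    """Track per-source build karma: bonus on success, malus on failure."""

    def __init__(self, bonus=1, malus=3):
        self.scores = {}  # source name -> accumulated karma
        self.bonus = bonus
        self.malus = malus

    def record(self, source, success):
        delta = self.bonus if success else -self.malus
        self.scores[source] = self.scores.get(source, 0) + delta

    def priority(self, source, base=0):
        # Lower value = built earlier; bad karma pushes a source back
        # in the queue so it only gets attempted in idle time.
        return base - self.scores.get(source, 0)

k = Karma()
k.record("flaky", False)   # failed build: malus
k.record("solid", True)    # successful build: bonus
print(k.priority("flaky") > k.priority("solid"))  # flaky waits longer
```

An asymmetric malus (here 3:1) means one failure costs more than one
success earns, so a package has to build cleanly a few times to work
its way back to normal priority.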

If ports still have problems with buggy sources after that, make
sources wait until some fast architecture has built them successfully
before trying them. Block sources that already failed on i386/amd64
completely. Or are you telling me that amd64 is too slow to finish
building before a backlogged arm buildd even tries?
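The gating rule above could look roughly like this (again a
hypothetical sketch, not real buildd code):

```python
# Architectures fast enough to act as a smoke test for the slow ports.
FAST_ARCHES = {"i386", "amd64"}

def should_build_on_slow_arch(results):
    """results: dict mapping arch -> 'ok' | 'failed' for finished builds.

    A slow buildd attempts the source only after a fast arch succeeded,
    and never if a fast arch already failed.
    """
    if any(results.get(a) == "failed" for a in FAST_ARCHES):
        return False  # blocked: already known broken on a fast arch
    return any(results.get(a) == "ok" for a in FAST_ARCHES)

print(should_build_on_slow_arch({"amd64": "ok"}))     # go ahead
print(should_build_on_slow_arch({"i386": "failed"}))  # blocked
print(should_build_on_slow_arch({}))                  # wait for a fast arch
```

That way a source which would obviously fail everywhere wastes one
fast buildd's time, not a week of a slow port's.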

All of that assumes this even is a problem in the first place, but
let's say it is. I don't think that requiring binary uploads reliably
ensures that sources will build. Experience has shown some really
broken uploads and tons of fluke breakages anyway. Those maintainers
that do care will still test their packages dutifully. Those that
already don't will keep uploading packages built against e.g. stable
or experimental even more. After all, the debs are thrown away. Why
bother rebuilding a source in a clean chroot if it did build on the
normal system? Or build debs with -nc during development and build the
source once for the release and merge the changes files. Or put some
dummy deb into the changes file to trick dak. Or, or, or. There are so
many ways a lazy maintainer can get around the check that it might
just end up only hurting the good guys.


Well, what I'm saying is that I'm not convinced this measure will have
any big effect either way. The good guys will still do good uploads,
the bad guys will still manage to do bad ones, and unavoidable
screwups will still happen. Let's concentrate on the important point:
i386/amd64 will get clean binary packages built on a clean buildd. I
think that will improve quality much more than anything else.

MfG
        Goswin
