
Re: Hashsum mismatch prevention strategies



David Kalnischkies <kalnischkies+debian@gmail.com> writes:

> On Sun, May 20, 2012 at 9:17 PM, Raphael Geissert <geissert@debian.org> wrote:
>> Raphael Geissert wrote:
>>> Goswin von Brederlow wrote:
>>>> But I'm not convinced the number of files to download is actually the
>>>> limiting factor:
>>>
>>> It isn't, but it adds overhead.
>>
>> And there wouldn't be as much benefit if http pipelining is really going to
>> be disabled.
>>
>> (why doesn't the ubuntu image include an additional .conf file to disable it
>> for the aws mirrors? oh well...)
>...
> Also, currently pdiffs aren't downloaded in a pipelineable fashion, so this
> isn't even a regression in this regard, but would be an added improvement
> in case we come to a point in which pipeline is enabled by default again.

The current apt behaviour with pdiffs is basically completely unusable,
so simply forget about that. Talking about not regressing from an
already broken method is pointless. We were talking about a hypothetical
rewrite that would pipeline all needed pdiffs instead of the current
fetch, apply, check, repeat approach.
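To make the distinction concrete, here is a toy sketch (not apt's actual code; fetch() and apply_patch() are stand-ins) of the current loop versus a rewrite that issues all requests up front:

```python
def fetch(name):
    """Stand-in for one HTTP request; returns fake patch content."""
    return f"patch-data({name})"

def apply_patch(index, patch):
    """Stand-in for rred applying one ed-style diff to the index."""
    return index + [patch]

def current_loop(index, pdiff_names):
    # fetch, apply, check, repeat: each request waits for the previous
    # apply to finish, so every pdiff pays a full round trip
    for name in pdiff_names:
        patch = fetch(name)
        index = apply_patch(index, patch)
    return index

def pipelined(index, pdiff_names):
    # all requests issued up front (pipelineable); apply as they arrive
    patches = [fetch(name) for name in pdiff_names]
    for patch in patches:
        index = apply_patch(index, patch)
    return index
```

Both paths produce the same patched index; only the request pattern differs.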

That said, even without pipelining, downloading a bunch of files is not
that slow. True, there is some added delay for sending each request
header and waiting for the reply, but that is usually in the range of
0.1-1s per file. Apt can also already uncompress and checksum the first
pdiff, and the rred method can already apply the first pdiff while the
second downloads. So the loss from not pipelining might not be that
noticeable.
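A sketch of that overlap (again illustrative, not apt internals): one thread downloads pdiffs into a queue while the main thread uncompresses, checksums, and applies them, so patch N is being processed while patch N+1 is in flight:

```python
import queue
import threading

def downloader(names, q):
    for name in names:
        q.put(f"data({name})")   # stand-in for one HTTP fetch
    q.put(None)                  # sentinel: no more pdiffs

def applier(q, applied):
    while (patch := q.get()) is not None:
        applied.append(patch)    # stand-in for uncompress + checksum + rred

def overlapped(names):
    q = queue.Queue()
    applied = []
    t = threading.Thread(target=downloader, args=(names, q))
    t.start()
    applier(q, applied)          # runs concurrently with the downloader
    t.join()
    return applied
```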

And if it actually becomes a noticeable factor, then apt could always do
as they say: open multiple connections if pipelining is disabled.
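That fallback could look something like this (fetch() is a placeholder, not an apt API): a small pool of workers, each holding one connection, fetching files in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    return f"contents-of({url})"  # stand-in for one HTTP GET

def fetch_parallel(urls, connections=4):
    # each worker uses its own connection; map() preserves input order
    with ThreadPoolExecutor(max_workers=connections) as pool:
        return list(pool.map(fetch, urls))
```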

MfG
        Goswin
