
Bug#658346: APT does not work with a repository at Sourceforge



David,

Thanks for the quick response.

> I have to ask: Where do you got that version from? ;)
> (as this potential confuses the hell out of the BTS i removed that
>  marker and set oldstable as found instead.)

It is the current version with Ubuntu 11.10, but the same
problem exists in older versions of apt.

apt-get --version
apt 0.8.16~exp5ubuntu13 for i386 compiled on Oct  6 2011 15:25:29


>> When attempting to install a package which is held in a repository at
>> Sourceforge, it reports the error:
>> 
>> Err...
>>  Got a single header line over 360 chars
>> Failed to fetch...
>>  Got a single header line over 360 chars

> Can you tell us the repository for testing proposes?

Sure, it is the repository for a package called openfoam210 for Ubuntu.
It is described here:
http://www.openfoam.org/download/ubuntu.php

The repository is set up by copying the following into a terminal:

VERS=`lsb_release -cs`
sudo sh -c "echo deb http://www.openfoam.org/download/ubuntu $VERS main > /etc/apt/sources.list.d/openfoam.list"
sudo apt-get update

Then the following should install the package:
sudo apt-get install openfoam210

but it fails when apt attempts to download the package from
Sourceforge: Sourceforge redirects the download to one of its mirror
servers using a very long URL.

The problem is also discussed here:
http://www.openfoam.com/mantisbt/view.php?id=399

>> Simply increasing MAXLEN to 500, or 1000 to be safe, would fix this
>> problem.

> This limit looks pretty random, i guess we could life without that
> entirely, but i will take a bit of time to meditate about the possible
> reasoning behind this (as it is included and unchanged since nearly
> the start of APT development back in the last century…).

It may be that this is a hang-up from the last century.  A search on
the internet indicates that servers can handle pretty much any length
of URL, but some browsers etc. are limited to a little over 2000.  As
one guy put it, "the real-world limit for URLs is about 2000
characters", which seems to carry some truth.

If there is concern about removing the limit entirely, how about at
least increasing it?  2000 could be a good starting point.

Thanks,

Chris


