
Re: apt-get wrapper for maintaining Partial Mirrors



Joseph Rawson <umeboshi3@gmail.com> writes:

> On Saturday 20 June 2009 03:16:33 Goswin von Brederlow wrote:
>> But now you made me think about this too. So here is what I think:
>>
>> - My bandwidth at home is fast enough to fetch packages directly. No
>>   need to mirror at all.
>>
>> - I don't want to download a package multiple times (once per host) so
>>   some shared proxy would be good.
>>
> My idea would keep that from happening, at the expense of latency.  The 
> latency would be minimal, as it would just be dependent on reprepro 
> retrieving the package(s) and signalling the client that the package is 
> ready.  Using reprepro to add extra packages to the repository from upstream 
> without doing a full update may not be possible, but if it were, the latency 
> would certainly be minimal, and the bandwidth to the internet would also be 
> minimal.  I just looked at the manpage again, and this may be possible by 
> using the --nolistsdownload option with the update/checkupdate command.
>
>
>> - Bootstraping a chroot still benefits from local packages but a
>>   shared proxy would do there too.
>>
>> - When I'm not at home I might not have network access or only a slow
>>   one so then I need a mirror. And my parents computer has a Linux that
>>   only I use and that needs a major update every time I visit.
>>
>> So the ideal setup would be an apt proxy that stores the packages in
>> the normal pool structure and has a simple command to create
>> Packages.gz, Sources.gz, Release and Release.gpg files so the cache
>> directory can be copied onto a USB disk and used as a repository of
>> its own.
>>
> Getting reprepro to do this would save a lot of the hassle, but getting 
> reprepro to act as an apt proxy is also tricky.  The caching and proxying 
> done by the apt-proxy and apt-cacher packages don't produce as good a 
> repository as reprepro does.
>
> The Release could be signed using an rsign method with the machine(s) that 
> manage the repository, or it could be done locally on the server using 
> gpg-agent, or an unencrypted private key, depending on how the administrator 
> prefers to manage it.

The simplest implementation would be a tiny proxy applet that, when a
deb file is requested, checks whether the file is already in the local
archive. If it is, it sends it. If not, it requests the file from
upstream and pipes it both to apt (no added latency) and to a
tempfile. When the download has finished, it hands the file to
reprepro (includedeb suite file.deb). Doing the same for sources is a
little more tricky, as you need the .dsc and its related files as a
group.
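A minimal sketch of that applet logic, assuming a local archive tree
under /srv/mirror and leaving the HTTP transport out; all names here
are illustrative, not an existing tool's API:

```python
import os
import shutil
import subprocess
import tempfile

POOL = "/srv/mirror"  # local archive tree managed by reprepro (assumption)

def pool_path(request_path):
    """Map a request like /pool/main/h/hello/hello_1.0_i386.deb
    onto the local archive tree."""
    return os.path.join(POOL, request_path.lstrip("/"))

def serve_deb(request_path, client, fetch, suite="sid"):
    """Send a .deb to `client` (any writable binary file object).

    `fetch(path)` must yield byte chunks from upstream; it is a
    parameter so the transport stays pluggable.  Returns "hit" or
    "miss" so the caller can log cache behaviour.
    """
    local = pool_path(request_path)
    if os.path.exists(local):          # already in the pool: just send it
        with open(local, "rb") as f:
            shutil.copyfileobj(f, client)
        return "hit"

    # Not in the pool: pipe upstream both to apt (no added latency)
    # and to a tempfile.
    with tempfile.NamedTemporaryFile(suffix=".deb", delete=False) as tmp:
        for chunk in fetch(request_path):
            client.write(chunk)        # forward to apt immediately
            tmp.write(chunk)           # and keep a copy
        name = tmp.name

    # When the download has finished, let reprepro pull the file into
    # the pool (skipped here when reprepro is not installed).
    if shutil.which("reprepro"):
        subprocess.run(["reprepro", "includedeb", suite, name], check=False)
    os.unlink(name)
    return "miss"
```

The same shape would work for sources, except that the includedsc step
has to wait until the .dsc and all files it references have arrived.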

>> Optional the apt proxy could prefetch package versions but for me that
>> wouldn't be a high priority.
>>
>> Nice would be that it fetches sources along with binaries. When I find
>> a bug in some software while traveling I would hate to not have the
>> source available to fix it. But then it also needs to fetch
>> Build-depends and their depends. So that would complicate matters a
>> lot.
> I mentioned that part above.
>>
>> MfG
>>         Goswin
>
> Overall, I think that reprepro does a good job of maintaining a local 
> repository, and we shouldn't reimplement what it does.  Reprepro also seems 
> flexible enough to implement most of the backend with simple commands and 
> options.  I've never tried to implement a new apt-method before, so I think 
> that would take a bit more research from me.

I totally agree that using reprepro as the cache/storage backend would
be a great use of existing software.

The problem I have with making it an apt method is that the apt method
runs on a different host than reprepro. Altering the reprepro filter
would then require ssh logins from all participating clients, or
something similar.
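For reference, the filter in question is the FilterList of reprepro's
conf/updates (file names here are examples):

```
# conf/updates -- pull only listed packages from upstream
Name: debian
Method: http://ftp.de.debian.org/debian
Suite: sid
Components: main
Architectures: i386 source
FilterList: deinstall wanted-packages

# wanted-packages, in dpkg --get-selections format:
#   hello install
#   mc    install
```

Every request for a new package would mean editing wanted-packages on
the reprepro host, which is exactly what the clients cannot do without
a login there.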

MfG
        Goswin

