Re: Centralizing apt-get downloads
[sorry Dave - sent this to you by mistake]
Dave Sherohman wrote:
> On Tue, Sep 18, 2001 at 11:17:19AM -0400, Andrew Perrin wrote:
> > My question is this: effectively, each time they upgrade, I'm downloading
> > three copies of each package separately. Is there a relatively easy way to
> > archive the files locally and have the two boxen behind the ipmasq'ed
> > computer just get their updates from it?
> Sure. Here are three:
> - Check out the apt-move package if you want to build your own
> partial mirror of the official archives
> - Roll your own archive using dpkg-scanpackages if you don't want
> updates to come down automatically
> - Use NFS to export /var/cache/apt to all machines - apt-get update
> will run separately on each box, but when it comes time to install
> packages, they're already in /var/cache/apt/archives, so they won't
> be re-downloaded
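For the dpkg-scanpackages route, a rough sketch of what that looks like
(the paths here are made up - adjust to taste; needs the dpkg-dev
package for dpkg-scanpackages):

```
# On the gateway box: collect the .debs somewhere and index them
cd /var/local
dpkg-scanpackages debs /dev/null | gzip -9c > debs/Packages.gz

# Then on each client box, in /etc/apt/sources.list, point at that
# directory (here assuming /var/local is NFS-mounted; an http:// or
# ftp:// URL to the gateway would work just as well):
deb file:/var/local debs/
```

Re-run the dpkg-scanpackages line whenever you add new .debs, then
apt-get update on the clients as usual.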
And I believe a fourth is to use squid - slightly less intelligent, but
if all your sources.list files are the same, all the URLs should be
identical, so the second and third boxes get served from squid's cache.
I assume it's possible to tell apt to use a proxy?
Also it might require tweaking squid's config to cache such big files.
You might have guessed I haven't tried this :-)
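For the record, apt can be pointed at a proxy either through apt.conf or
the environment, and squid's default object-size limit (4 MB) is smaller
than plenty of .debs, so it would need raising. A sketch - proxyhost:3128
is a placeholder for wherever squid runs:

```
# /etc/apt/apt.conf on each client:
Acquire::http::Proxy "http://proxyhost:3128/";

# or equivalently, just set the environment before running apt-get:
#   export http_proxy=http://proxyhost:3128/

# squid.conf on the gateway - let squid cache large packages
# (value is in KB here; 102400 KB = 100 MB):
maximum_object_size 102400 KB
```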