
Re: Required help on local Debian mirror



On Wed, Aug 30, 2017 at 12:31:06PM -0500, David Wright wrote:
> On Wed 30 Aug 2017 at 17:27:31 (+0200), Christian Seiler wrote:
> > Hi there,
> > 
> > On 2017-08-29 11:57, Kala Techies wrote:
> > >I am using (Debian GNU/Linux 6.0.10 (squeeze)) in my environment and I
> > >want to update all systems using one local mirror.
> > 
> > I don't think it's a good idea to set up a real local mirror,
> > as that means you'll download the entire archive, which is
> > likely going to be a _lot_ more stuff (especially if you
> > download all available architectures) than upgrading each
> > machine individually.
> > 
> > What you'll want instead is to set up a local proxy server
> > that'll cache the packages. This way you'll only download
> > what you actually need, but you'll also only download it
> > once.
> > 
> > I can recommend the apt-cacher-ng package for that.
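
(For reference: pointing a client at apt-cacher-ng usually takes only a
sources.list or apt.conf change; "cachehost", the mirror path and the
suite below are examples, 3142 being apt-cacher-ng's default listen
port:)

```
# On each client, either route sources.list through the cache host:
deb http://cachehost:3142/ftp.debian.org/debian squeeze main
# ...or leave sources.list alone and set an apt proxy instead,
# e.g. in a file under /etc/apt/apt.conf.d/:
Acquire::http::Proxy "http://cachehost:3142";
```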
> 
> However, be prepared for problems if you run a version of
> apt-cacher-ng as old as squeeze's.
> 
> I still run apt-cacher-ng on a wheezy machine and have had to switch
> between the backports and backports-sloppy versions, currently the
> latter, 0.9.1-1~bpo7+1. The main failures have been (1) expiration of
> old packages¹, (2) new compression schemes² for Packages files, (3)
> new InRelease files and (4) servicing apt-listbugs³ searches. I use it
> for wheezy and jessie, but have made no attempt to use it with
> stretch; is that when hashed indexing started?
> 
> I don't know how many of these issues will affect a constituency of
> totally squeeze PCs; I guess that depends on whether the mirrors
> being used have been updating their apt methods, and if there are
> squeeze backports.
> 
> ¹ i.e. the archive grows forever.
> ² e.g. .xz and/or .bz2 files.
> ³ my current command sequence for upgrading is the unwieldy
> # apt-get -o Acquire::http::Proxy="http://192.168.1.19:3142/" update
> # apt-get -d -o Acquire::http::Proxy="http://192.168.1.19:3142/" upgrade
> # apt-get upgrade
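
(Side note: on clients where the apt-listbugs complication doesn't
apply, the repeated -o options can be replaced by a one-line file under
/etc/apt/apt.conf.d/, so that plain apt-get update / upgrade go through
the cache; the file name below is arbitrary and the address is the one
from the commands above:)

```
# /etc/apt/apt.conf.d/01proxy  (name arbitrary)
Acquire::http::Proxy "http://192.168.1.19:3142/";
```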

Some years ago I had much success running the squid proxy server on a
gateway / router box, and as long as the sources for your internal
network boxes use protocols that squid caches, it works well.

apt-cacher-ng and the like are a little more intelligent though,
AFAIK, knowing for example to immediately discard things which always
change... but perhaps someone has already written equivalent squid
rule files...
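
(For anyone going the squid route, the usual trick is a refresh_pattern
that pins .deb files - which never change once published - while
leaving the frequently-changing index files alone; a rough squid.conf
sketch, an untested assumption rather than a recipe:)

```
# Cache .deb packages for up to 90 days (129600 min); they are
# immutable once published, so a long maximum age is safe.
refresh_pattern \.deb$ 129600 100% 129600
# Packages/Sources/Release indices change often - keep them fresh.
refresh_pattern (Packages|Sources|Release)(\.(gz|bz2|xz))?$ 0 20% 60
# Raise the object size limit so large packages are cacheable at all.
maximum_object_size 512 MB
```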

As a bonus, squid caches other things too - e.g. images, web
content etc. - someone watches a 300 MiB YouTube video and emails a
co-worker, who immediately also watches it - the second one hits the
cache. That was years ago though, so these days you might need extra
config for such sites:

https://wiki.squid-cache.org/ConfigExamples/DynamicContent/YouTube

https://serverfault.com/questions/9281/how-can-i-cache-youtube-videos-with-squid-cache

How feasible it is, and what's required, to "break HTTPS" and make
squid work with e.g. https://youtube.com is another question :)

