
Re: Rambling apt-get ideas




Hmm. Sorry to step in here, but

On Thu, Dec 28, 2000 at 08:22:52AM -0600, Vince Mulhollon wrote:
> My point being, that yes I already use squid as a proxy server for a whole
> network of apt-geting debian boxes and after only a little work it works

This could also be replaced by NFS, or by a small FTP server
sitting on that box which apt-gets the real packages (a similar idea
is presented below).

> OK, but something using IP multicast would be better due to lower network
> utilization.  True, doing multiple simultaneous upgrades means eventually

... in exchange for much higher disk usage - you have to replicate
most of the files to every box. The network utilization problem could be
remedied if you could restrict apt-get runs to the night, and if
packages eventually get a proper way to be installed without
asking the user questions (such an unattended run could also kill the
NFS server, so OK, you can't upgrade the NFS stuff that way).
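
(If the night-time idea appeals, a plain cron entry already gets you
most of the way; the following is only a sketch - the time, the -y and
the cron.d file name are my own assumptions:)

------ snip
# hypothetical /etc/cron.d/apt-upgrade: fetch and upgrade at 03:15 each night
15 3 * * *   root   apt-get update && apt-get -y upgrade
------ snip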

> to multiple simultaneous clients, last time I checked.  Another cool
> ability of an integrated cache would be that the "fetching" machine could
> maintain a list of all the machines it pushed the new .deb to, and when all
> the "client" machines have a copy of the new .deb, clear it from the cache.

That would also mean that you need to keep track of the total set
of machines on _every_ machine involved. Sounds like a major headache
to me. Why is everyone (else) trying to consolidate things into fewer
departmental servers?

> .deb based upon access time, size, etc.  Even worse, my squid only caches
> files less than 8 megs, thus each machine downloads its own copy of emacs,

You can tell it to cache larger files, too.
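
For reference, the knob in squid.conf is maximum_object_size; the
value below is just an example:

------ snip
# squid.conf: allow objects bigger than the default to be cached
maximum_object_size 65536 KB
------ snip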

> I suppose I could configure my two dozen workstations at work "all at once"
> "remotely" by doing some kind of weird hack with expect and ssh.  But it

Why 'expect'? I configure a local server to have top priority for
all packages, and then ssh into each box in turn to apt-get upgrade
it. You can't predict those configuration questions anyway,
unless you have set the debconf priority to "ignore
everything short of a disk failure".
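
If you really want it in one go, a trivial shell loop does the job;
the hostnames and the root login below are placeholders:

------ snip
#!/bin/sh
# upgrade each box in turn; box1..box3 stand in for your real hosts
for h in box1 box2 box3; do
        ssh root@$h 'apt-get update && apt-get -y upgrade'
done
------ snip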

> Or another example, a network wide shared apt-get cache.  I suppose you
> could just NFS mount all the machines onto one apt-get cache on one
> machine.  There might be file locking issues.  There would be security and
> authentication issues.  The one server would have to have all the disk

IMHO the main problem is that you can't upgrade any of the NFS stuff
without breaking your connection to the NFS server. Running anonymous FTP
(e.g. publicfile) or anonymous HTTP (boa?) is even easier and avoids
that problem entirely.

> lines in /etc/apt/sources.list I could find a way for apt-get to pull .debs
> over freenet, or over FSP, or over DCC chat on IRC.

Ugh. You don't want more "features", you want more features that work, no?
Who wants a _slow_ FSP or DCC transport when just trying
a bunch of HTTP and FTP servers would be easier and faster anyway?
What _would_ be nice, though, is for apt-get to understand
the Debian mirror structure, so it doesn't have to fetch all those
Packages files from every mirror you could possibly configure.

Ok. Here is what I do:

- On one machine, I do "apt-get update" and apt-get -f install or
  upgrade or ... against a fairly standard set of sources.
- On that same machine I run a small web server serving those
  downloaded files straight from /var/cache/apt/archives (why
  re-invent the wheel?).
- Also, on this machine I have a silly cron job that runs
  "make index" using the Makefile presented below; when I need
  it sooner, I just run it by hand.

- On the other machines (with different package sets, too), I have
  this first machine (or second machines, ... with similar setups)
  in sources.list (example entry below).
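
The sources.list entry on the clients then just points at that box;
the hostname and path below are placeholders, and the trailing "./"
matches the flat directory that the Makefile indexes:

------ snip
# /etc/apt/sources.list on the client boxes
deb http://aptbox.example.org/archives ./
------ snip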


Here is the Makefile:

------ snip
all:
        @echo 'Run "make index" to rebuild the package indices.'


# Rebuild the apt index files from whatever .debs (and .dsc files)
# are sitting in this directory.
index:  *.deb
        dpkg-scanpackages . /dev/null > Packages
        gzip -c9 Packages > Packages.gz
        dpkg-scansources . /dev/null > Sources
        gzip -c9 Sources > Sources.gz


clean:
        rm -f Packages* Sources*

------ snip

The next thing would be to wrap all that up properly so the box
doesn't fetch from the real source but goes through my
squid instead..., plus a script that reduces most of this
to a single command.
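
Something like this would do as a first stab; the squid host/port, and
using download-only mode, are assumptions on my part:

------ snip
#!/bin/sh
# fetch everything through the local squid, then rebuild the indices
export http_proxy=http://localhost:3128/
apt-get update
apt-get -dy upgrade     # -d = download only; drop it to upgrade this box too
cd /var/cache/apt/archives && make index
------ snip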



Best Regards,
--Toni++


