Re: Rambling apt-get ideas
Yes, that was kind of my point.
An analogy would be that we don't need dpkg because most of its
functionality could be done by a mixture of tar, gzip, and perl (and maybe
make to handle dependencies).
My point being that yes, I already use squid as a proxy server for a whole
network of apt-getting debian boxes, and after only a little work it works
OK, but something using IP multicast would be better due to lower network
utilization. True, doing simultaneous upgrades everywhere means that
eventually some upgrade will kill all the machines at once, and my high
end Pentiums are going to decompress the gzipped parts much faster than my
old 386s. But there are probably ways around that: just because all the
.debs are distributed in one multicast burst doesn't mean they have to be
installed all at once. Anyway, squid does not do IP multicast to multiple
simultaneous clients, last time I checked. Another cool ability of an
integrated cache would be that the "fetching" machine could maintain a
list of all the machines it pushed a new .deb to, and clear the .deb from
the cache once every "client" machine has its copy. With a squid solution,
squid has to guess whether it's OK to clear a cached .deb based upon
access time, size, etc. Even worse, my squid only caches files smaller
than 8 megs, so each machine downloads its own copy of emacs and the like.
A cache for general web use "works", but a cache designed specifically
for .deb packages would work better.
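To give a concrete idea, the squid approach amounts to roughly the
following (the hostname "cachebox" is made up for illustration). The apt
side is just the standard proxy option; the squid side raises the object
size limit so that multi-megabyte .debs like emacs actually get cached:

    # /etc/apt/apt.conf on each client box
    Acquire::http::Proxy "http://cachebox:3128/";

    # squid.conf on the cache box (squid 2.x syntax) -- raise the limit
    # on cached object size so big .debs are kept
    maximum_object_size 100 MB

Even with that, squid still has no idea which of its objects are .debs
that every client already holds a copy of.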
For most of the ideas I brought up, I already do something similar, in a
manual and hackish manner, involving ugly perl scripts and config files I
would be embarrassed to show publicly.
I suppose I could configure my two dozen workstations at work "all at
once" and "remotely" by doing some kind of weird hack with expect and ssh.
But it might be cooler to do that directly with debconf, again using IP
multicast.
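Without expect, and assuming ssh keys are already set up so there is no
password prompt, the hack is only a few lines of perl (hostnames and
package name made up for illustration):

    #!/usr/bin/perl -w
    # Run the same package operation on a list of workstations over ssh.
    use strict;
    my @hosts = qw(ws01 ws02 ws03);    # ...two dozen of these
    for my $host (@hosts) {
        system("ssh", "root\@$host", "apt-get -y install somepackage") == 0
            or warn "failed on $host: exit status $?\n";
    }

But that is still two dozen separate unicast downloads, which is exactly
what multicast would avoid.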
Or another example: a network-wide shared apt-get cache. I suppose you
could just NFS mount all the machines onto one apt-get cache on one
machine. There might be file locking issues. There would be security and
authentication issues. The one server would have to have all the disk
space for the cache. And it would be a manual PITA to configure for each
machine involved. It would be cooler, cleaner, and more efficient to have
the system provide the same functionality as a core feature.
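The NFS version would look something like this in /etc/fstab on every
client ("cachebox" is again a made-up server name), pointing everyone's
archive directory at one box:

    # share one apt-get download cache across the whole network
    cachebox:/var/cache/apt/archives  /var/cache/apt/archives  nfs  rw  0  0

And that still leaves the locking problem, since apt takes a lock file in
the archives directory and assumes it is local.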
Another example is adding transport protocols to apt-get. I suppose that,
given a strange brew of named pipes, NFS mounts, loopback devices, and
"file:" lines in /etc/apt/sources.list, I could find a way for apt-get to
pull .debs over freenet, or over FSP, or over DCC chat on IRC.
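For instance, some external program could fetch the .debs over the exotic
transport into a local directory, and then a "file:" line makes apt-get
treat that directory as an archive (paths made up; dpkg-scanpackages comes
with dpkg-dev):

    # 1. fetch .debs over freenet/FSP/DCC into /var/local/weird-mirror
    # 2. build a Packages index for the directory:
    #    cd /var/local/weird-mirror && \
    #      dpkg-scanpackages . /dev/null | gzip > Packages.gz
    # 3. then in /etc/apt/sources.list:
    deb file:/var/local/weird-mirror ./

But that is a pull into a staging directory, not a real transport method
inside apt-get itself.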
The general idea of my post is that I do some unusual hacks involving
apt-get already, and I can think of even stranger and more useful hacks.
But why make and use a weird custom hack when the idea could be cleanly
built right into the infrastructure instead, for everyone to automatically
and easily use? (Although I don't know apt-get well enough to do it
myself.)
Matt Zimmerman wrote:
On Wed, Dec 27, 2000 at 02:03:14PM -0600, Vince Mulhollon wrote:
> How about an "apt-getd" debian daemon.
> Use an apt-get client to remotely mess with another workstation's packages.
> Messing with only one workstation at a time is boring. How about using it
> to configure a hundred workstations instead, all at once? And then have a
> proxying apt-getd server multicast out the .deb files to all the machines
> at the same time?
You can do this already with apt-get and squid.