Rambling apt-get ideas
How about an "apt-getd" Debian daemon?
Use an apt-get client to remotely manage another workstation's packages.
Messing with only one workstation at a time is boring. How about multicast
to configure a hundred workstations instead, all at once? And then have a
proxying apt-getd server multicast out the .deb files to all the machines
at the same time?
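A rough sketch of that fan-out in Python; purely an illustration, with the multicast group abstracted away as a plain loop over subscribed daemons (all the class and package names here are invented):

```python
class AptGetD:
    """One per workstation; accepts install requests from the network."""
    def __init__(self, hostname):
        self.hostname = hostname
        self.installed = set()

    def handle_install(self, package):
        # A real daemon would shell out to dpkg/apt here; we just record it.
        self.installed.add(package)

class MulticastProxy:
    """Pushes one .deb to every subscribed daemon in a single pass,
    standing in for a real UDP multicast group."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, daemon):
        self.subscribers.append(daemon)

    def push(self, package):
        for daemon in self.subscribers:   # one send, a hundred installs
            daemon.handle_install(package)

proxy = MulticastProxy()
workstations = [AptGetD(f"ws{i}") for i in range(100)]
for ws in workstations:
    proxy.subscribe(ws)
proxy.push("libc6")
print(all("libc6" in ws.installed for ws in workstations))  # True
```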
Link the apt-getd daemons on multiple workstations to create a network-wide
virtual .deb cache. If workstation A downloaded the latest libc, then
workstation B's apt-getd will query workstation A and download the package
from workstation A instead of the Debian website, automatically. The
ultimate extension would be for your apt-getd on your workstation to query
apt-getd on the main Debian site, instead of configuring HTTP or FTP
transport methods. Sort of like hacking proxy features onto apt-get. Or
combine the Freenet protocol into it. Using Freenet would make
distribution of non-US and non-free very interesting.
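The peer-cache lookup could look something like this sketch, assuming each daemon keeps a simple name-to-bytes cache and a list of LAN peers (all names invented):

```python
class CacheDaemon:
    """Sketch of a per-workstation daemon that checks LAN peers for a
    .deb before falling back to the origin mirror."""
    def __init__(self, name):
        self.name = name
        self.cache = {}   # package name -> package bytes
        self.peers = []   # other CacheDaemons on the LAN

    def fetch(self, package, origin_fetch):
        if package in self.cache:
            return self.cache[package]
        for peer in self.peers:              # ask the workgroup first
            if package in peer.cache:
                self.cache[package] = peer.cache[package]
                return self.cache[package]
        # nobody on the LAN has it: go out to the Debian mirror
        self.cache[package] = origin_fetch(package)
        return self.cache[package]
```

Wire workstations A and B up as each other's peers: once A has pulled libc from the mirror, B's fetch of the same package never touches the mirror at all.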
Perhaps a shared network workgroup structure. Ten workstations with their
apt-getd's all talking to the server's apt-getd as clients. Install a new
package on the server, and after a configurable random delay, all ten
client workstations will fetch and install the same new package.
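The random-delay follow-the-server behavior, sketched minimally (the real daemon would sleep for the delay and then download; here the delay is just computed and the set copied, and every name is made up):

```python
import random

class GroupServer:
    """The workgroup server's daemon: its installed set is the target."""
    def __init__(self):
        self.installed = set()

class GroupClient:
    """Mirrors the server's package set after a randomized delay, so a
    roomful of clients doesn't hammer the network at the same instant."""
    def __init__(self, server, max_delay=600.0, rng=None):
        self.server = server
        rng = rng or random.Random()
        self.delay = rng.uniform(0, max_delay)   # seconds to wait
        self.installed = set()

    def sync(self):
        # In a real daemon this fires self.delay seconds after the
        # server announces a change; here we just copy the set.
        self.installed |= self.server.installed
```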
Hacking NIS features onto apt-get. "installedpackages.byname"?
How about selectable compression methods to balance processor speed vs
network speed? Distribute Debian as a product in .bz2 format over my 56K
modem to my fast Pentium, and then on the fly convert it to proxy totally
uncompressed .debs over my 10 meg Ethernet to my 12 meg 386.
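That recompressing proxy is easy to sketch with the stock Python bz2/gzip modules; the "reserve" name and the method strings are mine, not anything apt actually has:

```python
import bz2
import gzip

def reserve(deb_bz2: bytes, client_method: str) -> bytes:
    """Proxy sketch: the archive arrives as .bz2 over the slow modem
    link, and goes out in whatever framing the client's CPU can afford."""
    raw = bz2.decompress(deb_bz2)
    if client_method == "none":   # slow 386: ship it uncompressed
        return raw
    if client_method == "gzip":   # middle ground
        return gzip.compress(raw)
    if client_method == "bz2":    # fast Pentium: keep the dense framing
        return deb_bz2
    raise ValueError(f"unknown method: {client_method}")
```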
How about a new structure for compression. apt-get-gzip provides gzip,
apt-get-bz2 provides bzip, apt-get-uudecode provides uudecode. Automate
downloading of new compression methods or even new transport methods as
packages using the new format are downloaded. Kind of like how Microsoft's
video viewer downloaded new codecs automatically...
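A sketch of that pluggable structure: a decoder registry, plus a stand-in for "apt-get install apt-get-bz2" that pulls in a handler the first time a package arrives in that framing (the registry and function names are invented):

```python
import bz2
import gzip

# method name -> decompression function; gzip ships by default
DECODERS = {"gzip": gzip.decompress}

def fetch_decoder(method):
    """Stand-in for downloading a hypothetical apt-get-<method> package
    the first time some .deb arrives in that framing."""
    if method == "bz2":
        DECODERS["bz2"] = bz2.decompress   # pretend this was just installed
    else:
        raise KeyError(f"no apt-get-{method} package available")

def decode(method, data):
    if method not in DECODERS:
        fetch_decoder(method)              # automate the download
    return DECODERS[method](data)
```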
How about "automatic" "apt-get update"? apt-get automatically does an
update whenever it determines it is needed.
How about eliminating the need for apt-get update for upgrading purposes?
Still need to download the entire cache to browse all the packages, but the
cache is getting big enough that some kind of piecemeal "apt-get update"
or perhaps "diff" based packages would be a good idea.
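The diff idea in miniature, treating the package index as a name-to-version map (the patch format here is invented for illustration, not anything apt defines):

```python
def index_diff(old, new):
    """Minimal patch between two package indexes (name -> version):
    only changed/new entries and removals travel over the wire."""
    return {
        "add": {k: v for k, v in new.items() if old.get(k) != v},
        "del": [k for k in old if k not in new],
    }

def apply_diff(old, patch):
    """Client side: reconstruct the new index from the old one + patch."""
    merged = dict(old)
    merged.update(patch["add"])
    for k in patch["del"]:
        del merged[k]
    return merged
```

An index entry that didn't change never appears in the patch, which is the whole point: the weekly download shrinks from the whole cache to just the churn.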
Is distributing the apt-get package cache via CVS a good idea, is it
insane, is it a cool hack, or all of the above?
How about apt-get being able to somehow query and find other apt sites? No
need to edit the apt sources list manually, if you try to install
task-helix-gnome it will find Helix's site "automatically" and ask
permission before using it.
How about apt pinging the mirrors to find the closest / fastest one and
then using it, no matter where you are on the internet? How about pinging
them all every week to find the closest / fastest 3, and then pinging those
three before each file download to find the fastest mirror at that instant?
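The two-phase mirror selection, sketched with the probe injected as a function so it can stand in for a real ping or HTTP timing (function names are mine):

```python
def rank_mirrors(mirrors, probe, keep=3):
    """Weekly pass: probe every mirror, keep the fastest few.
    `probe` returns a round-trip time; smaller is better."""
    return sorted(mirrors, key=probe)[:keep]

def pick_for_download(shortlist, probe):
    """Per-file pass: re-probe only the shortlist right before each
    download and take whichever is fastest at that instant."""
    return min(shortlist, key=probe)
```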
How about using fsp as an apt-get transport method? fsp can be set to do
traffic shaping, limiting the usage to perhaps 25% of your connection, if
desired.
How about linking apt-get and debconf into one program with all the above
features?
You can already do most of the above using a bunch of different packages
and scripts, but it "might" be easier to set up as "one big daemon".
Problems would be vastly heavier requirements and major security issues.
From: <alex@fastwave.net>
To: firstname.lastname@example.org
cc: (bcc: Vince Mulhollon/Brookfield/Norlight)
Date: 12/27/2000
Subject: Nice features in a pre package manager
In addition to the previously-mentioned pipelining that installs
packages as soon as they're downloaded, so their .deb files can be deleted,
and has careful sequencing so that a package is not downloaded until
its dependencies have _already_ been met, it would be nice ...
to have the package requesting front end be pipelined and prioritized.
I might have 300MB queued for download and install, but it would be
_really_ nice to have package "xf86config" (for example) ready as soon
as possible. Once that is running, I can find out which graphics card
I need (while more downloading is happening in the background), and
so I ask for "xserver-svga" to get top priority next (while I'm
halfway through running xf86config). Once I've tweaked X the way
I want, during which time (for example) GCC was downloading,
I might ask for "utah-glx" to be prioritized. I then ask for the
kernel source, because I need some features turned on, while
bringing up the 3D support, etc etc etc.
I reckon that being able to have an apt pipeline, with the capability
to promote packages to the head of the list, will halve the time it
takes me to bring up a workstation system.
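The promotable pipeline is basically a priority queue; here's a minimal sketch with Python's heapq (the class and method names are mine, and a real apt would drain this from a download thread):

```python
import heapq
import itertools

class DownloadQueue:
    """Pending fetches ordered by priority; promote() jumps a package to
    the head of the line while in-flight downloads carry on unbothered."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()   # tie-breaker: FIFO within a priority

    def add(self, package, priority=10):
        heapq.heappush(self._heap, (priority, next(self._seq), package))

    def promote(self, package):
        # Rewrite the pending entry with top priority, then re-heapify.
        self._heap = [(0 if p == package else pr, s, p)
                      for pr, s, p in self._heap]
        heapq.heapify(self._heap)

    def pop(self):
        return heapq.heappop(self._heap)[2]
```

So with 300MB queued at the default priority, asking for xserver-svga halfway through xf86config just reorders what's left; nothing already downloaded is wasted.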
Configuring takes user time, downloading takes network time, unpacking
takes disk drive time. None of them are really competing for resources,
unless you've got <32MB of memory and dpkg has swiped the whole lot.
These tasks should all happen in parallel.
Suppose the feature is added as an "apt-get queue" to distinguish from
"apt-get install", where the former gives the prompt back immediately
and communicates with an existing apt-get (or starts a background task).
The latter plays nice if there is already an apt-get running, gets
the to-do list modified and then sleeps until the other process reports
that it has completed the request (or is being killed).
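That queue/install split might look like this sketch: one background worker drains a shared to-do list, "queue" returns the prompt immediately, and "install" sleeps on an event until the worker reports its request done (all names invented):

```python
import queue
import threading

class AptDaemon:
    """Sketch of the shared to-do list: queue_pkg() is fire-and-forget,
    install() blocks until its own request has been processed."""
    def __init__(self):
        self.todo = queue.Queue()
        self.done = {}
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            pkg, event = self.todo.get()
            # Real work (download + dpkg) would happen here, in order.
            self.done[pkg] = True
            if event:
                event.set()          # wake the blocked install()

    def queue_pkg(self, pkg):
        self.todo.put((pkg, None))   # prompt comes straight back

    def install(self, pkg):
        ev = threading.Event()
        self.todo.put((pkg, ev))
        ev.wait()                    # sleep until the worker reports back
```

Because the worker drains the list in order, anything queued before the blocking install is guaranteed done by the time install returns.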