Why not have a "distributed" package?
We were trying to run a network of Debian Linux machines for the PhD
students of the CS department at NYU, with the following layout:

- a standard client with basic functionality, say:
  - basic software development tools
  - X windows with, perhaps, one window manager
  - other basic utilities
The main point of the client is to be a fully autonomous system with
basic functionality in case of a server failure.
The server is essentially a superset of a client, having all the extra
client packages installed under, say, a /common file tree, exported
read-only over the network. This lets us keep big and/or quickly
changing and/or rarely used packages like Gnome, KDE, obscure
compilers, etc. off the local disks of the clients.
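As an illustration of the layout above, the server-side export might
look something like the following. The post does not name the export
protocol; NFS is assumed here, and the host names are invented:

```
# /etc/exports on the server (hypothetical): the shared package tree
# under /common is exported read-only to the client machines.
/common   client1.cs.nyu.edu(ro) client2.cs.nyu.edu(ro)
```

Each client would then mount /common read-only and fall back to its
local packages if the mount disappears.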
The idea is to simplify upgrades, reduce upgrade bandwidth (instead of
upgrading Gnome or KDE on every client, just upgrade the server), and
reduce local client disk storage requirements. In order to do that we
needed the ability to install packages under a different directory.
Unfortunately we ran into the following problem. Take, for example,
installing WindowMaker under /common: given that it is an X
application, its install scripts have hooks into the menu system, and
the menu hierarchy and all of its hooks live under /etc/... The
installation fails with configuration errors.
I understand that this would violate the Filesystem Standard, but this
functionality is essential if Debian is planning to penetrate the
market of department-wide installations. Software distribution and
update with typical release cycles of a couple of months (which is
very common in the Linux world) is an extremely painful process.
Keeping everything local and using apt-get upgrade for a moderate
700MB installation is not a solution even if you have a dedicated
mirror of the Debian ftp site. I am not sure what could be done in
this respect. I love the ability of Debian to update the menu system,
and there should be a way of maintaining the tight integration of
packages with the menu system while still letting a distributed
installation work.
Perhaps the idea of a package that is installed as a "distributed"
package could work: split the package's archive files from its
configuration. The archive goes onto a server, while the configuration
is installed on the clients. This could also help with dependency
problems (say I want to install something locally that depends on a
distributed package).
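The split could be modeled roughly as follows. This is not real dpkg
behaviour, just a sketch of the proposal; the class name, file paths,
and package contents are all invented for illustration:

```python
# Hypothetical model of a "distributed" package: the archive part is
# unpacked once under the server's /common tree, while the
# configuration part is unpacked on every client.
from dataclasses import dataclass, field

@dataclass
class DistributedPackage:
    name: str
    archive_files: list              # go to the server, under /common
    config_files: list               # go to each client, e.g. under /etc
    depends: list = field(default_factory=list)

def install(pkg, server_fs, client_fs):
    """Place archive files on the server and config files on the client.

    The filesystems are modeled as plain dicts mapping path -> owner."""
    for f in pkg.archive_files:
        server_fs["/common" + f] = pkg.name
    for f in pkg.config_files:
        client_fs[f] = pkg.name

server, client = {}, {}
wmaker = DistributedPackage(
    name="wmaker",
    archive_files=["/usr/bin/wmaker"],
    config_files=["/etc/menu-methods/wmaker"],
)
install(wmaker, server, client)
print(server)   # the binary lands under /common on the server
print(client)   # the menu hook lands locally on the client
```

The point of the split is exactly the WindowMaker case above: the
menu-system hooks stay on each client, so the configuration step no
longer fails, while the bulky archive lives on the server.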
Of course, it would be nice to have some failure detection as well:
when a distributed package becomes unavailable due to a server failure
or whatever, the entire subtree of packages depending on it becomes
unavailable, and the system reports that, due to dependency issues,
the package is temporarily disabled. This would perhaps require a
filesystem with more features than ext2 has, for example a field in
the i-node triggering a package-database lookup on each access to the
executable file.
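The failure-detection part of the idea can be sketched without any
filesystem support: given the dependency graph, compute which packages
must be reported as temporarily disabled when one distributed package
drops out. The package names and graph below are made up:

```python
# Sketch: when a distributed package becomes unreachable, every
# package that transitively depends on it is disabled along with it.
def disabled_by(unavailable, depends):
    """Return the set of packages disabled when `unavailable` is gone.

    `depends` maps each package name to the packages it depends on."""
    disabled = {unavailable}
    changed = True
    while changed:           # propagate until the set stops growing
        changed = False
        for pkg, deps in depends.items():
            if pkg not in disabled and disabled.intersection(deps):
                disabled.add(pkg)
                changed = True
    return disabled

depends = {
    "wmaker": ["x-server"],
    "wmaker-themes": ["wmaker"],
    "x-server": [],
    "vim": [],
}
# The server holding wmaker goes down:
print(sorted(disabled_by("wmaker", depends)))  # ['wmaker', 'wmaker-themes']
```

Packages with no path to the missing one (vim, x-server here) stay
usable, which is the behaviour the proposal asks for.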
Oops... I am going off on a tangent here. Too much dreaming... Anyway,
the point of a distributed package is crucial.
I am not aware of the TODO list for dpkg or apt (I could not get hold
of the URL), so please don't be too harsh on me if this is something
you already have in development or are planning.
And yes, my last name is Akkerman, and I don't think I am in any way
related to Wichert. And yes, my favorite distribution is Debian. And,
unfortunately, the CS department is going with RedHat for the base
system :( (I know, rpm does not have this functionality either, so
somebody here at NYU who knows RedHat better than Debian came up with
a different installation/upgrade strategy, distribution-neutral but
using RedHat, which ideally should have been handled at the level of
the package manager.)