Re: Bundling .debs
We are managing around 120 computers under Debian, first with an
unstable snapshot taken between potato and woody, and now with woody.
They are not all of the same type (say, 4 different models),
with 3-4 different software configurations.
The whole installation process is managed by FAI (available in woody)
and is good enough for our needs. Perhaps for yours, too.
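For the per-model and per-role variation, FAI's class mechanism covers it:
each class gets its own package list. A sketch of such a list for a
hypothetical class WEBSERVER (the path and package names are made up, check
the FAI guide for the config space location on your install server):

```
# package_config/WEBSERVER -- packages installed on machines in this class
PACKAGES install
apache
ssh
```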
The main problem we had, and one that is not solved by the proposal
quoted below, is packages which do not use debconf; exim is an example.
As for distributing debconf data, you may use an LDAP backend.
We haven't tested it, but IMHO configuring debconf to share a backend
seems better than copying the whole database around.
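The LDAP backend is a database stanza in /etc/debconf.conf; roughly like
this (server name, DNs and password are placeholders, see debconf.conf(5)
for the exact field names):

```
Name: ldapdb
Driver: LDAP
Server: ldap.example.com
BaseDN: cn=debconf,dc=example,dc=com
BindDN: cn=admin,dc=example,dc=com
BindPasswd: secret
```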
> I think what I'd do if I were to maintain a cluster:
> create a local debian repository, with a local 'beta' and 'final'
> distribution (like testing and stable, but I think it's better not to
> use those names).
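Such a split maps directly onto the clients' sources.list; the hostname
here is hypothetical:

```
# test machines:
deb http://apt.internal.example.com/debian beta main
# production cluster:
deb http://apt.internal.example.com/debian final main
```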
> The test machine(s) obviously point to beta, the cluster to final. So,
> every new package would be put into beta, where it could be tested and
> then it would be propagated into final (totally manual process. Could be
> semi-automated by syncing beta to (for instance) Debian stable+security
> on a regular basis).
> The only remaining problem would then be distributing debconf data. I'm
> not familiar enough with that to have a proposal here, but I'm sure you
> can come up with something.
> A cron job on all production machines would then look like:
> - sync debconf db with master
> - apt-get update && apt-get dist-upgrade
> (Hmmm. this does only cover installed pkgs. So a way to make sure new
> packages are installed is required, too. But, since I would only expect
> to-be-installed packages in the local 'final' repository, this is easy.
> grep the Packages file and install everything available).
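The "grep the Packages file" step could be sketched like this (the sample
index contents are made up for illustration, and the actual apt-get calls
are left as comments since they only make sense on a real client):

```shell
#!/bin/sh
# Build a sample Packages index as apt would fetch it from the
# local 'final' distribution (contents invented for this sketch).
cat > /tmp/Packages.sample <<'EOF'
Package: exim
Version: 3.35-1
Package: fai
Version: 2.2
EOF

# Every package in the local repository is meant to be installed,
# so simply extract all names from the index.
wanted=$(awk '/^Package:/ {print $2}' /tmp/Packages.sample)
echo "$wanted"

# The real cron job would then run, roughly:
#   apt-get update && apt-get -y dist-upgrade
#   apt-get -y install $wanted
```

On a real client, /tmp/Packages.sample would instead be the index apt has
already fetched under /var/lib/apt/lists/ for the 'final' distribution.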
> -- vbi
> this email is protected by a digital signature: http://fortytwo.ch/gpg