Re: Stop archive bloat: 47MB gmt-coast-full_19991001-1.deb
- To: email@example.com
- Subject: Re: Stop archive bloat: 47MB gmt-coast-full_19991001-1.deb
- From: Rob Browning <firstname.lastname@example.org>
- Date: 06 Nov 1999 18:39:24 -0600
- Message-id: <email@example.com>
- In-reply-to: Brian Mays's message of "18 Oct 1999 15:05:06 -0400"
- References: <firstname.lastname@example.org> <email@example.com> <19991018131316.A12108@watervalley.net> <firstname.lastname@example.org>
Brian Mays <email@example.com> writes:
> Exactly. If the data is available elsewhere on the Internet, IMHO
> it is better to package a mechanism for retrieving this data and
> installing it into its proper place in the filesystem.
The problem is that this is untenable for users who either don't have
net access or have to pay through the nose for it. That said, for
most people, the installer-package mechanism is a better idea.
But I think with a little work we could accommodate both groups quite
well. We could define a very simple "installer package mini-policy"
which, if implemented, would make it easy to semi-automatically build
a CD with all the bits that a given set of installer packages needs,
stored where the installers can find them.
For example, we could require that every installer package provide a
script named <packagename>-getdata which, when called, fetches all the
data files that package needs and puts them into a subdirectory
named after the package, within a specified top-level directory. So
to build a CD for a given set of installer packages, you could say:
  for p in $installerpackages; do
    ${p}-getdata
  done

and you'd end up with:

  <top-level-dir>/
    <package1>/   (data files for package1)
    <package2>/   (data files for package2)
    ...
Then the packages would just have to know where to look for them.
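Under the mini-policy sketched above, an individual getdata script might look
roughly like the following. This is only an illustration: the package name,
the default top-level directory, and the download URL are all invented here,
not part of the proposal.

```shell
#!/bin/sh
# Hypothetical gmt-coast-full-getdata script under the proposed mini-policy.
# Package name, default directory, and URL are invented for illustration.
PKG=gmt-coast-full
TOPDIR=${1:-./installer-data}   # the "specified top-level directory"
DEST="$TOPDIR/$PKG"             # subdirectory named after the package

mkdir -p "$DEST"
# Here the script would actually fetch the package's data files, e.g.:
# wget -P "$DEST" ftp://ftp.example.org/pub/gmt/gmt-coast-full.tar.gz
echo "data for $PKG goes in $DEST"
```

A CD-build script could then call each package's getdata script in turn,
passing the CD staging directory as the top-level directory argument.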
Rob Browning <firstname.lastname@example.org> PGP=E80E0D04F521A094 532B97F5D64E3930