Re: Reasonable maximum package size?
On 06/05/07 08:58, Frans Pop wrote:
On Tuesday 05 June 2007 15:14, Anthony Towns wrote:
I'm not sure if avoiding duplicating the data (1G of data is bad, but
1G of the same data in a .orig.tar.gz _and_ a .deb is absurd) is enough
to just use the existing archive and mirror network, or if it'd still
be worth setting up a separate apt-able archive under debian.org
somewhere for _really_ big data.
IMO it would be worth it if we could split out gigabytes of data from the
main archive and thus significantly reduce the bandwidth needed for
mirror syncs. Especially if that data is only used by an extremely small
subset of users/developers.
The advantages would be:
- overall reduced use of resources like disk space and bandwidth
- lower the barrier to create local mirrors, not only for home users,
but also for mirrors in areas that are not that well connected to
the rest of the world 
- make it possible to not include such data on the regular binary CDs,
but for example on separate arch-independent "data" CDs
It is likely that this issue will only become bigger with time, so
investing in a structural solution IMO makes sense.
For example, this was a real problem when I was in Bhutan last year.
What about putting such data in a special branch (correct term?),
parallel to main, contrib and non-free? That way, mirror sites can
decide whether or not to mirror it? Call it "hugedatasets"?
(Boring name, but explicitly describes the contents.)
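To make the idea concrete, such a component could be enabled (or left out) per mirror and per client just like contrib and non-free are today. A hypothetical sources.list line, assuming the component were actually named "hugedatasets", might look like:

```
deb http://ftp.debian.org/debian unstable main contrib non-free hugedatasets
```

Mirror operators who don't want the bulk data would simply exclude that component from their sync configuration.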
The other idea that I really like is to package a cron script which
periodically compares the timestamps of the data files on the FTP server
against the timestamps of the local files and, if the FTP files are
newer, sends an email to the user. And, of course, provide a script to
do the wgets/curls and data installs. That way, you don't use
double the space on your system.
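A minimal sketch of what such a cron script could look like, assuming an HTTP(S)-accessible mirror (curl's -I fetches headers only, so the check itself costs almost no bandwidth); the URL, file paths, and mail recipient are all placeholders:

```shell
#!/bin/sh
# Sketch: notify the user when a remote data file is newer than the
# local copy. Intended to be run from cron; all names are hypothetical.

remote_is_newer() {
    # $1 = remote mtime (epoch seconds), $2 = local file path.
    # Missing local file counts as mtime 0, i.e. always outdated.
    local_mtime=$(stat -c %Y "$2" 2>/dev/null || echo 0)
    [ "$1" -gt "$local_mtime" ]
}

check_and_notify() {
    url="$1"; localfile="$2"; user="$3"
    # Fetch only the headers and pull out Last-Modified.
    lastmod=$(curl -sI "$url" | awk -F': ' 'tolower($1)=="last-modified"{print $2}')
    remote_epoch=$(date -d "$lastmod" +%s) || return 1
    if remote_is_newer "$remote_epoch" "$localfile"; then
        echo "Newer data set available at $url" \
            | mail -s "data update available" "$user"
    fi
}
```

The companion install script would then just be a curl/wget download straight into the data directory, so no second copy of the data ever exists on disk.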
P.S. I *Really* Appreciate All The Hard Work You All Do.
Ron Johnson, Jr.
Jefferson LA USA
Give a man a fish, and he eats for a day.
Hit him with a fish, and he goes away for good!