
Re: Reasonable maximum package size ?

Frans Pop wrote:
On Tuesday 05 June 2007 15:14, Anthony Towns wrote:
I'm not sure if avoiding duplicating the data (1G of data is bad, but
1G of the same data in a .orig.tar.gz _and_ a .deb is absurd) is enough
to just use the existing archive and mirror network, or if it'd still
be worth setting up a separate apt-able archive under debian.org
somewhere for _really_ big data.

IMO it would be worth it if we could split out gigabytes of data from the main archive and thus significantly reduce the bandwidth needed for mirror syncs. Especially if that data is only used by an extremely small subset of users/developers.

The advantages would be:
- overall reduced use of resources like disk space and bandwidth
- lower the barrier to create local mirrors, not only for home users,
  but also for mirrors in areas that are not that well connected to
  the rest of the world [1]
- make it possible to not include such data on the regular binary CDs,
  but for example on separate arch-independent "data" CDs
Debian: 15 CDs, 2 DVDs, plus 1 DVD for the human genome, 2 DVDs of games, and 2 DVDs of other data :P

That would be really fun...

And I agree with this... LOL.

Also, I agree with dividing large packages into smaller ones (pkg-1 + pkg-2 = PKG).
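As a sketch of what such a split could look like, here is a hypothetical debian/control where the large, architecture-independent data moves into a companion "Architecture: all" package (the package names and descriptions are made up for illustration):

```
Source: pkg
Section: science
Priority: optional

Package: pkg
Architecture: any
Depends: pkg-data (= ${source:Version}), ${shlibs:Depends}
Description: tools for pkg
 The (small) architecture-dependent binaries.

Package: pkg-data
Architecture: all
Description: data files for pkg
 The large architecture-independent data, split out so the
 binaries stay small and the data is built and mirrored
 only once instead of once per architecture.
```

The strict versioned dependency keeps the tools and the data in lockstep across upgrades.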

ajt also touched on a real point: the orig.tar.gz would be huge as well. So maybe introducing some kind of packaging that would "forget" the orig.tar.gz would be nice.

Dividing the data into separate sections like main/contrib/non-free/DATA/DATA-non-free/DATA-contrib would be good. A few "faster" mirrors could support the few users who need this kind of package.
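From the user's side, such a split archive could just be one extra apt line. A hypothetical sources.list sketch (the data.debian.org host and the data* section names are made up for illustration; no such archive exists):

```
# regular archive, mirrored everywhere
deb http://ftp.debian.org/debian etch main contrib non-free

# hypothetical separate data archive, carried only by a few
# well-connected mirrors
deb http://data.debian.org/debian-data etch data data-contrib data-non-free
```

Users who never need the big data packages simply would not add the second line, and small mirrors would not have to carry that archive at all.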
It is likely that this issue will only become bigger with time, so investing in a structural solution IMO makes sense.
That makes complete sense.


[1] This was for example a real problem when I was in Bhutan last year.
