On Tuesday 05 June 2007 15:14, Anthony Towns wrote:
> I'm not sure if avoiding duplicating the data (1G of data is bad, but
> 1G of the same data in a .orig.tar.gz _and_ a .deb is absurd) is enough
> to just use the existing archive and mirror network, or if it'd still
> be worth setting up a separate apt-able archive under debian.org
> somewhere for _really_ big data.

IMO it would be worth it if we could split out gigabytes of data from the main archive and thus significantly reduce the bandwidth needed for mirror syncs. Especially if that data is only used by an extremely small subset of users/developers.

The advantages would be:
- overall reduced use of resources like disk space and bandwidth
- lower the barrier to creating local mirrors, not only for home users, but also for mirrors in areas that are not that well connected to the rest of the world; this was for example a real problem when I was in Bhutan last year
- make it possible to not include such data on the regular binary CDs, but for example on separate arch-independent "data" CDs

It is likely that this issue will only become bigger with time, so investing in a structural solution IMO makes sense.

Cheers,
FJP