
Re: Question about proper archive area for packages that require big data for operation



On 23/04/13 10:48, Laszlo Kajan wrote:
> free packages that depend on big (e.g. >400MB) free data outside 'main'

This comes up in the Games Team, too.

Here are some possibilities you might not have considered:

* Package a small "demo" data-set (enough to test that the package is
  working correctly) in main; provide instructions to get the
  "full-fat" data-set from elsewhere. I think VegaStrike used to do
  this with its music, shipping a lower-quality encode in Debian and a
  full-quality encode elsewhere? Games also often do this for legal
  rather than size reasons, with an engine in contrib, demo/shareware
  data in non-free, and instructions to replace the demo data with the
  non-distributable full game if you own it; e.g. Quake II used to be
  packaged like this.

* Split the data-set into reasonably-sized packages so it at least
  gets better incremental downloads and splits more evenly across
  CDs/DVDs
  (bonus points if the source packages are segregated by update
  frequency, so only the frequently-updated parts normally need
  uploads). I did this with openarena-data (after some brief discussion
  with the ftp-masters and the debian-cd maintainer) because I was sick
  of uploading half a gigabyte of textures, etc. every time there was a
  bug in the game scripting. They suggested that I should aim for 100MB
  packages as a reasonable compromise between splitting too coarsely
  and too finely.
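The splitting idea above can be sketched with standard tools. This is a
minimal illustration, not how openarena-data is actually split: the file
names, directory layout, and the 2MB chunk size are made-up stand-ins (a
real data set would use chunks closer to the suggested 100MB, each
becoming the payload of its own binary package):

```shell
set -e

# Stand-in for a large upstream data tree (here: one 5MB dummy file).
mkdir -p demo-data
dd if=/dev/zero of=demo-data/textures.bin bs=1M count=5 2>/dev/null

# Split the large file into fixed-size, numerically-suffixed pieces;
# each piece could then be shipped in a separate data package, so a
# change to one part only requires re-uploading that part.
split -b 2M -d demo-data/textures.bin demo-data/textures.part.

# 5MB at 2MB per piece yields three parts: .00, .01, .02
ls demo-data/textures.part.*
```

In practice the more useful split boundary is by directory or by update
frequency (e.g. textures vs. game scripts) rather than by raw byte
offset, since each resulting package must still make sense on its own.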

One of these days I might try to construct a "demo" build of OpenArena
(2 player models, 2 levels and all the weapons, or something;
incompatible with "the real OpenArena" for network games, but running on
the same engine, and considerably smaller).

    S

