
Reasonable maximum package size?



Hi,

I'm packaging some neuroimaging tools that come with datasets that
are required for those tools to work properly. The largest of these
datasets is about 400 MB, and several others are well over 100 MB.

My question is: is it reasonable to ship such a large amount of data
as a package in the archive?

An alternative to a dedicated package would be a download/install
script for the data (like the one in the msttcorefonts package) that
is called from the package's postinst.
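
To make this option more concrete, here is a rough sketch of the kind
of logic such a download/install step would need. It is only an
illustration: the URL, checksum and paths are invented, and a real
postinst would more likely be a small shell script in the style of
msttcorefonts.

#!/usr/bin/env python3
# Sketch of a postinst-time dataset fetcher.  URL, checksum and install
# path are placeholders, not real values.

import hashlib
import tarfile
import urllib.request
from pathlib import Path

DATA_URL = "http://example.org/neurodata/dataset-1.0.tar.gz"   # hypothetical
DATA_SHA256 = "<known-good sha256 of the tarball>"             # placeholder
INSTALL_DIR = Path("/usr/share/neurotool/data")                # hypothetical

def fetch_and_install():
    tmp = Path("/var/cache/neurotool/dataset.tar.gz")
    tmp.parent.mkdir(parents=True, exist_ok=True)

    # Download the dataset tarball from upstream.
    urllib.request.urlretrieve(DATA_URL, str(tmp))

    # Verify the download before unpacking anything.
    digest = hashlib.sha256(tmp.read_bytes()).hexdigest()
    if digest != DATA_SHA256:
        raise SystemExit("checksum mismatch -- not installing the dataset")

    # Unpack into the directory the tools expect.
    INSTALL_DIR.mkdir(parents=True, exist_ok=True)
    with tarfile.open(tmp) as tar:
        tar.extractall(INSTALL_DIR)
    tmp.unlink()

if __name__ == "__main__":
    fetch_and_install()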

Another alternative would be a package builder like
googleearth-package. This would have the advantage that users could
easily build data packages that they can then distribute themselves.
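
Such a package builder would boil down to something like the sketch
below: fetch the upstream data and wrap it into a locally built .deb
that the user can install or pass on. Again, all names, versions and
paths are made up for illustration.

#!/usr/bin/env python3
# Sketch of a googleearth-package style helper that builds a local data
# package.  Package name, version and paths are placeholders.

import subprocess
import urllib.request
from pathlib import Path

DATA_URL = "http://example.org/neurodata/dataset-1.0.tar.gz"   # hypothetical

def build_data_package(workdir: Path) -> None:
    root = workdir / "neurotool-data_1.0_all"
    datadir = root / "usr/share/neurotool/data"
    debiandir = root / "DEBIAN"
    datadir.mkdir(parents=True, exist_ok=True)
    debiandir.mkdir(parents=True, exist_ok=True)

    # Fetch the upstream tarball and unpack it into the package tree.
    tarball = workdir / "dataset.tar.gz"
    urllib.request.urlretrieve(DATA_URL, str(tarball))
    subprocess.run(["tar", "-xzf", str(tarball), "-C", str(datadir)], check=True)

    # Minimal control file; a real helper would generate proper metadata.
    (debiandir / "control").write_text(
        "Package: neurotool-data\n"
        "Version: 1.0\n"
        "Architecture: all\n"
        "Maintainer: local build <root@localhost>\n"
        "Description: neuroimaging datasets (locally packaged)\n"
    )

    # Build the .deb next to the package tree.
    subprocess.run(["dpkg-deb", "--build", str(root)], check=True)

if __name__ == "__main__":
    build_data_package(Path("/tmp/neurotool-data-build"))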


Arguments for a download wrapper or package builder would be:

 - the datasets alone would fill yet another CD
 - only very few people actually benefit from this package (it is a
   very special-interest package)
 - the datasets change infrequently
 - it saves a lot of disk space in the archive and on the mirrors

Arguments for a package:

 - much easier to handle for users (thinking of offline machines)
 - if upstream goes offline, the relevant software packages in the archive
   become basically useless, as the required datasets are no longer
   distributed
 - disk space is rather cheap and bandwidth should not be a problem, as the
   number of downloads will remain relatively low.

There has already been some discussion about this, starting here:

http://lists.debian.org/debian-devel/2007/05/msg00207.html


I'd like to hear your comments on this.


Thanks,

Michael


-- 
GPG key:  1024D/3144BE0F Michael Hanke
http://apsy.gse.uni-magdeburg.de/hanke
ICQ: 48230050
