On Wednesday 18 February 2004 22.59, Joey Hess wrote:
> More like a kilobyte per package, per Packages file. Yes, I have done
> the math, and it's not clear to me if a few kilobytes downloaded daily
> by many of our users, some on thin pipes, has a lesser cost than a few
> megabytes sitting in a few mirrors. Especially since bandwidth is
> generally more expensive than disk. That's why I asked where the
> dividing line is.

Incremental Packages files would solve this problem (rsync is said to be too resource-intensive on the servers, so let's use diffs instead and keep the last 30 daily diffs available). If someone runs apt-get update less often than that, it shouldn't be a problem for them to download the whole Packages file.

Similarly: why not create incremental packages? Especially for beasts like tetex, 99% of the package remains unchanged between releases (especially if only the Debian revision changes). I'm on broadband (well, 300kbps), and updating tetex or OO.org or some of the other big packages is a bit annoying (but manageable).

Of course, incremental packages would be technically more difficult than just diffs of the Packages file, but I guess using xdelta on the uncompressed archives might do the trick. If there are too many big compressed files inside a package, the problem becomes more difficult, of course.

I think Debian could find a lot more users and developers if it were less bandwidth-intensive - there are a lot of places where a 2Mbit link is shared by hundreds of students - and it is exactly these places where Linux is attractive because there are no license fees to pay.

(Ok, ok, I know, I should now write the code for all this... No time, like everybody else.)

cheers
-- vbi

-- 
You are what you see.
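To make the Packages-diff idea concrete, here is a rough sketch of what the server and client sides could look like, using nothing but plain diff and patch. Everything here is illustrative: the file names, the demo directory, and the two-entry Packages contents are made up, and a real implementation would also need checksums and a diff index so clients know which diffs to fetch.

```shell
set -e
mkdir -p /tmp/pdiff-demo && cd /tmp/pdiff-demo

# Stand-ins for yesterday's and today's uncompressed Packages files
# (contents are invented for the demo).
printf 'Package: foo\nVersion: 1.0\n\nPackage: bar\nVersion: 2.0\n' > Packages.old
printf 'Package: foo\nVersion: 1.1\n\nPackage: bar\nVersion: 2.0\n' > Packages.new

# Server side: publish only the small daily diff.
# (diff exits 1 when the files differ, so don't let set -e abort.)
diff -u Packages.old Packages.new > Packages.2004-02-18.diff || true

# Client side: reconstruct today's file from the local copy plus the diff,
# instead of re-downloading the whole Packages file.
cp Packages.old Packages.local
patch -s Packages.local < Packages.2004-02-18.diff

# Verify the reconstruction matches what a full download would have given.
cmp Packages.local Packages.new && echo "reconstructed OK"
```

A client that is more than 30 days behind simply falls back to fetching the full file, which keeps the server's retention window bounded.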