
Re: get debian smaller



You do not have to install all 14 CDs for Debian to work; you only have to install the first CD. In fact, I used only two disks from the site, the rescue and root disks, and installed over the Internet.
Red Hat needs a lot of CDs too if you want to install all the software that Debian has. 

Eduardo

On Sat, 11 Jan 2003 20:54:41 +0100
Christian Fasshauer <mseacf@gmx.net> wrote:

>   Hello!
> 
> I'm currently a RedHat Linux user and have recently got to know Debian 
> GNU/Linux, which seems to be much more powerful than RedHat. Perhaps 
> I'll be a Debian user soon. But unfortunately it is a very large 
> distribution with 14 CDs. That's why I have decided to do something to 
> make Debian smaller.
> You are currently using the tar archiver and gzip at the maximum 
> compression level to create the deb files. That is more efficient than 
> most packaging systems I have ever seen, but I think more could be done 
> to make the archives smaller. Tar puts a lot of unused padding into its 
> output, which compresses well, of course, but there are ways to get it 
> even smaller. Unfortunately, my suggestion introduces a completely new 
> archive design which isn't backward compatible. 
> 
> The first step of generation is to collect the users and groups of all 
> the files that will be included. This table comes right after the 
> uncompressed section size, so each group and user name is stored only 
> once in the archive and each node needs only one word (a user byte and 
> a group byte) to identify them. To save space, there are only 256 
> user/group entries available per archive section. The next part is the 
> filesystem structure with date/time information, the attributes, the 
> user and group bytes, and the entry name itself. Each entry takes 13 
> bytes plus the entry name length, plus a symlink path or device numbers 
> for block or char nodes. Unlike tar, the new format does not store the 
> whole path for every entry, only what is minimally necessary to 
> describe the file structure. The new structure supports even more node 
> types than tar, in fact all of those available under Linux. After all 
> of that come the file contents. BZip2 compresses the whole archive. 
> 
> At the beginning I expected to reach a much better compression rate 
> than before, but gzip is not as bad as I thought. However, archives 
> with many files or large text documents may become a lot smaller, 
> especially the dev system or the kernel sources. (On my system an 
> archive in the new deb format containing the dev tree is 63% smaller 
> than the same tree compressed with the current format.) The process is 
> much slower, especially archive creation, but one advantage over the 
> current system is the content listing speed, which is a lot faster 
> than now. And there is no dependency on the tar tool anymore.
> You will find an example solution attached which demonstrates archive 
> creation and extraction. For creation you need to pass the additional 
> "a" argument to enable the new format.
> I have no idea if you want this kind of contribution - all I know 
> about Debian is that there are a lot of rules, and that code changes 
> take a long time to get into the codebase - so I stopped developing 
> once my main ideas were working. But I think a lot more could be done 
> with dpkg to improve it. For example, a diff package for the several 
> kernel source versions which inherits from one host package; this 
> could provide a more user-friendly interface for the per-architecture 
> kernel source patches. The user would then get an extracted source 
> tree in /usr/src/linux-xxxx. Another idea would be a deb format which 
> holds the source packages in one file...
> 
> With kind regards
> Christian Fasshauer
> 
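
A minimal sketch, in C, of how the per-section name table and the fixed
13-byte entry header described in the message above might be laid out.
The mail only gives the 13-byte total; the exact field breakdown, the
struct and field names, and the use of NUL-terminated names are
assumptions made for illustration and are not taken from the attached
example solution.

#include <stdint.h>

/*
 * Hypothetical layout for the proposed deb archive entries.  The whole
 * section (uncompressed size, name table, entry list, file contents)
 * would then be run through bzip2 as a single stream.
 */

/* At most 256 user/group names per archive section, so an entry can
 * reference its owner and group with one byte each. */
#define NDEB_MAX_NAMES 256

/* Name table stored once per section, right after the uncompressed
 * section size: a count followed by packed NUL-terminated names. */
struct ndeb_name_table {
    uint8_t count;              /* number of names that follow */
    /* char names[];               e.g. "root\0bin\0daemon\0..." */
} __attribute__((packed));

/* Fixed part of every filesystem entry; the entry name (no leading
 * path), a symlink target or block/char device numbers follow it. */
struct ndeb_entry {
    uint8_t  type;              /* regular, dir, symlink, block, char, fifo, socket */
    uint16_t mode;              /* permission bits and attributes */
    uint32_t mtime;             /* modification time, seconds since the epoch */
    uint32_t size;              /* data size for regular files, 0 otherwise */
    uint8_t  user;              /* index into the user name table */
    uint8_t  group;             /* index into the group name table */
} __attribute__((packed));

/* The fixed header adds up to the 13 bytes mentioned in the mail. */
_Static_assert(sizeof(struct ndeb_entry) == 13,
               "fixed entry header must be 13 bytes");

With a layout like this, listing an archive only needs the name table
and the entry headers, which is consistent with the faster content
listing claimed in the message.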


