Re: package pool and big Packages.gz file
>>>>> " " == Junichi Uekawa <firstname.lastname@example.org> writes:
> In 05 Jan 2001 19:51:08 +0100 Goswin Brederlow
> <email@example.com> cum veritate scripsit:
>> Hello,
>> I'm currently discussing some changes to the rsync client with
>> some people from the rsync ML which would uncompress compressed
>> data on the client side (no changes to the server) and rsync
>> those. It sounds like no improvement, but reading the full
>> description shows that it actually is one.
>> Before that, rsyncing new debs against old ones hardly ever saves
>> anything. Where it helps is with big packages like xfree, where
>> several packages are identical between releases.
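To see why rsyncing the compressed files gains so little, here is a small
illustration (purely illustrative, not the rsync ML proposal itself): a
one-byte change early in the input rescrambles nearly the whole gzip
stream, so the rolling checksum finds next to nothing to reuse, while the
uncompressed data is still almost entirely identical.

```python
import gzip

def gz(data):
    # mtime=0 keeps the gzip header deterministic between runs
    return gzip.compress(data, mtime=0)

old = b"Package: foo\nVersion: 1.0\n" * 400
new = old.replace(b"1.0", b"1.1", 1)    # one tiny edit near the front

a, b = gz(old), gz(new)
# the uncompressed files differ in a single byte, but past the 10-byte
# gzip header the two compressed streams diverge almost immediately,
# which is why uncompressing on the client side pays off
```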
> No offence, but wouldn't it be a tad difficult to play around
> with it, since deb packages are not just gzipped archives, but
> ar archives containing gzipped tar archives?
Yes and no.
The problem is that deb files are special ar archives, so you can't
just download the member files and ar them back together.
One way would be to download the files inside the ar, ar them together
and rsync the result. Since ar does not change the data it contains,
the deb would have the same data, just at different places, and rsync
handles that well. This would be possible, but would require server
changes.
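A toy sketch of the rsync property relied on here: identical blocks are
reused even when they sit at different offsets in the new file. The
checksum below is a simplified stand-in for rsync's real rolling weak
sum, and all names are made up for the example.

```python
# fixed block size, as in rsync's block-matching scheme
BLOCK = 16

def weak_sum(chunk):
    # stand-in for rsync's rolling weak checksum
    return sum(chunk) % 65521

def block_sums(data):
    """Map weak checksum -> offsets of each fixed-size block of the old file."""
    sums = {}
    for i in range(0, len(data) - BLOCK + 1, BLOCK):
        sums.setdefault(weak_sum(data[i:i + BLOCK]), []).append(i)
    return sums

def matched_bytes(old, new):
    """Count bytes of `new` that could be reused from blocks of `old`."""
    sums = block_sums(old)
    reused, i = 0, 0
    while i <= len(new) - BLOCK:
        window = new[i:i + BLOCK]
        hits = sums.get(weak_sum(window), [])
        if any(old[j:j + BLOCK] == window for j in hits):
            reused += BLOCK
            i += BLOCK          # block reused, jump past it
        else:
            i += 1              # slide one byte (cheap with a rolling sum)
    return reused

# the same "member" data sits at different offsets in old and new,
# yet its blocks are still found and reused
old = b"A" * 64 + b"unchanged-member-data!" * 4 + b"B" * 64
new = b"C" * 80 + b"unchanged-member-data!" * 4 + b"D" * 16
```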
The trick is to know a bit about ar, but not too much. Just rsync the
header of the ar file up to the first real file in it, then rsync that
file recursively, then a bit more ar file data and another file, and
so on. Knowing where the subfiles start and how long they are is
enough.
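As a sketch of that "know a bit about ar, but not too much" part,
assuming the classic 60-byte ar member header (GNU long-name extensions
are ignored, and the archive built here is only deb-shaped, not a real
deb):

```python
AR_MAGIC = b"!<arch>\n"

def ar_header(name, size):
    # classic 60-byte ar member header: name(16) mtime(12) uid(6) gid(6)
    # mode(8) size(10) magic(2); long-name extensions are not handled
    return (name.ljust(16) + "0".ljust(12) + "0".ljust(6) + "0".ljust(6)
            + "100644".ljust(8) + str(size).ljust(10)).encode() + b"`\n"

def ar_members(blob):
    """Return (name, data_offset, size) per member: enough to know where
    the subfiles start and how long they are."""
    assert blob.startswith(AR_MAGIC)
    off, members = len(AR_MAGIC), []
    while off + 60 <= len(blob):
        hdr = blob[off:off + 60]
        name = hdr[0:16].rstrip(b" /").decode()
        size = int(hdr[48:58])
        members.append((name, off + 60, size))
        off += 60 + size + (size & 1)   # member data is padded to even length
    return members

# build a miniature deb-shaped archive to walk through
blob = AR_MAGIC
for name, data in [("debian-binary", b"2.0\n"),
                   ("control.tar.gz", b"fake control data")]:
    blob += ar_header(name, len(data)) + data + (b"\n" if len(data) & 1 else b"")

members = ar_members(blob)
```

With the member table in hand, a client could rsync the small header
spans as-is and hand each compressed member to the uncompress-and-rsync
step described above.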
The question will be how much intelligence to teach rsync. I'd like
rsync to stay simple, but still intelligent enough to do the job.
It's pretty tricky, so it will be some time before anything in that
direction is usable.