
Re: RFDiscussion: Big Packages.gz and Statistics and Comparing solutions



[A quick reply. And thanks for discussing this with me! There is no need
to Cc: me anymore; I updated my DB info.]

On Sun, Jan 07, 2001 at 05:51:26PM +0100, Goswin Brederlow wrote:
> The problem is that people want to browse descriptions to find a
> package fairly often or just run "apt-cache show package" to see what
> a package is about. So you need a method to download all descriptions.

The big Packages.gz is still there; there is no conflict between the two
methods. And the newest, most up-to-date information is always on
freshmeat.net. ;)

> As far as I see there's no server support needed for rsync to
> operate better on compressed files.

Um, I don't know. But doesn't RSYNC need a server-side RSYNC to run
against? Or can I expect an HTTP server to provide RSYNC? (Maybe I am
being stupid; I'll read the RSYNC man page later.)
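
(From the little I know so far, and with the host and path below only as
an example, something like this only works when the mirror side actually
runs an rsync daemon; a plain HTTP mirror can only hand over the whole
file:)

    # Needs rsyncd (or ssh) on the mirror side:
    rsync -avz rsync://ftp.debian.org/debian/dists/woody/main/binary-i386/Packages.gz .

    # A plain HTTP mirror can only serve the complete file:
    wget http://ftp.debian.org/debian/dists/woody/main/binary-i386/Packages.gz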

> If you update often, saving 1 byte every time is worth it. If you
> update seldom, it doesn't really matter that you download a big
> Packages.gz. You would have to download all the small Packages.gz
> files also.

There is an approach that could help with this, but that is another
story for later.

> So you see, between potato and woody diff saves about 60%.
> Also note that rsync usually performs better than cvs, since it does
> not include the to-be-removed lines in the download.

That sounds like a solid argument. My only criticism of DIFF or RSYNC
right now is server support. (Again, I'll read the RSYNC man page
later. ;-)
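
(Just to be sure I understand the DIFF idea, here is a rough sketch of
what I think you mean; the file names are made up:)

    # On the server, once per Packages update:
    diff -u Packages.old Packages.new | gzip -9 > Packages.diff.gz

    # On the client, after fetching only the small diff:
    zcat Packages.diff.gz | patch Packages.old -o Packages.new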

The point is: can a storage server that provides only HTTP and/or FTP
service do the job for apt-get?
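
(After all, a sources.list today needs nothing more than lines like
these, using the standard http/ftp methods; the mirror names are just
examples:)

    deb http://ftp.debian.org/debian woody main contrib non-free
    deb ftp://ftp.debian.org/debian potato main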

-- 
echo <<EOF |cpp - -|egrep -v '(^#|^$)'
/*   =|=X ++
 *   /\+_ p7 <zw@debian.org> */
EOF


