
Lintian reports that man pages are not compressed with --best



Hi.

How does lintian (or how can I) tell whether a man page is compressed
with the --best option?

Right now it seems that there is an error somewhere. Let me explain: when
packaging greed, lintian reported that the man page was not compressed with
the --best option. OK, I decompressed it and compressed it again with that
switch, and it gave the same error. What the $%&#?
However, if I do 'gzip -dc greed.1.gz | gzip -9 > greed-pipe.1.gz', lintian
does not report any error, and the compressed file is 8 bytes shorter
(AFAICT this happens because, done this way, the name of the file is not
recorded in the compressed file). Also, if I decompress the file and leave
it up to debhelper to compress it, lintian reports no error, and the file is
the same size as the original one (the one lintian complained about before).
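
The 8-byte part I think I can explain: if I read the gzip file format
(RFC 1952) right, when gzip compresses a named file it sets the FNAME
bit (0x08) in the flag byte at offset 3 and stores the original name,
NUL-terminated, in the header; "greed.1" plus the NUL is exactly 8
bytes, and a pipe has no name to store. There is also an extra-flags
byte at offset 8, which gzip sets to 2 for -9/--best and 4 for
-1/--fast. A quick way to peek at both bytes (an untested sketch):

  # flag byte at offset 3: the 08 bit means an original name is stored
  od -An -tx1 -j3 -N1 greed.1.gz
  # extra-flags byte at offset 8: 02 = max compression (-9), 04 = fastest
  od -An -tx1 -j8 -N1 greed.1.gz

That accounts for the size difference, but not for why lintian
complains about one header layout and not the other.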

Can somebody explain or give any suggestion as to what is happening?
I've attached these three files to this post:

-rw-r--r--    1 digito   users        1091 May  1 21:38 greed-pipe.1.gz
-rw-------    1 digito   users        2205 Jan 20 18:31 greed-uncomp.1
-rw-r--r--    1 digito   users        1099 Apr 25 23:45 greed.1.gz

greed.1.gz      - Original file: lintian complains about this file
greed-pipe.1.gz - Obtained with gzip -dc greed.1.gz | gzip -9 > greed-pipe.1.gz
                  lintian clean
greed-uncomp.1  - Decompressed file: lintian clean
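
If the stored name really is the culprit, gzip's -n (--no-name) switch,
which tells gzip not to save the original file name and time stamp,
should give the same result as the pipe, without the
decompress-and-recompress dance. A sketch (not what I actually ran):

  gzip -d greed.1.gz    # back to the uncompressed page
  gzip -9n greed.1      # recompress without storing the name
  ls -l greed.1.gz      # should now match the piped size (1091)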


FWIW:

ii  lintian        1.11.2         Debian package checker
ii  gzip           1.2.4-33       The GNU compression utility.
ii  debhelper      2.0.86         helper programs for debian/rules


Does anybody know what's going on?
Thanks.

-- 
Pedro Guerreiro						 UIN: 48533103
Universidade do Algarve (EST) - Campus da Penha - 8000 Faro - PORTUGAL
GPG: 0xCF32D4E7    F506 DDF4 0B92 247D B8E6   13BA A6DB 9E3A CF32 D4E7

Attachment: greed.1.gz
Description: Binary data

Attachment: greed-pipe.1.gz
Description: Binary data

'\"
'\" greed 1
'\" 
.TH greed 1 "" greed "\fBG\fRet and \fBR\fResume \fBE\fRlite \fBED\fRition"
.BS
'\" Note:  do not modify the .SH NAME line immediately below!
.SH NAME
greed \- lets you get and resume files
.SH SYNOPSIS
\fBgreed [\-sn] [\-o{0..9}] [URL0 URL1..URLN]\fR
.BE

.SH DESCRIPTION
.PP
Greed lets you download and resume files from web and FTP sites.

.SH OPTIONS
.TP
.B \-?
Brings up quick help
.TP
.B \-z
Sends a bug report to me
.TP
.B \-t
Tests whether the file is a valid archive after the download is finished
.TP
.B \-s
Prints the downloaded file to STDOUT
.TP
.B "\-r || \-rURL"
Turns off the HTTP referrer or changes the referrer to URL, respectively
.TP
.B \-o{0..9}
0..9 is the level of output GREED prints
.TP
.B \-n
Downloads the newest version of GREED
.TP
.B \-mADDRESS
Sends an e-mail to ADDRESS every time greed downloads a file (\-t MUST be used)
.TP
.B \-l#
Sets the level of recursion (how many directories down greed should go on an FTP site)
.TP
.B \-i
Reads URLs from standard input
.TP
.B \-gfilename.grx
Loads a .grx file (Windows GetRight), parses it, and downloads the files specified within
.TP
.B \-f
Disables reading URLs from greed.in
.TP
.B \-d#
Tells greed to download # files at a time.  If no number is specified, it assumes 2 at a time.
.TP
.B \-cURL
Changes the %##'s in a URL to characters... should only be used for URLs that don't work correctly, or have %##'s in them ;)
.TP
.B \-b
Sends GREED into background download mode.

.SH COOL TIPS
.TP
\fBgreed \-i < files.to.download\fR
Lets you download all the files in files.to.download
.TP
\fBgreed \-d \-b http://foo.bar/file.zip http://ab:cd@efg.hij/kl/mno.p\fR
Drops greed into the background and downloads the 2 URLs simultaneously!
.TP
\fBgreed \-b \-t \-mHarry@GreedySite.com \-l9999 ftp://ftp.cdrom.com/\fR
Sends e-mails on each successful download and downloads ftp.cdrom.com... and I mean all .5 terabytes :)  Give it a try ;)
.TP
\fBgreed www.freshmeat.net...freshmeat.index.html\fR
"..." saves the URL retrieved from www.freshmeat.net to freshmeat.index.html

.SH KEYWORDS
resume download ftp www
