Re: fixhrefgz unnecessary when fixing web-browsers in the correct way
firstname.lastname@example.org (Douglas L Stewart) wrote on 29.06.97 in <Pine.LNX.3.96.970629145338.13613D-100000@tribble>:
> On Sun, 29 Jun 1997, Jim Pick wrote:
> > I just did a "du -s /usr/doc" on my 386DX/33 (8MB RAM, 2-200MB HD) - and
> > it only has 11MB of docs installed. So uncompressing those isn't going
> > to kill me - I'm sure most other people using old hardware have similar
> > usage.
> > Who objects?
> I do. Text often gets up to a 10-1 compression ratio. It may be 11M all
Actually, text very seldom gets a 10-1 ratio. 2.5-1 or 3-1, yes, with good
compressors (like gzip). But 10-1? I've only ever seen ratios like that
with fixed-record databases or Fortran sources. Those can even go to 100-1.
> compressed, but if it's 100M uncompressed then you're going to annoy a lot
> of people.
What I'd really like here is a filesystem that can look into .tar.gz
files, at least read-only. That would solve all our doc compression
problems. How hard would it be to do that?
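The lookup such a filesystem would have to perform can at least be sketched in user space. Here is a minimal Python illustration (the archive contents and the member name `doc/README` are made up) that opens a .tar.gz and reads one file out of it, read-only, without unpacking anything to disk:

```python
# User-space sketch of the read-only lookup a .tar.gz filesystem would do:
# find one member inside a gzipped tarball and read its contents directly.
import io
import tarfile

def build_archive() -> bytes:
    """Create a tiny in-memory .tar.gz standing in for a docs tarball."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        data = b"This is the README for some package.\n"
        info = tarfile.TarInfo(name="doc/README")
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()

def read_member(archive: bytes, name: str) -> bytes:
    """Look up one file inside the .tar.gz without extracting to disk."""
    with tarfile.open(fileobj=io.BytesIO(archive), mode="r:gz") as tar:
        member = tar.extractfile(name)
        return member.read()

archive = build_archive()
print(read_member(archive, "doc/README").decode(), end="")
```

The catch, and probably the answer to "how hard would it be", is that gzip streams are not seekable and tar has no central index, so every lookup means decompressing and scanning from the front of the archive; a real filesystem would need to cache an index (or decompressed blocks) to perform acceptably.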