Currently, we have a problem with compressed HTML: we can access compressed HTML fine, but links don't work very well. The problem is that a link says "foo.html" while the actual file is "foo.html.gz", and browsers and servers aren't intelligent enough to handle this invisibly. This means that we can't install compressed HTML if it contains links.

We need a program that can be run on uncompressed HTML and converts local links to the compressed versions of the files. Usage would be something like:

	fixhtmlgz file.html ...

- read file.html
- for each link <a href="foo.html">, if foo.html exists, convert the link to foo.html.gz instead
- otherwise, do not modify the link
- output is either to file.html.fixed or file.html (replace original with modified version)

I don't have time to search for or write this. Does anyone know of a suitable existing program? Failing that, could someone write such a program? I think it should be fairly simple to do (there are existing libraries for parsing HTML in at least Python, and I assume Perl).

-- 
Please read <http://www.iki.fi/liw/mail-to-lasu.html> before mailing me.
Please don't Cc: me when replying to my message on a mailing list.