
Re: fixhrefgz - tool for converting anchors to gzipped files

On Sat, 28 Jun 1997, Christian Schwarz wrote:

> Why? The files are called ".html.gz" in the file system. Thus, these links
> are valid. We only have to implement on-the-fly decompression on some web
> servers. (This functionality could be useful for others, too, so we could
> forward our patches to the upstream maintainers of the web servers as
> well.)


GET http://localhost/hello.html.gz
Content-Type: text/html

[uncompressed HTML]

 This is non-standard... if the file exists on disk, httpd is supposed to
send it as-is, and using the suffix `html.gz' for every uncompressed HTML
document would be strange, or even annoying for a user trying to
`save as' the file in Windows 95.
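For comparison, the standard way to serve a pre-compressed file without
renaming it in the URL is to declare the compression in a Content-Encoding
header and let the client undo it. A minimal sketch (the function name and
file paths here are my assumptions, not from the thread):

```shell
# Sketch only: serve a .gz file under its uncompressed name, declaring
# the compression so the client can decompress it transparently.
serve_gz() {
        # $1 = path to the gzipped file on disk
        printf 'HTTP/1.0 200 OK\r\n'
        printf 'Content-Type: text/html\r\n'
        printf 'Content-Encoding: gzip\r\n'
        printf '\r\n'
        cat "$1"
}
```

With something like this, `GET /hello.html' could be answered with the
bytes of hello.html.gz plus the header, and a `save as' would yield plain
HTML on the client side.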

 I think that Christoph's idea is the elegant way of doing this. The www
server could even be just something like...

#!/bin/sh
# Minimal test server, meant to be run from inetd: the request arrives
# on stdin and the response goes to stdout.
read req
req=${req#GET }
req=${req% HTTP*}
if [ -r "$req" ]; then
        echo "HTTP/1.0 200 OK"
        echo "Content-type: text/html"
        echo
        cat "$req"
elif [ -r "$req.gz" ]; then
        echo "HTTP/1.0 200 OK"
        echo "Content-type: text/html"
        echo
        zcat "$req.gz"
else
        echo "HTTP/1.0 404 Not found"
        echo "Content-type: text/html"
        echo
        echo "<H1>Can't find $req here!</H1>"
fi
 (run from inetd, with an /etc/inetd.conf line like `debdoc stream tcp
nowait nobody /usr/sbin/tcpd ...')
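Spelled out, the inetd registration might look like this (the port number
and the server script path are assumptions of mine, not from the original
line):

```
# /etc/services -- pick an unused port for the hypothetical service:
debdoc          8099/tcp

# /etc/inetd.conf -- one line, wrapped here for readability:
debdoc  stream  tcp  nowait  nobody  /usr/sbin/tcpd \
        /usr/local/sbin/debdoc-httpd
```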

 This is only for testing, but it works, and fast..! A VERY small C program
could do this safely...
 And connections to that service could be restricted by default to the
local machine...
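With tcpd in front, the local-machine restriction could be an ordinary
hosts_access rule; a hypothetical fragment (the `debdoc' service name is
an assumption):

```
# /etc/hosts.allow -- only the local host may talk to the service:
debdoc: 127.0.0.1

# /etc/hosts.deny -- everyone else is refused:
debdoc: ALL
```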

Nicolás Lichtmaier.-

TO UNSUBSCRIBE FROM THIS MAILING LIST: e-mail the word "unsubscribe" to
debian-devel-request@lists.debian.org .
Trouble?  e-mail to templin@bucknell.edu .
