
Re: Documentation stuff



>  I really want the glimpse searching that TkMan has, but within the
> XEmacs interface.  `dwww' has it, but for some reason it does not find
> as many manual entries as TkMan does for the same search.  I wonder
> why?  Perhaps a generalized perl script (or pull the tcl out of tkman
> that does it?) could do the search, and spit out the links for XEmacs
> or dwww to parse and display?

I'm changing the way dwww is put together, so any type of searching
can be added.  The way searching works right now isn't great.  Remember, 
dwww is still a work in progress.
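The generalized search script suggested above could be sketched roughly like this, in Python rather than perl: grep the whatis database for a keyword and spit out links for a front end (dwww, or an XEmacs buffer) to parse.  The whatis path and the dwww URL scheme here are assumptions, not how dwww actually works.

```python
#!/usr/bin/env python
# Minimal sketch: search the whatis database for a keyword and emit
# HTML links a front end could parse and display.  The whatis path
# and the /cgi-bin/dwww URL scheme are assumptions.
import re
import sys

def search_whatis(keyword, whatis_path="/var/cache/man/whatis"):
    """Return (name, section, description) tuples matching keyword."""
    hits = []
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    with open(whatis_path) as f:
        for line in f:
            if pattern.search(line):
                m = re.match(r"(\S+)\s*\((\w+)\)\s*-\s*(.*)", line)
                if m:
                    hits.append(m.groups())
    return hits

def as_html_links(hits):
    """Format hits as <a href> links, using a hypothetical URL scheme."""
    return ['<a href="/cgi-bin/dwww?type=man&location=%s.%s">%s(%s)</a> - %s'
            % (name, sec, name, sec, desc) for name, sec, desc in hits]

if __name__ == "__main__":
    for link in as_html_links(search_whatis(sys.argv[1])):
        print(link)
```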

>  I'm using W3 now, so html isn't that bad an option.  I can still have
> almost everything inside the editor interface that way.  I really love
> having `webster-www' bound to {f2}, so I can look up a word in a
> really nifty fashion.

Once dwww matures a bit, it would be nice to have an emacs interface
to it (via w3-el).  That should be easy to do.
 
>  It occurred to me today that it would be good to have an rfc index,
> too.  Maybe it would have the <a href...>'s link through a cgi script
> that would check for a local copy, then go get a remote one if the
> local one's not around?  Perhaps it could cache them?

I'm going to build a dwww-dev package which will make it easy to
build indexes for specific packages that work with dwww and its
upcoming documentation menu.  It's going to work just like you say.
 
>  I've got the doc-rfc package installed.  `dwww' might call on a
> module for searching that someday, perhaps.

Perhaps sooner than you think.  :-)
 
>  Gee, maybe HTML should support alternative URLs?  The first try
> goes to the local copy; if that's not there, then call out to a
> server on the net.  There could be <META-html> style variables in
> the markup to set up the base directories/servers.

We had the exact same discussion on the debian-doc mailing list.
 
>     Jim> 4) HTML documentation, if it exists, should be gzipped.  Lynx
>     Jim> and Netscape can handle the compressed files, provided that
>     Jim> the links are straightened out using a tool like fixhrefgz.
> 
>  Can't apache do that?  I think there's a mod_rewrite that will do
> what we need.  Though I suppose not everyone runs apache...  You tell
> me and we'll both know.  I think it's a good idea to have a
> light-weight server that can launch from xinetd.

The only way to straighten out the links is to change the contents of
the web page.  dwww does this (sort of).  I think mod_rewrite only
works on the requested URL, not the URLs in the document.  So I
don't think Apache can do this by itself.

Anyway, I think using a tool like fixhrefgz to fix the links in the
source document is required if we compress docs, since being able to
surf the documents straight off the hard drive, without using a web
server, is a nice capability.
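The rewrite itself amounts to pointing each local .html href at the compressed file.  A rough sketch of the idea (this is not the real fixhrefgz, just an illustration):

```python
#!/usr/bin/env python
# Rough sketch of a fixhrefgz-style pass: rewrite local .html hrefs
# to point at the .html.gz files, so compressed docs can be browsed
# straight off the disk.  Not the real fixhrefgz, just the idea.
import re

# Match href="...html" but skip anything containing ':' (absolute
# URLs like http://..., which should be left alone).
HREF = re.compile(r'(href=")([^":]+\.html)(")', re.IGNORECASE)

def fix_hrefs(html):
    """Point local .html links at their .html.gz versions."""
    return HREF.sub(r"\1\2.gz\3", html)
```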

As for this lightweight server stuff - I tried all the web servers
when I was testing dwww - and Apache was probably the least
resource-intensive and the fastest.  Of course, it was the most
configurable too - maybe that's why it's called a "heavyweight".

Cheers,

 - Jim



