Re: Next approach: Documentation Policy
On Tue, 1 Jul 1997, Fernando wrote:
> Christian Schwarz wrote:
> > No, I talked about the "*.info" files. Someone else said here, some time
> > ago, that the intention of a Linux distribution is to provide the people
> > with "compiled files", not with the source that everyone has to compile.
> > Some people have slow computers, little memory, etc. and they surely don't
> > want to compile these files themselves all the time (whenever a package is
> > installed, upgraded, etc.).
> So why are we still distributing the man sources instead of pre-formatted
> man pages?
That's a good question! (I don't know why.)
> Please note that makeinfo (the info compiler) is much faster than groff (the
> man compiler).
But the "texi2html" compiler is slow!
> And you did not answer my other concerns about not distributing documents
> in source format. I summarize them and add a few points:
> - The speed of compiling sources is reasonable enough in most cases.
No. Some people have slow computers with little memory, and they can't
afford recompiling the docs whenever they upgrade/install a package.
> - On demand compiling works for man, it should work for info too.
Perhaps we should reconsider shipping compiled manual pages, then.
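(For the record, the "on demand" mechanism is easy to describe: format a
page the first time it is requested and cache the result, so a slow machine
pays the formatting cost only once per install/upgrade. A minimal sketch --
the `formatter` callable and the paths are illustrative, not any existing
tool's interface:)

```python
import os

def format_on_demand(src, cache, formatter):
    """Return the formatted text for `src`, reformatting only when the
    cached copy is missing or older than the source.  `formatter` is any
    function mapping source text to formatted text (think of a wrapper
    around nroff or makeinfo)."""
    if (not os.path.exists(cache)
            or os.path.getmtime(cache) < os.path.getmtime(src)):
        # Cache is stale or absent: run the (possibly slow) formatter once.
        with open(src) as f:
            formatted = formatter(f.read())
        with open(cache, "w") as f:
            f.write(formatted)
    # All later requests are served from the cache.
    with open(cache) as f:
        return f.read()
```

The same scheme would work for info files as well as man pages; only the
formatter changes.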
> - On-the-fly conversion to html is good enough in many cases.
No. That's why I started this discussion. The result of "info2www" is not
good enough.
> - Makeinfo is fast, groff is slow.
texi2html is slow, too. (And converters from other markup formats,
e.g. SGML, may be much slower!)
> - Users may want to print the documents. They need the source for that.
We can easily provide PostScript documents via our ftp server.
> - Users may want to automatically process the sources or convert them
> to a non-standard format.
Right, but how often will it happen that someone needs a format other
than the ones we already ship?
> - Texinfo files are nicer when viewed with the texi2html converter.
Sure. That's why on-the-fly compilation is not possible. (texi2html does
not work as a CGI script, AFAIK--and it's much too slow anyway.)
> Did you read my alternative proposal? I think it addresses important points
> not covered in the present one, like treatment of binary documents, fall-back
> methods for man and info when not installed and (IMHO) a nice and consistent
> solution to the problem of when to include multiple formats in a package.
> Even if you don't agree with everything I still think you should consider
> some parts of it.
I just read it again. You mention a few additional aspects (dpkg support
to unpack doc only, for example) which we should consider _after_ the main
points have been decided.
> One problem I see with the texi2html program is that it depends on Perl, which
> makes it very slow on an old computer. I was trying to modify it to do
> on-the-fly conversion but I have switched to makeinfo, which is very fast
> even in my old 386. I am adapting makeinfo to produce on-the-fly html output
> instead of info output.
Ok, this would change things a bit.
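(For what it's worth, the CGI side of such an on-the-fly converter is
trivial; the hard part is the conversion itself. A sketch, where
`info_node_to_html` is a hypothetical stand-in for the adapted makeinfo,
not its real interface:)

```python
import sys

def info_node_to_html(title, body):
    # Hypothetical stand-in for a real info-to-HTML converter; it just
    # wraps the node text in a minimal HTML page.
    return ("<html><head><title>%s</title></head>\n"
            "<body><pre>%s</pre></body></html>\n" % (title, body))

def cgi_response(title, body, out=sys.stdout):
    # A CGI script must emit the Content-type header, a blank line,
    # and then the document body.
    out.write("Content-type: text/html\n\n")
    out.write(info_node_to_html(title, body))
```

A fast converter behind such a wrapper would give us on-the-fly HTML
without shipping pre-built HTML in every package.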
-- Christian Schwarz
Do you know Debian GNU/Linux?
PGP-fp: 8F 61 EB 6D CF 23 CA D7 34 05 14 5C C8 DC 22 BA