Re: bug reports
On May 28, Craig Small (firstname.lastname@example.org) wrote:
> I think it is to do with robots.txt
> wget -r -l 1
> It nearly does what you want.
On May 28, Bastian Kleineidam (email@example.com) wrote:
> On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
> > I'd like to download the web page of bugs by maintainer,
> > http://firstname.lastname@example.org, and all
> > the bug reports linked to on that page, so that I can refer to them offline.
> > But, wget doesn't work,
> What's the error message? At least it works for me:
> # wget "http://email@example.com"
> --13:53:00-- http://firstname.lastname@example.org
> => `email@example.com'
Thanks for the hints. I should have been clearer: I have no problem
getting the main page, i.e., firstname.lastname@example.org. There are
links in that page to bugs.debian.org/cgi-bin/bugreport.cgi?bug=<num> for each
bug, and I want to get each of those as a local web page, too. That is the
part that seems to require more than a simple wget command.
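Following up on Craig's robots.txt hint, something like the following might do it. This is a sketch, not tested against bugs.debian.org; the placeholder `<maintainer-page-URL>` stands for the bugs-by-maintainer page, and whether `-e robots=off` is actually needed depends on what robots.txt there disallows:

```shell
# Mirror the maintainer page plus every page it links to, one hop deep.
# -r -l 1       : recursive retrieval, limited to depth 1, so each linked
#                 bugreport.cgi?bug=<num> page is fetched but not followed further
# -k            : convert links in the saved pages so they work offline
# -e robots=off : ignore robots.txt, which may otherwise block cgi-bin pages
wget -r -l 1 -k -e robots=off "<maintainer-page-URL>"
```

If the bug links cross to a different host than the starting page, recursion stops at the host boundary by default; adding `-H -D bugs.debian.org` would allow wget to span hosts while staying on bugs.debian.org.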