
Re: bug reports



On May 28, Craig Small (csmall@enc.com.au) wrote:
 > I think it is to do with robots.txt
 > Try 
 > wget -r -l 1 http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org
 > 
 > It nearly does what you want.
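
For reference, wget honors robots.txt during recursive retrievals by
default, so if that is really what blocks the bug pages, the usual
override is the wgetrc setting "robots = off", which can be given on
the command line.  Untested here, but something along these lines:

  wget -r -l 1 -e robots=off \
    "http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org"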

On May 28, Bastian Kleineidam (calvin@debian.org) wrote:
 > On Wed, May 28, 2003 at 06:49:50AM -0400, Neil Roeth wrote:
 > > I'd like to download the web page of bugs by maintainer,
 > > http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org, and all
 > > the bug reports linked to on that page, so that I can refer to them offline.
 > > But, wget doesn't work,
 > What's the error message? At least it works for me:
 > # wget "http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org"
 > --13:53:00--  http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org
 >            => `pkgreport.cgi?maint=neil@debian.org'

Thanks for the hints.  I should have been clearer - I have no problem
getting the main page, i.e., pkgreport.cgi?maint=neil@debian.org.  That
page links to bugs.debian.org/cgi-bin/bugreport.cgi?bug=<num> for each
bug, and I want to fetch each of those as a local web page, too.  That
is the part that seems to require more than a simple wget command.
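
If the robots.txt theory is right, my untested guess at a complete
command - assuming the bug pages are all linked from the maintainer
page as bugreport.cgi?bug=<num> on the same host - is something along
these lines:

  # fetch the maintainer page and every bugreport.cgi page it links to,
  # one level deep, ignoring robots.txt and rewriting links for offline use
  wget -r -l 1 -e robots=off -k \
    -A 'pkgreport.cgi*,bugreport.cgi*' \
    'http://bugs.debian.org/cgi-bin/pkgreport.cgi?maint=neil@debian.org'

The -A patterns are meant to keep the recursion from saving anything
other than those two CGI pages, and -k should convert the links in the
saved pages to point at the local copies.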

-- 
Neil Roeth


