
Bug#67637: acknowledged by developer (www.debian.org: robots.txt has too many entries)

> > This can easily be checked.. is there any log analysis that has shown
> > this?
> A few minutes ago master suffered a DoS (sort of, the load was >80 and you
> couldn't do anything) by googlebot which was accessing all the bug reports
> and stuff, because the robots.txt file was missing on klecker (it was
> forgotten during the move).
> I've put the file back on klecker, and removed the obsolete entries (i.e.
> the files that don't exist), but I'm definitely leaving Bugs/ and Packages/
> in there so that stuff like this doesn't happen anymore.
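
 For reference, the entries being kept would look roughly like this (the
exact paths and file contents on klecker are an assumption; only Bugs/ and
Packages/ are named in the mail above):

```
# Keep crawlers out of the dynamically generated bug and package pages
User-agent: *
Disallow: /Bugs/
Disallow: /Packages/
```

 A well-behaved crawler like googlebot fetches /robots.txt first and skips
any path matching a Disallow line, so restoring the file stops the bug
pages from being spidered.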

 The Packages pages were expressly created to be scanned by indexers.
You should complain to Google instead of removing the pages from one of the
most useful resources on the web today.

 ( Everything IMO... =) )
