Matt Taggart wrote:
http://bugs.debian.org/robots.txt effectively prevents such searches. I'm sure there is a good reason that restriction was added, but it's a blanket ban on all crawlers. If it was added because of abuse, maybe we can loosen it up so that well-behaved search engines are allowed?
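Something like the following might work; the crawler name and paths below are just illustrative, not necessarily what bugs.debian.org actually serves:

    # Current blanket form: no crawler may fetch anything.
    User-agent: *
    Disallow: /

    # Loosened sketch (hypothetical paths): one named, well-behaved
    # crawler may index static pages but is kept out of the dynamic
    # CGI scripts; every other crawler remains fully excluded.
    User-agent: Googlebot
    Disallow: /cgi-bin/

    User-agent: *
    Disallow: /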
The reason is that all the bug pages are dynamically generated, include the current timestamp and thus appear (slightly) different each time they're fetched, and there are a fairly large number of different URLs for the exact same bug/package -- all of which mean web spiders tend to cause the BTS significant pain.
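To make the duplication concrete: if I remember the CGI interface correctly, all of the following reach overlapping content for a single (hypothetical) bug number, and each rendering embeds a fresh timestamp, so a spider never sees two identical responses it could de-duplicate:

    http://bugs.debian.org/123456
    http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=123456
    http://bugs.debian.org/cgi-bin/pkgreport.cgi?pkg=somepackage
        (the package listing repeats the same bug's summary)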
Cheers,
aj