
Re: Bug#164421: debbugs: support intersecting sets of bugs (e.g. package and submitter)



On Wed, May 21, 2003 at 02:20:55PM -0500, Drew Scott Daniels wrote:
> In bug #187064 I talk about changing the robots.txt file so that
> someone could use google to do searches, or to actually implement a
> text search.

master is already pretty overloaded, and every so often an uninvited
crawler gets in and sends the load sky-high. We really don't want to
open it up to legitimate crawlers as well.
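
For concreteness, the kind of robots.txt change being discussed might
look roughly like the sketch below. The paths are hypothetical, not the
actual bugs.debian.org layout, and the first stanza is only a guess at
the sort of blanket block currently in place:

    # Blanket block: keeps all well-behaved crawlers out entirely.
    User-agent: *
    Disallow: /

    # The looser policy floated in #187064 could instead fence off only
    # the expensive CGI queries and leave bug report pages indexable,
    # e.g. (hypothetical path):
    #
    #   User-agent: *
    #   Disallow: /cgi-bin/

Of course, robots.txt only restrains crawlers that choose to honour it,
which is part of the load concern above.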

-- 
Colin Watson                                  [cjwatson@flatline.org.uk]


