
Bug#164421: [cjwatson@debian.org: Re: Bug#164421: debbugs: support intersecting sets of bugs (e.g. package and submitter)]



(Oops. Silly mutt.)

-- 
Colin Watson                                  [cjwatson@flatline.org.uk]

----- Forwarded message from Colin Watson <cjwatson@debian.org> -----

Date: Wed, 21 May 2003 21:09:51 +0100
From: Colin Watson <cjwatson@debian.org>
To: debian-debbugs@lists.debian.org
Subject: Re: Bug#164421: debbugs: support intersecting sets of bugs (e.g. package and submitter)
Mail-Followup-To: debian-debbugs@lists.debian.org
User-Agent: Mutt/1.3.28i

On Wed, May 21, 2003 at 02:20:55PM -0500, Drew Scott Daniels wrote:
> In bug #187064 I talk about changing the robots.txt file so that
> someone could use Google to do searches, or to actually implement a
> text search.

master is already pretty overloaded, and every so often an uninvited
crawler gets in and sends the load sky-high. We really don't want to
open it up to legitimate crawlers as well.

-- 
Colin Watson                                  [cjwatson@flatline.org.uk]

----- End forwarded message -----
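
For readers who don't have #187064 to hand, the robots.txt change it
proposes amounts to moving from a blanket exclusion to a selective one.
The sketch below shows the two forms as alternative files (not one file)
using standard robots exclusion syntax; the CGI paths are illustrative
of the debbugs layout rather than a copy of whatever is actually served
on master.

    # Alternative A: blanket exclusion -- no robot may fetch anything,
    # which keeps crawler load off master entirely.
    User-agent: *
    Disallow: /

    # Alternative B (hypothetical relaxation): robots may index
    # individual bug pages, but must stay away from the expensive
    # package/listing queries.
    User-agent: *
    Disallow: /cgi-bin/pkgreport.cgi

Even the selective form would invite crawlers onto every bugreport.cgi
URL, which is the load concern raised above.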


