Re: added robots.txt disallowing everything



Hi,

I'm going to contact the Google people to see if there is a way to make the
googlebot behave more nicely when indexing the list archives. I don't think
that shutting out all the search engines is a good solution.

What do you think?
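
One possible middle ground (just a sketch; the /cgi-bin/ path below is
hypothetical, the real layout of the archive may differ) would be to leave
the static archive pages crawlable and only disallow the expensive,
dynamically generated ones:

User-agent: *
Disallow: /cgi-bin/

That way the engines could keep indexing the messages themselves while the
crawlers are kept away from the scripts that actually load the machine.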

Bye
Cesar Mendoza
http://www.kitiara.org
--
"The three golden rules to ensure computer security:
Do not own a computer, do not power it on, and do not use it."
  --Robert T. Morris

On Wed, Nov 08, 2000 at 01:53:54AM +0100, Josip Rodin wrote:
> Hi,
> 
> FYI, I've added a robots.txt file in the root directory of the
> lists.debian.org HTML tree, containing:
> 
> User-agent: *
> Disallow: *
> 
> Hopefully it will stop googlebot and the other web crawlers that seem to
> have noticed the list archives and tend to DoS the machine... :/
> 
> -- 
> Digital Electronic Being Intended for Assassination and Nullification
> 
> 
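
A side note on the quoted snippet: the original robots exclusion standard
treats the Disallow value as a literal path prefix, so "Disallow: *" is not
guaranteed to match anything for a strictly standard-compliant crawler. The
portable way to disallow the whole site is:

User-agent: *
Disallow: /

Every URL path begins with "/", so all compliant crawlers will stay away,
whereas "*" only matches paths that literally start with an asterisk (some
crawlers do interpret it as a wildcard, but that is an extension, not part
of the standard).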