
Bug#67637: www.debian.org: robots.txt has too many entries



Previously Nicolás Lichtmaier wrote:
>  Current /robots.txt prohibits indexing of many resources that should be
> indexed.

Probably because search engines would overload www.debian.org otherwise.
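For illustration, a minimal robots.txt sketch of that approach (the paths here are hypothetical, not the actual www.debian.org file): crawler-heavy or auto-generated areas are disallowed for all user agents, while the rest of the site stays indexable.

```
# Hypothetical example -- not the real www.debian.org robots.txt.
# Block expensive, auto-generated areas from all crawlers;
# everything not listed remains crawlable.
User-agent: *
Disallow: /cgi-bin/
Disallow: /search
```

Each `Disallow` line is a path prefix; an empty `Disallow:` would instead permit everything.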

Wichert.

-- 
  _________________________________________________________________
 / Generally uninteresting signature - ignore at your convenience  \
| wichert@wiggy.net                   http://www.liacs.nl/~wichert/ |
| 1024D/2FA3BC2D 576E 100B 518D 2F16 36B0  2805 3CB8 9250 2FA3 BC2D |


