
Bug#67637: www.debian.org: robots.txt has too many entries



Package: www.debian.org
Version: 20000724
Severity: normal

 The current /robots.txt prohibits indexing of many resources that should be
indexed:

User-agent: *
Disallow: /Bugs/
Disallow: /Lists-Archives/
Disallow: /Packages/
Disallow: /security/
Disallow: /news.html
Disallow: /consultants.html
Disallow: /consultant_info/
Disallow: /people.html
Disallow: 

 I don't see any reason to have *any* of these entries. But even so,
/Packages/ and /Lists-Archives/ are completely out of place here: they are
perpetual URLs pointing to useful, indexable content.
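
 As an illustration, the effect of these rules can be checked with Python's
standard urllib.robotparser module. This is only a sketch: the module and its
calls are real, but the sample URLs and the "ExampleBot" user agent are
hypothetical, chosen just to show which requests the current file turns away.

from urllib.robotparser import RobotFileParser

# The rules currently served from http://www.debian.org/robots.txt,
# as quoted above.
RULES = """\
User-agent: *
Disallow: /Bugs/
Disallow: /Lists-Archives/
Disallow: /Packages/
Disallow: /security/
Disallow: /news.html
Disallow: /consultants.html
Disallow: /consultant_info/
Disallow: /people.html
Disallow:
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# Stable, useful content under /Packages/ and /Lists-Archives/ is refused
# to every well-behaved crawler (hypothetical example paths).
print(parser.can_fetch("ExampleBot/1.0", "http://www.debian.org/Packages/stable/"))              # False
print(parser.can_fetch("ExampleBot/1.0", "http://www.debian.org/Lists-Archives/debian-devel/"))  # False

# Anything outside the Disallow list is still crawlable.
print(parser.can_fetch("ExampleBot/1.0", "http://www.debian.org/intro/about"))                   # True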


