
Bug#949173: packages.debian.org: robots.txt doesn't actually block anything



On Fri, Jan 17, 2020 at 6:57 PM Adam D. Barratt wrote:

> which is effectively the same as allowing everything. "Disallow: /"
> might be more logical, unless there is a desire / requirement to allow
> crawling and indexing of (parts of) the site.

I expect we want to allow crawling the site; all of its pages are
public, and most of them are useful for search engines to index.
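For reference, a minimal sketch of the two robots.txt variants under discussion, following standard Robots Exclusion Protocol semantics (an empty Disallow value matches nothing, so everything may be crawled):

```
# Current effect: allow all crawling
User-agent: *
Disallow:

# Alternative discussed above: block all crawling
# User-agent: *
# Disallow: /
```
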

-- 
bye,
pabs

https://wiki.debian.org/PaulWise

