
Re: [gopher] Robots.txt



The GOGOPH crawler uses robots.txt files.

It uses the format described at http://www.robotstxt.org/orig.html#format

You can also specify a "Crawl-delay" value.

More info here => http://en.wikipedia.org/wiki/Robots_exclusion_standard
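
As an illustration, a robots.txt in that format could look like the sketch below (a made-up example, not taken from any real server; the User-agent and Disallow fields come from the robotstxt.org spec linked above, and Crawl-delay is the non-standard extension mentioned, with the value in seconds):

    User-agent: *
    Disallow: /private
    Crawl-delay: 10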

I think the Veronica crawler follows robots.txt files too.
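
For what it's worth, fetching robots.txt over gopher itself is simple: connect, send the selector, and read until the server closes the connection. Here is a minimal Python sketch (my own illustration, not GOGOPH's actual code; it assumes the server exposes the file under the selector "robots.txt" on port 70):

    import socket

    def fetch_gopher_robots(host, port=70, selector="robots.txt"):
        # Gopher transaction: send the selector followed by CRLF,
        # then read until the server closes the connection.
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode("utf-8", errors="replace")

    # e.g. print(fetch_gopher_robots("example.org"))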

2012/5/23 Nuno J. Silva <nunojsilva@ist.utl.pt>
On 2012-05-23, Kim Holviala wrote:

> On May 23, 2012, at 23:17 , Nick Matavka wrote:
>
>> Could I have the spec to, or perhaps somebody's explanation of, the
>> robots.txt file?  This will help me to explain it in my RFC.
>
> http://www.robotstxt.org/orig.html

(AFAIK, the gopher crawlers out there which respect robots.txt, and the
server administrators who have such files, are following the same format
used in the web.)

--
Nuno J. Silva (aka njsg)
gopher://sdf-eu.org/1/users/njsg
http://njsg.sdf-eu.org/




--
Damien CAROL
gopher://dams.zapto.org/1/
_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/gopher-project
