
Re: please use robots.txt for your gopher apps



On 05/23/2019 05:36 AM, Matt Owen wrote:
>> -----Original Message-----
>> From: Cameron Kaiser <spectre@floodgap.com>
>> Sent: 23 May 2019 01:29
>>
>> Speaking only for V-2, and not for any other crawlers.
>
> I think it would be a good idea for an informal RFC-type document specifying how gopher crawlers should work - that way everyone can design to the same standard.
>
> A sitemap file could just be a list of selector URIs, one per line. Keep it simple.
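
To make that concrete, a sitemap of the sort described above might be nothing more than a plain text item containing lines like these (the selectors are invented purely for illustration; nothing here is a settled convention):

    /about.txt
    /phlog/2019-05-23.post
    /software/crawler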



I think that is a very good idea. I've been toying with the idea of making a Gopher crawler of my own, and it would be wonderful to have a robots.txt specification document (however informal) to refer to. Likewise for sitemaps.
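
For what it's worth, the fetch-and-check part of such a crawler is not much code. The sketch below is just that - a sketch - and it assumes things this thread has not settled: that robots.txt is served as a plain text item under the bare selector "robots.txt", and that it reuses the web User-agent/Disallow syntax so Python's stdlib parser can read it. The host, crawler name, and selectors are placeholders.

    import socket
    from urllib.robotparser import RobotFileParser

    def gopher_fetch(host, selector, port=70, timeout=10):
        # RFC 1436: send the selector followed by CR LF, then read
        # until the server closes the connection.
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(selector.encode("utf-8") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    def load_robots(host):
        # Parse the server's robots.txt with the stdlib parser.  If the
        # request fails outright, fall back to "allow everything"; a
        # gopher error line or a trailing "." terminator in the response
        # is simply ignored by the parser.
        rp = RobotFileParser()
        try:
            text = gopher_fetch(host, "robots.txt").decode("utf-8", "replace")
        except OSError:
            text = ""
        rp.parse(text.splitlines())
        return rp

    # Placeholder host, crawler name, and selector, purely for illustration.
    rules = load_robots("gopher.example.org")
    if rules.can_fetch("examplecrawler", "/software/"):
        listing = gopher_fetch("gopher.example.org", "/software/")

Everything beyond that (menu parsing, politeness delays, how to treat the various item types) is exactly the sort of thing an informal RFC could pin down.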

-Dave Gauer

