Re: please use robots.txt for your gopher apps

On Thursday 16 May 2019 14:13,
Cameron Kaiser <spectre@floodgap.com> put forth the proposition:
> I love gopher apps and seeing them, but it is very hard for V2's robot to
> automatically recognize them, requiring lots of manual work to pull stuff
> out of the index that should never have been there in the first place. Please
> use a robots.txt selector to keep the V2 robot out of these areas; I'm
> considering a policy requirement that sites to be accepted to the new servers
> page must have some sort of robots.txt up since this is becoming a (happy)
> problem.
> --
> ------------------------------------ personal: http://www.cameronkaiser.com/ --
>   Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser@floodgap.com
> -- Remember, kids: for great justice take off every zig! ----------------------

To be honest, I can't see everyone adding a robots.txt.

How about:

1) If there are parts of your gopherhole that you don't want crawled,
   use a robots.txt to declare them; otherwise:

2) The entire gopherhole gets crawled.

That seems more reasonable to me.
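For anyone who does go the robots.txt route, the convention the Veronica-2 robot follows is the usual web-style exclusion format, served as a plain "robots.txt" selector at the root of the gopherhole. A minimal sketch (the selector paths here are hypothetical, just for illustration):

```
# robots.txt served at the gopherhole root (selector "robots.txt")
# Keep all crawlers out of a hypothetical CGI/apps area:
User-agent: *
Disallow: /cgi-bin
Disallow: /apps
```

Everything not listed under a Disallow line would still be crawled, which matches option 1 above.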



If you're not part of the solution, you're part of the precipitate.

Reply to: