
Re: [gopher] gopher proxies



On Wed, Jul 24, 2013 at 10:56:25AM +0200, Jacob Dahl Pind wrote:
> user-agent: Lightspeed
> user-agent: SISTRIX Crawler
> user-agent: Baiduspider
> user-agent: YandexBot
> user-agent: Ezooms
> user-agent: Exabot
> user-agent: AhrefsBot

AhrefsBot is relentless. It doesn't seem to obey robots.txt and
will just hammer a server for non-existent links. So are you
suggesting that the proxy should be responsible for filtering
these in some way?
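If the proxy were to do the filtering, one simple approach would be to
check the incoming User-Agent header against a blocklist before fetching
anything over gopher. A minimal sketch (the function name is illustrative,
and the bot list is just the one quoted above):

```python
# Hypothetical sketch: a gopher-to-HTTP proxy could refuse requests
# from crawlers that ignore robots.txt. Bot names taken from the
# quoted message; is_blocked() is an illustrative helper, not part
# of any existing proxy.
BLOCKED_AGENTS = [
    "Lightspeed",
    "SISTRIX Crawler",
    "Baiduspider",
    "YandexBot",
    "Ezooms",
    "Exabot",
    "AhrefsBot",
]

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a blocked crawler."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in BLOCKED_AGENTS)

# The proxy would run this check before proxying, and answer
# 403 Forbidden for a match instead of hitting the gopher server.
print(is_blocked("Mozilla/5.0 (compatible; AhrefsBot/5.0)"))  # True
print(is_blocked("Mozilla/5.0 (X11; Linux x86_64)"))          # False
```

This only stops bots that send an honest User-Agent, of course; anything
spoofing a browser string would still get through.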

  --Brian

-- 
Don't have gopher?  Visit the world's first wiki-based gopher proxy!
http://www.pongonova.org/gopherwiki
IRC: Freenode.net channel #gopherproject

_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/gopher-project
