
Re: [gopher] gopher proxies



On Jul 24, 2013 1:15 PM, "Brian Koontz" <brian@pongonova.net> wrote:
>
> On Wed, Jul 24, 2013 at 10:56:25AM +0200, Jacob Dahl Pind wrote:
> > user-agent: Lightspeed
> > user-agent: SISTRIX Crawler
> > user-agent: Baiduspider
> > user-agent: YandexBot
> > user-agent: Ezooms
> > user-agent: Exabot
> > user-agent: AhrefsBot
>
> AhrefsBot is relentless. They don't seem to obey robots.txt, and
> will just hammer a server for non-existent links.  So are you
> suggesting that it's the proxy that should be responsible for
> filtering this in some way?

Some of them already do. The Floodgap proxy, for instance, has a blackhole for bots to wander about in.
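
For anyone wanting to do something similar in their own proxy, here's a minimal sketch in Python. This is my own illustration, not Floodgap's actual code: the blocklist just reuses the user-agents quoted above, and the port and stub page are arbitrary. The idea is to match the User-Agent header against a blocklist and hand bots a dead-end page with no links to follow.

    #!/usr/bin/env python3
    # Hypothetical proxy front end: bots on the blocklist get a
    # link-free stub page (a "blackhole"), everyone else would get
    # the normal gopher-to-HTTP translation.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BLOCKED_AGENTS = (  # names taken from the list quoted above
        "Lightspeed", "SISTRIX Crawler", "Baiduspider",
        "YandexBot", "Ezooms", "Exabot", "AhrefsBot",
    )

    class ProxyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            agent = self.headers.get("User-Agent", "")
            if any(bot in agent for bot in BLOCKED_AGENTS):
                # Dead end: no outgoing links, so the crawler has
                # nothing further to follow from here.
                body = b"<html><body>Nothing here.</body></html>"
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
                return
            # ...otherwise fetch and translate the gopher selector
            # (omitted in this sketch).
            self.send_response(501)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8070), ProxyHandler).serve_forever()

Serving a bland 200 page rather than a 403 may also be gentler on the server, since some crawlers retry errors aggressively.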
>
>   --Brian
>
> --
> Don't have gopher?  Visit the world's first wiki-based gopher proxy!
> http://www.pongonova.org/gopherwiki
> IRC: Freenode.net channel #gopherproject

_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/gopher-project
