On 14.9.2011 14:22, Jacob Dahl Pind wrote:
>>> As we can't ratelimit, or do any other robots.txt things when those
>>> web crawlers are hitting us through those proxies, I have opted for
>>> simply redirecting access from those proxies to another selector.
>>
>> That's actually a pretty good feature, I might blatantly steal it for
>> the upstream version, without attribution.
>
> Had wanted to send it to you, but started looking at having it use
> hostnames also, am just a bit worried about having it resolve on each
> connection.
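Just to make the idea concrete for the archives: the redirect can be as simple as checking the peer address against a fixed list before deciding which selector to serve. A rough, untested sketch in C (the addresses and the redirect target are made up):

#include <string.h>

/* Hypothetical proxy blocklist -- the addresses and the redirect
 * target are made-up examples, not anyone's real config. */
static const char *proxy_list[] = { "192.0.2.10", "198.51.100.25", NULL };

/* Return the selector to serve: the requested one, or the redirect
 * target when the connection comes from a known proxy address. */
static const char *pick_selector(const char *peer_ip, const char *selector)
{
    int i;

    for (i = 0; proxy_list[i] != NULL; i++)
        if (strcmp(peer_ip, proxy_list[i]) == 0)
            return "/proxy-notice.txt";

    return selector;
}

As for the resolving worry: with a standalone daemon you could resolve the hostnames once at startup and compare plain addresses after that, at the cost of missing DNS changes until a restart. Under inetd that doesn't help, of course, since "startup" happens on every single connection.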
I was thinking about Apache-like Allow/Deny directives, but those would have required a config file instead of just simple options. And parsing a config file only really works with proper daemons, not with inetd-type services which get relaunched for every single connection...
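To show what I mean by simple options: deny rules could just ride in on the command line, so an inetd-spawned process never opens a config file. Another untested sketch; the -d flag and the gopherd name are invented:

#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define MAX_DENY 16

int main(int argc, char *argv[])
{
    const char *deny[MAX_DENY];
    const char *peer = "192.0.2.10";  /* placeholder; a real server
                                         would use getpeername() */
    int ndeny = 0;
    int i, c;

    /* Deny rules arrive as repeated -d options, e.g.
     *     gopherd -d 192.0.2.10 -d 198.51.100.25
     * so there is nothing to parse on each launch. */
    while ((c = getopt(argc, argv, "d:")) != -1)
        if (c == 'd' && ndeny < MAX_DENY)
            deny[ndeny++] = optarg;

    for (i = 0; i < ndeny; i++)
        if (strcmp(peer, deny[i]) == 0) {
            printf("3Access denied\terror\terror.host\t1\r\n.\r\n");
            return 0;
        }

    /* ...normal request handling would follow here... */
    return 0;
}

(The error line is just a type 3 menu item with dummy selector/host/port fields, which is about all the original protocol gives us for error reporting.)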
At the same time I was also thinking about doing password-protected directories. Easy enough to do even with the original protocol, but I'm not sure if anyone would find those useful...
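In case it isn't obvious how that could work over plain gopher: publish the protected directory as a type 7 search item, and treat the search string the client sends back as the password. A bare-bones hypothetical sketch (prefix and password made up):

#include <string.h>

#define PROTECTED_PREFIX "/private"   /* made-up example path     */
#define PASSWORD         "hunter2"    /* made-up example password */

/* A request arrives as "selector" or "selector<TAB>searchstring";
 * for a protected selector, treat the search string as the password. */
static int authorized(const char *selector, const char *search)
{
    if (strncmp(selector, PROTECTED_PREFIX, strlen(PROTECTED_PREFIX)) != 0)
        return 1;                     /* not a protected path */

    return search != NULL && strcmp(search, PASSWORD) == 0;
}

The obvious caveat is that the password crosses the wire in cleartext, but then again so does everything else in gopher.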
- Kim