
Re: [gopher] Joining in: I'm the maintainer/host of Gopher Proxy



Evert, nice to see you here. Thanks for everything on meulie.net; it's a
real treasure trove. I see Veronica-2 is busily reindexing it as we speak.

> Hmm, blocking all spidering will keep search engines from indexing 
> Gopher-servers, some of which contain a treasure of documents from times 
> gone by.
> 
> How about I only let Google in? (Does anyone use any other search engine 
> anyway?)

This seems like a reasonable compromise to me personally. For the record,
though, the Floodgap proxy has <meta> tags up telling all robots not to
index, even Google, and IPs or user agents that demonstrate spidering
activity wind up in a blacklist when I discover them (my custom
log-analysis software highlights unusually high levels of activity).
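For anyone wanting to do likewise, the tag in question is the standard
robots <meta> directive; something along these lines in the proxy's
generated pages is the general idea (a sketch, not my exact markup):

    <meta name="robots" content="noindex, nofollow">

The log-analysis side doesn't need to be fancy, either. Here is a minimal
sketch of that kind of scan in Python, with a made-up log filename and
threshold; this is not my actual tooling, just the shape of it:

    #!/usr/bin/env python
    # Count hits per client address in an access log whose first
    # whitespace-separated field is the client IP, then flag the
    # heaviest requesters as blacklist candidates. The filename
    # and cutoff below are assumptions for illustration.
    from collections import Counter

    THRESHOLD = 500  # requests per log file; an arbitrary cutoff

    hits = Counter()
    with open("access_log") as log:
        for line in log:
            hits[line.split(" ", 1)[0]] += 1

    # most_common() yields addresses in descending hit order, so we
    # can stop as soon as a count falls under the threshold.
    for ip, count in hits.most_common():
        if count < THRESHOLD:
            break
        print("%s\t%d requests -- possible spider" % (ip, count))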

But as long as some enforcement runs on the other end to deal with bots
that ignore robots.txt, then speaking with my gopher server operator hat
on, I'd be fine with letting Google continue to index through.
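For concreteness, Evert's "only let Google in" policy would look roughly
like this in robots.txt (Googlebot's user-agent token is documented;
treating it as the sole welcome crawler is the assumption here):

    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /

The empty Disallow line permits Googlebot everything, while the catch-all
block shuts out every other well-behaved crawler; bots that ignore
robots.txt entirely are exactly what the blacklisting above is for.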

> All credit goes to Stephen Morley. I merely took over his creation after 
> he decided to discontinue hosting.

I'm glad to see you took it over. Stephen and I didn't get along so well on
proxy policy, as you may have seen from previous messages, but he did a very
good job designing it (much better than my strictly utilitarian effort).

Any decision "the proxy cartel" comes to should also involve the pongonova
proxy (Brian Koontz), since his is probably #3 in the traffic hierarchy.

-- 
------------------------------------ personal: http://www.cameronkaiser.com/ --
  Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser@floodgap.com
-- The early bird may get the worm, but the second mouse gets the cheese. -----

_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/gopher-project



