
Re: [gopher] Torrent update



On 3.5.2010 14:19, Martin Ebnoether wrote:

>> PLEASE set up robots.txt to prevent robots from re-archiving this stuff!
>> I don't want to end up with 20 copies of the 30gig archive!
>>
>> Put "robots.txt" in your gopher root with something like this in it:
>>
>> User-agent: *
>> Disallow: /archives
>
> Does this really work for gopherspace?

Yup.

> Besides, are there any popular search engines that index
> gopherspace? Google[1] does not, neither does Bing.

Popular... *cough*, Veronica-2 is pretty popular and it respects robots.txt.

I'm building my own search engine, which will have a much broader scope, but it's not finished yet. The crawler works, and I've crawled through maybe 50% of gopherspace, but the indexer and the search itself are still works in progress.
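For anyone curious how a crawler honors those rules: a minimal sketch in Python, assuming a hypothetical crawler name and host (the fetch helper and `allowed` function are my own illustration, not Veronica-2's actual code). Gopher has no HTTP, so the crawler requests the "robots.txt" selector over a raw TCP connection to port 70 and feeds the result to the standard robots.txt parser:

```python
import socket
from urllib import robotparser

def fetch_gopher(host, selector, port=70, timeout=10):
    """Fetch one gopher selector: send it followed by CRLF,
    then read until the server closes the connection."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def allowed(host, selector, agent="example-crawler"):
    """True if robots.txt on this host permits crawling the selector."""
    try:
        robots = fetch_gopher(host, "robots.txt")
    except OSError:
        return True  # no robots.txt reachable: nothing is disallowed
    rp = robotparser.RobotFileParser()
    rp.parse(robots.splitlines())
    # robotparser matches on the URL path, so present the selector as one
    return rp.can_fetch(agent, "gopher://%s/%s" % (host, selector.lstrip("/")))
```

With the example rules quoted above ("User-agent: *" / "Disallow: /archives"), `allowed` would refuse anything under /archives and permit everything else.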


- Kim

_______________________________________________
Gopher-Project mailing list
Gopher-Project@lists.alioth.debian.org
http://lists.alioth.debian.org/mailman/listinfo/gopher-project



