
Re: please use robots.txt for your gopher apps

Cameron Kaiser <spectre@floodgap.com> wrote:

> > > > Then I misread it I guess.
> > > 
> > > I've got a middle ground asking for this to be done voluntarily on the new
> > > list, and we'll see how well that works.
> > 
> > You are planning a new mailinglist?
> Sorry, I should have clarified. By "new list" I mean the list at
> 	gopher://gopher.floodgap.com/1/new
> of new servers.

So, for individual gopher spaces on multi-user hosts such as sdf.org, do you
suggest multiple robots.txt files?  Certainly there are many such spaces that
have things like custom CGI scripts that probably shouldn't be crawled.
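For context, the robots.txt convention borrowed from the web assumes one file at the server root, so a multi-user host would likely need either a single admin-maintained file or some mechanism for merging per-user entries. A minimal sketch of what a server-wide file might look like (the usernames and paths here are purely hypothetical):

```
# robots.txt served at the gopher server root,
# e.g. selector /robots.txt on sdf.org
User-agent: *
# hypothetical per-user CGI areas that crawlers should skip
Disallow: /users/alice/cgi-bin/
Disallow: /users/bob/scripts/
```

Whether crawlers such as Veronica-2 would honor per-user files deeper in the hierarchy is exactly the open question here; nothing in the convention as described guarantees it.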
