
Re: please use robots.txt for your gopher apps



> > Sorry, I should have clarified. By "new list" I mean the list at
> >
> > 	gopher://gopher.floodgap.com/1/new
> >
> > of new servers.
> 
> So, for individual gopher spaces on multi-user hosts such as sdf.org, do you
> suggest multiple robots.txt files?  Certainly there are many such spaces that
> have things like custom CGI scripts that probably shouldn't be crawled.

I don't have a good answer to that problem, no. I guess it never really got
solved for HTTP either, but it matters less there since everyone just gets
their own vhost (and with it their own robots.txt). I'm open to reasonable
suggestions that wouldn't overly complicate the bot.
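For anyone setting this up on their own server: a minimal sketch of how a
crawler could check a server-wide robots.txt over plain gopher (RFC 1436) --
connect to port 70, send the selector plus CRLF, read to EOF. The selector
name "robots.txt" and the host used here are illustrative assumptions, not
a statement of how the Floodgap bot is actually implemented.

  import socket

  def fetch_gopher(host, selector, port=70, timeout=10):
      # RFC 1436 transaction: send the selector followed by CRLF,
      # then read the response until the server closes the connection.
      with socket.create_connection((host, port), timeout=timeout) as sock:
          sock.sendall(selector.encode("ascii") + b"\r\n")
          chunks = []
          while True:
              data = sock.recv(4096)
              if not data:
                  break
              chunks.append(data)
      return b"".join(chunks)

  if __name__ == "__main__":
      # Assumed convention: the server-wide robots.txt lives at the
      # root-level selector "robots.txt".
      text = fetch_gopher("gopher.floodgap.com", "robots.txt")
      print(text.decode("utf-8", errors="replace"))

The per-user case on hosts like sdf.org is exactly what this doesn't cover,
since there is only one root selector per server.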

-- 
------------------------------------ personal: http://www.cameronkaiser.com/ --
  Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser@floodgap.com
-- Careful with that Axe, Eugene. -- Pink Floyd -------------------------------
