[OT] robots.txt creation script?
Hi all!
I've recently developed an interest in preventing spiders from accessing
certain areas of my site ... but as near as I can tell, robots.txt is
pretty limited. The standard only lets you *disallow* paths, whereas it
would be a lot more sensible for me to specify what I want to *allow*.
I was thinking I might hack up a little script to generate a robots.txt
file that disallows everything except the files I've listed, but first,
has anyone already done this or seen this done? I'd hate to reinvent
the wheel =)
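In case it helps anyone else thinking along the same lines, here's a
minimal sketch of the kind of generator I mean. It's just an illustration:
`site_root` and the allow-list are hypothetical inputs, and it only walks
the top level of the document root. Since standard robots.txt `Disallow`
rules are prefix matches, disallowing each top-level entry that isn't on
the allow-list effectively whitelists the rest:

```python
#!/usr/bin/env python3
"""Generate a robots.txt that disallows everything except an allow-list.

Sketch under assumptions: site_root is the web document root, and
allowed is a set of top-level file/directory names to leave crawlable.
"""
import os


def generate_robots(site_root, allowed):
    lines = ["User-agent: *"]
    for name in sorted(os.listdir(site_root)):
        if name in allowed:
            continue
        path = "/" + name
        # Disallow rules are path prefixes; a trailing slash scopes
        # the rule to the directory's contents.
        if os.path.isdir(os.path.join(site_root, name)):
            path += "/"
        lines.append("Disallow: " + path)
    return "\n".join(lines) + "\n"
```

One nice side effect: because the script only ever names what it finds on
disk minus the allow-list, the "hidden" directories never have to be
spelled out by hand in a config file.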
In my ideal world, robots.txt wouldn't require you to call out all of
the "hidden" directories on your site ... *sigh* ...
--
monique