Hello Debian:
This is Kaushal Kurapati from Ask Jeeves. I am a Senior Search Product Manager here and wanted to speak to you about a crawler-blocking issue. On bugs.debian.org, we notice that there is a "Disallow" directive in your robots.txt that blocks our crawler from accessing pages on your site.
As you might know, Ask Jeeves search reaches 29M users (according to comScore, following our ISH acquisition), and in the overall online space the Ask Jeeves + Excite network ranks #6 in site traffic, right behind eBay. Our goal is to provide the most relevant content to web searchers, and since Debian.org is a major open source/Linux website, we are very interested in having access to the relevant content on your site.
We would like to request crawler access to your website. Currently our crawlers are blocked through your robots.txt file. Please let me know what you need from us to unblock our crawlers (crawler/agent name, etc.).
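For reference, a minimal sketch of the kind of robots.txt change that would grant access to one crawler while leaving the existing rules in place. The user-agent token "Teoma" is assumed here as our crawler's name; we would confirm the exact token for you:

```
# Hypothetical robots.txt excerpt.
# "Teoma" is assumed as the Ask Jeeves crawler's user-agent token;
# we would confirm the exact token before you make any change.
User-agent: Teoma
Disallow:

# All other crawlers keep the existing restrictions.
User-agent: *
Disallow: /
```

An empty `Disallow:` line under a specific `User-agent` record permits that crawler everywhere, since the more specific record takes precedence over the wildcard record.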
I would greatly appreciate your help in this matter.
Thanks,
Kaushal
-----------------------------------------------------------------------------------
Kaushal Kurapati
Sr. Product Manager
Ask Jeeves, Inc.
1501 So. Washington Drive, Piscataway, NJ 08854
[work]: 732-907-3016
[cell]: 914-572-6157
email : kkurapati@askjeeves.com