Fwd: Problems with indexing your website content.
Hi Don & Blars,
Could you please look into this?
Martin Zobel-Helas <firstname.lastname@example.org> | Debian System Administrator
Debian & GNU/Linux Developer | Debian Listmaster
GPG key http://go.debian.net/B11B627B |
GPG Fingerprint: 6B18 5642 8E41 EC89 3D5D BDBB 53B1 AC6D B11B 627B
--- Begin Message ---
I'm a representative of the leading search engine in Russia, Yandex LLC (http://www.yandex.com). We think that the content of your site is very important and would be very useful to the users of our search engine. At the moment, however, our crawler (User-agent: Yandex) is not allowed to index your site bugs.debian.org (in robots.txt). We would be grateful if you could let us know whether our crawlers violate any of your policies. What are the reasons for blocking them? What can we do to get access for our crawlers to index your content?
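For reference, a block of this kind in robots.txt typically looks like the following. This is a hypothetical sketch of the rule the message describes, not the actual bugs.debian.org configuration, which may differ:

```
# Hypothetical robots.txt entry blocking the Yandex crawler site-wide
# while leaving other crawlers unaffected (illustrative only)
User-agent: Yandex
Disallow: /
```

Per the Robots Exclusion Protocol, a `Disallow: /` under a matching `User-agent` group tells that crawler not to fetch any path on the site.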
Once your site has been indexed by Yandex, it will appear in relevant search results not only on yandex.com (with users all over the world), but also in our national searches (tens of millions of users from Russia, Ukraine, Kazakhstan, and Belarus will be able to find your site). Yandex is Russia's largest and the world's seventh-largest search engine and web portal, with a workday audience of more than 19 million unique visitors (as of May 2010) from all over the world.
Sincerely yours, Platon
Yandex customer support
--- End Message ---