Today, Google, Yahoo!, and Microsoft have come together to post details of how each of them supports robots.txt and the robots meta tag. While their posts use terms like “collaboration” and “working together,” they haven’t joined forces to implement a new standard (as they did with sitemaps.org). Rather, they are simply making a joint stand in messaging that robots.txt is the standard way of blocking search engine robot access to web sites.

They have identified a core set of robots.txt and robots meta tag directives that all three engines support. Google and Yahoo! already supported and documented each of the core directives, and Microsoft supported most of them before this announcement. In their posts, they also list the directives they support that may not be supported by the other engines.

For robots.txt, they all support:

- Disallow
- Allow
- Use of wildcards
- Sitemap location

For robots meta tags, they all support:

- noindex
- nofollow
- noarchive
- nosnippet
- noodp

With this announcement, Microsoft appears to be adding support for the use of * wildcards (which will go live later this month) and the Allow directive. The biggest discrepancy is with the crawl-delay directive. Yahoo! and Microsoft support it, while Google does not (although Google does…
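To make the jointly supported directives concrete, here is a sketch of a robots.txt file that uses all four of them. The paths and the sitemap URL are hypothetical, chosen only for illustration:

```
# Hypothetical robots.txt illustrating the four jointly supported directives
User-agent: *
Disallow: /private/           # block this directory for all crawlers
Allow: /private/overview.html # exception within the blocked directory
Disallow: /*.pdf$             # wildcard pattern: block URLs ending in .pdf

Sitemap: https://www.example.com/sitemap.xml
```

The jointly supported robots meta tag directives would appear in a page’s `<head>` along these lines (again, a hedged example rather than any engine’s official snippet):

```html
<!-- Hypothetical page-level directives using the shared meta tag values -->
<meta name="robots" content="noindex, nofollow, noarchive, nosnippet, noodp">
```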
Yahoo!, Google, Microsoft Clarify Robots.txt Support, posted June 2, 2008.