Google Updates Robots.txt Policy: Unsupported Fields Are Ignored

Emily

  • Google has stated that its crawlers do not support fields other than those listed in its robots.txt documentation.
  • This is another step in the clearer guidance Google has been giving to website owners and developers.
  • The update removes any ambiguity: a website cannot count on unsupported directives being honored.
What This Means:

Stick to Supported Fields:
Use only the fields that are explicitly listed in Google's documentation.

Review Existing Robots.txt Files: Go through existing robots.txt files to make sure they don't contain unsupported directives; a quick scan like the sketch after this list can help.

Know Your Limits: Google's crawlers do not recognize third-party or custom directives.
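
One way to do that review is a small script that flags any field Google does not list. This is a minimal sketch, not an official tool: it assumes a local file named robots.txt, and the find_unsupported_fields helper and its output format are made up for illustration.

```python
# Sketch: flag robots.txt lines whose field is not in Google's supported set.
SUPPORTED_FIELDS = {"user-agent", "allow", "disallow", "sitemap"}

def find_unsupported_fields(robots_txt: str):
    """Return (line_number, field) pairs for fields Google does not support."""
    unsupported = []
    for number, raw_line in enumerate(robots_txt.splitlines(), start=1):
        line = raw_line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue  # skip blank and malformed lines
        field = line.split(":", 1)[0].strip().lower()
        if field not in SUPPORTED_FIELDS:
            unsupported.append((number, field))
    return unsupported

if __name__ == "__main__":
    with open("robots.txt", encoding="utf-8") as f:
        for number, field in find_unsupported_fields(f.read()):
            print(f"line {number}: '{field}' is not a field Google supports")
```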

Supported Fields:

According to the updated documentation, these are the only fields Google supports in robots.txt:

  • user-agent
  • allow
  • disallow
  • sitemap
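
To illustrate, a minimal robots.txt limited to these fields might look like the following; the paths and sitemap URL are placeholders:

```
# Example robots.txt using only the fields Google documents
User-agent: *
Disallow: /private/
Allow: /private/public-page.html

Sitemap: https://www.example.com/sitemap.xml
```
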
Important Exceptions:
Although the documentation does not call it out explicitly, this means Google does not support "crawl-delay", one of the most widely used directives, even though other search engines honor it. Google is also phasing out support for the 'noarchive' directive.
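
As a hypothetical illustration, a rule like the one below would simply be ignored by Google's crawlers, while search engines that still understand crawl-delay may apply it:

```
User-agent: *
Crawl-delay: 10
# Google ignores the Crawl-delay line above; some other crawlers may honor it.
```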
 