Emily
- Google has stated that its crawlers do not support fields other than those listed in its robots.txt documentation.
- This is one more step in the clear direction Google has been giving website owners and developers.
- With this update, any ambiguity should be eliminated: a website cannot count on unsupported directives being honored.
Stick to Supported Fields: Use only the fields that are explicitly mentioned in the Google documentation.
Review Existing Robots.txt Files: Make sure your existing robots.txt files don't contain unsupported directives.
Know Your Limits: Google's crawlers do not recognize third-party or custom directives.
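One way to act on these recommendations is a quick audit script. The sketch below (the sample robots.txt content and the helper name are illustrative, not from Google) flags any field in a robots.txt body that is outside Google's documented set:

```python
# Hypothetical robots.txt content to audit; "crawl-delay" is an example
# of a field Google's crawlers ignore.
robots_txt = """\
user-agent: *
crawl-delay: 10
disallow: /private/
allow: /private/public-page.html
"""

# The four fields Google documents support for.
SUPPORTED = {"user-agent", "allow", "disallow", "sitemap"}

def unsupported_fields(text):
    """Return the set of fields in a robots.txt body that Google does not support."""
    found = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if ":" in line:
            field = line.split(":", 1)[0].strip().lower()
            if field and field not in SUPPORTED:
                found.add(field)
    return found

print(unsupported_fields(robots_txt))  # flags {'crawl-delay'}
```

Running this against each of your sites' robots.txt files will surface directives that Google now explicitly says it ignores.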
Supported Fields:
According to the updated documentation, Google officially supports only the following fields in robots.txt:
- user-agent
- allow
- disallow
- sitemap
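As an illustration, a robots.txt file that sticks to only these four supported fields might look like this (the paths and sitemap URL are placeholders):

```
user-agent: *
disallow: /admin/
allow: /admin/help.html
sitemap: https://www.example.com/sitemap.xml
```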
Important Exemptions: