What is the function of the Robots.txt file?

Hi! robots.txt is a text file written in a syntax that search engines understand. It tells search engine spiders which parts of your website they may crawl.
 
The robots.txt file is a text file placed at the root of a website.
It contains instructions for bots crawling the site.
 
A robots.txt file sits at the root of your site and indicates the parts of the site you don’t want accessed by search engine crawlers.
 
The robots.txt file is used to give instructions to crawlers/searchbots, for example which pages or directories of the website we want them to crawl and which they should skip.

Its format is:

User-agent: * (* matches all search bots, such as Googlebot, Scooter, etc.)
Disallow: / (Disallow blocks searchbots from the named path; "/" blocks the whole site, while a specific path such as /wp-admin/ or /wp-includes/ blocks only that folder)
Allow: / (Allow explicitly permits searchbots to crawl the named path)
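 
As a rough illustration, a robots.txt for a WordPress-style site might look like the following (the paths and sitemap URL are placeholders, not settings from any particular site):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.example.com/sitemap.xml

Here every bot is blocked from the admin and includes folders, one admin file is explicitly re-allowed, and the optional Sitemap line points crawlers at the sitemap.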
 
Hi! robots.txt is a text file written in a syntax that search engines understand. It tells the crawler which pages to avoid and which pages to crawl and index.
 
robots.txt is a text file by means of which search engines understand which parts of your website should be crawled and which should not.
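 
If you want to check programmatically how a site's robots.txt applies to a URL, Python's standard library ships a parser for exactly this. A minimal sketch (the example.com URLs are placeholders):

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt (placeholder domain)
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # fetch and parse the file

    # Ask whether a given user agent may fetch a given URL
    print(rp.can_fetch("*", "https://www.example.com/wp-admin/"))  # False if /wp-admin/ is disallowed
    print(rp.can_fetch("*", "https://www.example.com/about/"))     # True if the path is not disallowed

This is the same check a well-behaved crawler performs before requesting a page.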
 