What is robots.txt, and why is it important?

Robots.txt is a plain text file used to instruct search engines how to crawl your website. Through this file you can set a crawl delay, block particular pages or directories, and block particular search engine bots entirely. It is the first thing a crawler looks for when it arrives at your site: if no robots.txt file is found on your server, search engines are free to crawl anything on your website.
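As a rough illustration, a minimal robots.txt using these directives might look like the sketch below (the bot name and paths are placeholders, not real examples from any site; note also that not every search engine honors Crawl-delay):

    # Ask all crawlers to wait 10 seconds between requests
    # and to stay out of the /private/ directory
    User-agent: *
    Crawl-delay: 10
    Disallow: /private/

    # Block a hypothetical crawler called "BadBot" from the entire site
    User-agent: BadBot
    Disallow: /

The first block applies to all crawlers; the second shows how a single named bot can be shut out completely with Disallow: /.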
 
The robots.txt is a simple text file on your website that informs search engine bots how to crawl and index your website and its pages. It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want.
 
Robots.txt is a simple text file webmasters create to instruct robots how to crawl and index pages on their website. It must be placed in the website's root directory, where it tells search engines which pages to index and which to keep out of the index.
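Because it lives in the root directory, crawlers always request it from a fixed location. For a placeholder domain such as example.com, that would be:

    https://www.example.com/robots.txt

A robots.txt file placed in any subdirectory will simply be ignored by crawlers.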
 