What is robots.txt?

Robots.txt is the file where you specify which pages of your site should be indexed and which should not; you can control this entirely through robots.txt.

Through robots.txt we inform search engines which pages should be indexed and which should not appear in search engine results. Whenever a search engine bot crawls your website, the first thing it looks for is robots.txt, so that is where we can control its access.
 
Robots.txt is a text file containing instructions for search engine robots. The file lists which webpages are allowed and which are disallowed from search engine crawling.
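For example, a robots.txt file with allow and disallow rules might look like this (the paths shown are illustrative, not from any real site):

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/

User-agent: Googlebot
Disallow: /no-google/
```

The `User-agent` line names which robot the rules apply to (`*` means all robots), and each `Disallow` or `Allow` line gives a URL path prefix.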
 
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
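To see how a crawler interprets these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs below are illustrative examples, not taken from a real site:

```python
from urllib import robotparser

# Illustrative robots.txt content (normally fetched from the site root).
rules = """
User-agent: *
Disallow: /private/
Allow: /public/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Ask whether a given user agent may fetch a given URL.
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```

This is the same check a well-behaved crawler performs before scanning a page: if `can_fetch` returns False, the page is skipped.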
 