Robots.txt implements the robots exclusion protocol, which is used to tell robots how to crawl your website and its pages. It is stored as a plain text file at the root of the site, and it can tell a robot which subpages to crawl and which subpages not to crawl.
Robots.txt is the protocol used by websites to communicate with web crawlers, or robots. Rather than describing your website and its content, the robots.txt file gives the crawler instructions about which parts of the site it may access.
Robots.txt is a text file that contains instructions for search engine robots. The file lists which webpages are allowed and which are disallowed from search engine crawling.
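As a rough sketch of what that looks like (the /private/ directory here is just an illustrative placeholder, not a required name), a robots.txt file can allow most of a site while blocking one section:

User-agent: *
Disallow: /private/
Allow: /

The "User-agent" line names which robot the rules apply to, with * matching all robots, and each "Disallow" or "Allow" line covers a URL path prefix.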
The robots.txt file is a text file that tells web robots (most often search engine crawlers) which pages of your website to crawl. A lone slash after “Disallow” tells the robot not to visit any pages on the site.
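For instance, these two lines tell every compliant robot to stay off the entire site:

User-agent: *
Disallow: /

Leaving the value after "Disallow:" empty has the opposite effect, telling robots that nothing is off limits.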