A robots.txt file tells search engine crawlers which pages on a website they may or may not crawl. Placed at the root of the site, it lists rules that allow or disallow access to specific paths, giving the site owner control over how crawlers (also called spiders or bots) move through the site.
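As an illustration, a minimal robots.txt might look like the sketch below. The `/private/` path and the sitemap URL are hypothetical placeholders, not values from any real site:

```
# Rules apply to all crawlers
User-agent: *
# Block crawling of a hypothetical private section
Disallow: /private/
# Everything else may be crawled
Allow: /

# Optional: point crawlers to the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file is served as plain text at the site root (for example, `https://www.example.com/robots.txt`), and well-behaved crawlers fetch it before crawling the site.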