Robots.txt is a text file placed at the root of your website that tells search engine crawlers which pages they may crawl and which are disallowed. It is useful for controlling crawling activity on your site. The file follows the Robots Exclusion Protocol: you put a robots.txt file on your website to give indexing instructions to web robots, specifying which pages you would like them to crawl and which they should skip.
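As a sketch, a minimal robots.txt might look like the following. The specific paths (`/admin/`, `/private/`) and the sitemap URL are hypothetical examples, not requirements:

```
# Rules apply to all crawlers
User-agent: *
# Block these example directories from crawling
Disallow: /admin/
Disallow: /private/
# Everything else may be crawled
Allow: /

# Optional: point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules for a given crawler (`*` matches all of them), and `Disallow`/`Allow` lines list URL paths relative to the site root.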