Robots.txt is a plain text file used to instruct search engines how to crawl your website. With a robots file you can set a crawl delay, block particular pages, or block particular search engine bots. In short, it is the gatekeeper of your website, because a search engine looks for this file before crawling anything else. If it doesn't find a robots file on your server, it is free to crawl everything on your website.
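As an illustration, here is a minimal example robots.txt (the paths shown, such as /admin/, are placeholders; replace them with directories on your own site):

```
# Rules for one specific bot: keep it out of the whole site
User-agent: BadBot
Disallow: /

# Rules for all other bots
User-agent: *
# Block a particular directory and a particular page
Disallow: /admin/
Disallow: /private-page.html
# Ask bots to wait between requests (not honored by every crawler, e.g. Googlebot ignores it)
Crawl-delay: 10
```

The file must be placed at the root of your site (e.g. https://example.com/robots.txt) or crawlers will not find it. Note that robots.txt is advisory: well-behaved crawlers follow it, but it is not a security mechanism.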