Robots.txt is a plain text file you place at the root of your site to tell search engine robots which pages they may or may not crawl.
Your Robots.txt file tells search engines which pages on your website they should and should not crawl. For example, if you specify in your Robots.txt file that you don't want search engines to crawl a page, crawlers that honor the file will skip it, and in most cases that page will not be shown in search engine result pages. Keep in mind that Robots.txt is a publicly readable request, not an access control: a disallowed URL can still end up indexed if other sites link to it, so it should not be your only protection for sensitive content. Even so, keeping crawlers away from certain pages is important for managing what your site exposes and for your SEO.
Your Robots.txt file instructs these crawlers not to fetch the pages you list with a "Disallow" directive. For example, the following Robots.txt rules:
User-agent: *
Disallow: /quiztest
Disallow: /images
Disallow: /asp/demo_db_edit.asp
Disallow: *.htm
Disallow: *.html
Disallow: *.aspx$
Disallow: *.php$
…would prevent all compliant search engine crawlers from crawling the matching pages and directories on your website.
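To see how crawlers interpret rules like these, here is a small sketch using Python's standard-library robots.txt parser. The `example.com` domain is a placeholder, and only the plain path directives from the example above are used, because `urllib.robotparser` implements the original prefix-matching rules and does not support the "*" and "$" wildcard extensions honored by major search engines.

```python
from urllib.robotparser import RobotFileParser

# A subset of the directives from the example above (no wildcard lines,
# since the stdlib parser does prefix matching only).
rules = """\
User-agent: *
Disallow: /quiztest
Disallow: /images
Disallow: /asp/demo_db_edit.asp
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A disallowed prefix blocks that path and everything beneath it.
print(parser.can_fetch("*", "https://example.com/quiztest"))         # False
print(parser.can_fetch("*", "https://example.com/images/logo.png"))  # False
print(parser.can_fetch("*", "https://example.com/about"))            # True
```

In real use you would call `parser.set_url("https://example.com/robots.txt")` and `parser.read()` to fetch the live file instead of parsing an inline string.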