Robots.txt is a plain text file that lets a website give instructions to web crawling bots.
Search engines like Google use these web crawlers, sometimes called web robots, to index and categorize websites. Most bots are configured to look for a robots.txt file on the server before reading any other file from the site. They do this to check whether the site's owner has any special instructions on how their site should be crawled and indexed.
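As a sketch of how a well-behaved bot interprets these instructions, the snippet below uses Python's standard-library `urllib.robotparser` to parse a minimal, hypothetical robots.txt (the rules and the `example.com` URLs are illustrative, not from the original text) and check which URLs a crawler may fetch:

```python
from urllib import robotparser

# A minimal, hypothetical robots.txt: all bots may crawl the site
# except anything under /private/.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# "*" means the rule applies to any user agent (bot).
print(rp.can_fetch("*", "https://example.com/index.html"))  # True: allowed
print(rp.can_fetch("*", "https://example.com/private/data"))  # False: disallowed
```

In practice a crawler would load the file from `https://<site>/robots.txt` (for example via `rp.set_url(...)` and `rp.read()`) rather than from an inline string.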