What is a robots.txt file?

"Robots" is a catch-all term for programs and automated scripts that crawl the web and collect data from websites and anything else on the Internet they can find. It is great when search engines frequently visit your site and index your content, but there are often cases where indexing parts of your online content is not what you want.
 
Before a search engine crawls your site, it looks at your robots.txt file for instructions on which parts of the site it is allowed to crawl (visit) and index (save in its search results). Robots.txt files are useful, for example, when you want search engines to ignore duplicate pages on your website. In short, robots.txt is intended for bots.
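As a minimal sketch of the instructions described above, a robots.txt file placed at the root of a site might look like this (the paths shown, such as /duplicate/, are hypothetical examples, not part of any real site):

```
# Rules for all crawlers
User-agent: *
# Ask crawlers to skip a hypothetical folder of duplicate pages
Disallow: /duplicate/
# Everything else remains crawlable by default
```

The `User-agent` line names which crawler the rules apply to (`*` means all), and each `Disallow` line lists a path prefix the crawler is asked not to visit. Note that these rules are advisory: well-behaved bots honor them, but nothing enforces them.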
 
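To illustrate how a well-behaved bot applies these rules, the sketch below uses Python's standard-library `urllib.robotparser` to check URLs against a small rule set. The rules and URLs here are hypothetical examples:

```python
from urllib import robotparser

# A hypothetical robots.txt, supplied inline instead of fetched from a site
rules = """
User-agent: *
Disallow: /duplicate/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# A bot would call can_fetch() before requesting each URL
print(parser.can_fetch("*", "https://example.com/duplicate/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

In a real crawler, the file would be fetched from `https://yoursite.com/robots.txt` (e.g. via `parser.set_url(...)` and `parser.read()`) rather than parsed from an inline string.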