What is a robots.txt file?

In a nutshell: website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. ... The "User-agent: *" line means the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site.
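For example, a minimal robots.txt using exactly the two directives quoted above would block every robot from the entire site. It lives at the root of the domain (here https://example.com is just a placeholder):

    # https://example.com/robots.txt
    # Applies to all robots; blocks the whole site
    User-agent: *
    Disallow: /

To block only part of a site, the Disallow path is narrowed instead, e.g. "Disallow: /private/" keeps crawlers out of that directory while leaving the rest of the site crawlable.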
 
"Robots" is a catch-all term for programs and automated scripts that crawl the web, collecting data from websites and anything else they can find on the Internet. It is great when search engines frequently visit your site and index your content, but there are often cases where you do not want parts of your online content indexed.
 
Robots.txt is a plain-text file containing instructions for search engine robots. It lists which webpages crawlers are allowed and disallowed from visiting.
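As a quick illustration of how these rules are consumed in practice, Python's standard-library urllib.robotparser module can parse a site's robots.txt and answer whether a given crawler may fetch a URL. A minimal sketch, assuming a hypothetical site at https://example.com and a crawler identifying itself as "MyBot":

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt and download it
    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Ask whether "MyBot" is allowed to fetch a specific page
    if parser.can_fetch("MyBot", "https://example.com/private/page.html"):
        print("Allowed to crawl")
    else:
        print("Disallowed by robots.txt")

Well-behaved crawlers perform a check like this before requesting each page; robots.txt is advisory, so compliance depends on the robot honoring it.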
 