robots.txt is blocking all spiders

Robots.txt is a plain-text file, placed at the root of your site, that lists the pages and directories you want to keep search engine spiders away from. Blocked pages remain visible to human visitors, but compliant crawlers such as Googlebot will not crawl them, so they are typically never cached or indexed.
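As a sketch, a robots.txt that blocks every spider from the whole site looks like this (this blanket rule is the usual cause of the problem above):

```
# Applies to all crawlers; blocks the entire site
User-agent: *
Disallow: /
```

To allow crawling again, either remove the `Disallow: /` line or leave its value empty (`Disallow:` with no path permits everything for that user agent).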
 