Robots.txt: what exactly is it?

Your robots.txt file tells search engines which pages on your website they may access and which they may not. For example, if you specify in your robots.txt file that you don't want search engines to crawl your thank-you page, that page is far less likely to show up in search results, so web users won't stumble onto it. Keeping search engines away from certain pages is useful both for the privacy of your site and for your SEO.
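The thank-you page scenario above can be written as a short robots.txt sketch; the path /thank-you/ is a hypothetical example, not a rule every site needs:

```
User-agent: *
Disallow: /thank-you/
```

Here every crawler (User-agent: *) is asked not to crawl anything under /thank-you/, while the rest of the site stays open to crawling.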
 
Robots.txt is a plain text file placed in the root directory of your website. It is the first thing a search engine crawler checks to learn which pages it should crawl and which it should skip.

In your robots.txt file you can allow or disallow any path you choose.
Example:

User-agent: *
Disallow: /private/
Allow: /

"User-agent: *" means the rules apply to every crawler. "Disallow:" blocks crawlers from a path (here, everything under /private/), while "Allow:" permits one. Put the path you want to block after "Disallow:" and note that "Disallow: /" by itself would block your entire site.
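A quick way to sanity-check rules like these is Python's standard urllib.robotparser module; the rules and URLs below are hypothetical examples written for this sketch, not taken from any real site:

```python
# Minimal sketch: parse a robots.txt ruleset and test which URLs a
# crawler may fetch. The /thank-you/ path is a hypothetical example.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /thank-you/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any crawler ("*") may fetch the home page, but not the blocked path.
print(parser.can_fetch("*", "https://example.com/"))            # True
print(parser.can_fetch("*", "https://example.com/thank-you/"))  # False
```

Well-behaved crawlers run exactly this kind of check before fetching a page, which is why a single robots.txt line is enough to steer them away.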
 