What is robots.txt?

Robots.txt is a text file webmasters create to instruct web robots "how to crawl pages on their website". The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.
 
Greetings!

In simple words, robots.txt is a text file containing instructions that tell search engines and their bots (Google, Bing, Yandex, etc.) which pages of your website property should be crawled and indexed.

There are certain directories and files that you don't want search engines to crawl and index.
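For example, a minimal robots.txt (placed at the root of your site) might look like the sketch below — the directory names and sitemap URL are hypothetical placeholders, not part of any real site:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` means the rules apply to all bots, and each `Disallow` line asks crawlers to stay out of the listed directory.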

However, even in the absence of a robots.txt file, search engines will still crawl all of your website property.

It is better to have a clear robots.txt in place to guide search engine bots toward the parts of your website you want indexed and appearing in search engine result pages (SERPs).

Regards,

Prashaant kd

365 digital marketing