The X-Robots-Tag is a powerful tool for controlling search engine bots. It offers a great deal of flexibility and fuller control over how bots crawl and index your website's content, even more than the robots.txt file and the meta robots tag.
The X-Robots-Tag is sent as an element of the HTTP header response for a given URL, and it allows you to hide pages, sub-domains, or any other content from a search engine spider. Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag.
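Because the directive travels in the HTTP response rather than in HTML, it works for non-HTML resources such as PDFs or images. Below is a minimal sketch of setting the header in a Python Flask route; the route path, file type, and directives are illustrative assumptions, not taken from the original text.

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/private-report.pdf")
def private_report():
    # Build the response as usual, then attach the X-Robots-Tag header.
    response = make_response(b"%PDF-1.4 ...")  # placeholder PDF bytes
    response.headers["Content-Type"] = "application/pdf"
    # Ask crawlers not to index this URL and not to follow links from it.
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

if __name__ == "__main__":
    app.run()
```

The same header can be added at the web server level (for example in Apache or Nginx configuration) instead of in application code, which is often more practical for whole directories or file types.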
If the robots.txt file (or the absence of one) has given permission to crawl a page, that page is by default treated as crawlable, indexable, and archivable, and its content is approved for use in the snippets that appear in search results, unless permission is specifically denied in a robots meta tag or an X-Robots-Tag.
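One practical consequence is that you can check whether a URL opts out of these defaults simply by inspecting its response headers. The sketch below uses Python's standard library to fetch the X-Robots-Tag header for a URL; the example address is a placeholder.

```python
import urllib.request


def get_x_robots_tag(url: str):
    """Return the X-Robots-Tag header value for a URL, or None if absent."""
    request = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(request) as response:
        return response.headers.get("X-Robots-Tag")


# A page that returns no X-Robots-Tag (and has no restrictive meta robots tag)
# falls back to the defaults described above: indexable and snippet-eligible.
print(get_x_robots_tag("https://example.com/"))
```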