easyarticles
New member
If a site owner wishes to give instructions to search engine robots, they place a text file called robots.txt in the root of the website's server. The instructions tell crawlers which pages to crawl and which not to crawl, and the same applies to folders.
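As a minimal sketch (the folder and page names here are just made-up examples), a robots.txt that lets all bots crawl the site but blocks one folder and one page might look like this:

User-agent: *
Disallow: /private-folder/
Disallow: /old-page.html
Allow: /

"User-agent: *" means the rules apply to all robots; each Disallow line names a path the robot should skip.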
A robots.txt file covers one website only. If you create several sub-domains, you have to create a robots.txt for each sub-domain. For example, if your main website is xyz.com and you create a robots.txt for xyz.com, and you then create a sub-domain d.xyz.com, the rules (robots.txt instructions) of xyz.com are not applied to d.xyz.com; you have to create a new robots.txt and upload it to the root of d.xyz.com.
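In other words, assuming the example domains above, crawlers would look for two completely separate files, each fetched from its own host root:

https://xyz.com/robots.txt     - rules for xyz.com only
https://d.xyz.com/robots.txt   - separate rules for d.xyz.com only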
For complete information about robots.txt, visit Robots exclusion standard - Wikipedia, the free encyclopedia.