Robots.txt is important for controlling how search engines and other bots crawl a website (and, indirectly, what ends up indexed). What do you consider best practices for a robots.txt file? Should I allow every bot to crawl the site, or should I only allow certain crawlers or search engines to visit it?
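For context, here is a rough sketch of the two approaches I'm weighing; the bot names are just examples, not a recommendation:

```
# Option 1: allow every bot to crawl everything
User-agent: *
Disallow:

# Option 2: allow only specific crawlers, block everything else
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /
```

As I understand it, a crawler follows the most specific `User-agent` group that matches it, so in option 2 Googlebot and Bingbot would ignore the catch-all block, while unlisted bots would be disallowed entirely. Is that second approach ever a good idea, or does it cause more problems than it solves?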