Bots, spiders, and other crawlers hitting your dynamic pages can consume significant resources (memory and CPU), driving up server load and slowing down your sites. How do you reduce server load from bots, spiders, and other crawlers? By creating a "robots.txt" file at the root of your site/domain, you can tell search engines which content on your site they should and should not index. This can be helpful, for example, if you want to keep a portion of your site out of the Google search engine index.
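
As a minimal sketch, a robots.txt might look like the following. The paths shown (/cgi-bin/, /search/) are placeholders for whatever dynamic sections of your own site you want to keep crawlers away from, and note that only well-behaved crawlers respect these rules:

    # Apply the rules below to all crawlers.
    User-agent: *

    # Keep crawlers out of resource-heavy or private areas
    # (example paths only; substitute your own).
    Disallow: /cgi-bin/
    Disallow: /search/

    # Ask crawlers that support the non-standard Crawl-delay
    # directive (e.g. Bing, Yandex; not Google) to wait 10
    # seconds between requests.
    Crawl-delay: 10

To block crawling of the entire site, a single "Disallow: /" line under "User-agent: *" is enough, while an empty "Disallow:" line permits everything.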