Tell me something about spiders and crawlers?

Spiders, crawlers, robots, and bots are different names for the same thing: programs developed by search engines such as Google to visit websites and examine their links, content, and other resources. Since there are billions of websites on the internet, these crawlers must move through them at high speed.
 
Bots, spiders and other crawlers hitting your dynamic pages can cause extensive resource (memory and cpu usage) and consequently high load on the server, and slow down your sites. How do you reduce server load from bots, spiders and other crawlers? By creating a "robots.txt" file at the root of your site/domain, you can tell search engines what content on your site they should and should not index. This can be helpful, for example, if you want to keep a portion of your site out of the Google search engine index.
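As a minimal sketch of the robots.txt approach described above (the `/private/` path and the delay value are just illustrations, not recommendations), such a file might look like:

```
User-agent: *
Disallow: /private/
Crawl-delay: 10
```

`User-agent: *` addresses all crawlers, `Disallow` asks them to skip the listed path, and `Crawl-delay` (a non-standard directive honored by some crawlers, though not by Googlebot) asks them to wait between requests, which can reduce server load. Note that robots.txt is advisory: well-behaved crawlers respect it, but it is not an access control mechanism.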
 
Both terms mean the same thing: a program devised and used by search engines to collect data online in order to provide results for search queries.
 
Another name for a web spider is a web crawler.

Web crawlers/web spiders are mainly used:

1)to create a copy of all visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
2)to automate maintenance tasks on a website, such as checking links or validating HTML code.
3)to gather specific types of information from web pages, such as harvesting e-mail addresses.
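The uses above all rest on the same core step: fetching a page and extracting its links so the crawler knows where to go next. Below is a minimal sketch of that link-extraction step using Python's standard-library `html.parser`, assuming the page HTML has already been downloaded as a string (the sample HTML is invented for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of all <a> tags seen while parsing."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag's attributes.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A downloaded page would normally come from an HTTP request;
# here a hard-coded snippet stands in for it.
page = '<html><body><a href="/about">About</a> <a href="https://example.com">Home</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about', 'https://example.com']
```

A real crawler would repeat this for each discovered URL, keeping a queue of pages to visit and a set of pages already seen, and would check robots.txt before fetching.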
 