What are spiders, crawlers, and bots?

Spider - A spider is a program, somewhat like a browser, that downloads web pages.

Crawler – A crawler is a program that automatically follows the links on a web page.

Robots - A robot is an automated computer program that can visit websites. It is guided by search engine algorithms and combines the tasks of a crawler and a spider, which helps search engines index web pages.
 
Spider - A spider is a program run by a search engine to build a summary of a website’s content (content index). Spiders create a text-based summary of content and an address (URL) for each webpage.

Crawling - Crawling takes place when unique URLs, discovered through valid links on other web pages, are successfully fetched (a minimal sketch of this loop appears after the definitions below).

Bots - Basically, bots are computer programs that can visit websites. They take instructions from Google's algorithms and combine the work of a crawler and a spider, which is very helpful for indexing web pages.
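
To make the crawling definition above more concrete, here is a minimal sketch of the fetch-and-follow loop in Python. It is only an illustration, not how any real search engine works: it fetches a page, collects its links, and only visits URLs it has not seen before. The start URL and the page limit are placeholders.

# Minimal crawl loop: fetch a page, parse its links, follow only unseen URLs.
from urllib.parse import urljoin
from urllib.request import urlopen
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # collect the href value of every anchor tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    seen = set()           # unique URLs already fetched
    queue = [start_url]    # URLs discovered but not yet visited
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue       # skip pages that cannot be fetched
        seen.add(url)
        parser = LinkParser()
        parser.feed(html)
        # turn relative links into absolute URLs and queue the new ones
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen

print(crawl("https://example.com"))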
 
Googlebot, Yahoo Slurp, MSNbot, and similar spiders, bots, and crawlers are the programs that harvest information for search engines. For anyone tracking statistics on their website, Googlebot, MSNbot, and Yahoo Slurp can be welcome guests.
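
These crawlers identify themselves in the User-Agent header of each request, so a site owner can spot their visits in the access logs. Below is a rough sketch of that idea; the sample user-agent strings are made-up examples and real log formats vary.

# Count visits from well-known search engine crawlers by User-Agent substring.
KNOWN_BOTS = {
    "Googlebot": "googlebot",
    "MSNbot": "msnbot",
    "Yahoo Slurp": "slurp",
}

def count_bot_visits(user_agents):
    counts = {name: 0 for name in KNOWN_BOTS}
    for ua in user_agents:
        for name, token in KNOWN_BOTS.items():
            if token in ua.lower():
                counts[name] += 1
    return counts

# Illustrative user-agent strings, not taken from a real log.
sample_user_agents = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "msnbot/2.0b (+http://search.msn.com/msnbot.htm)",
    "Mozilla/5.0 (compatible; Yahoo! Slurp)",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0",
]
print(count_bot_visits(sample_user_agents))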
 
A crawler is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "spider" or a "bot."
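
The "entry for a search engine index" mentioned above can be pictured as the page's URL paired with a plain-text summary of its content, as the earlier spider definition describes. Here is a small illustrative sketch of that pairing; the helper names are hypothetical and this is not any search engine's real pipeline.

# Build a simple index entry: the URL plus a text summary extracted from HTML.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # keep only non-empty text between tags
        text = data.strip()
        if text:
            self.chunks.append(text)

def make_index_entry(url, html, summary_length=200):
    extractor = TextExtractor()
    extractor.feed(html)
    text = " ".join(extractor.chunks)
    return {"url": url, "summary": text[:summary_length]}

entry = make_index_entry(
    "https://example.com/page",
    "<html><title>Example</title><body><p>Some page content.</p></body></html>",
)
print(entry)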
 