What are spiders, crawlers and bots?

Spider - A spider is a program run by a search engine to build a summary of a website’s content (a content index). Spiders create a text-based summary of the content and record an address (URL) for each web page.
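
For illustration, here is a minimal sketch of a spider’s summarizing step, assuming Python with the requests and BeautifulSoup (bs4) libraries; the function name summarize_page and the 500-character cutoff are illustrative assumptions, not how any particular search engine works.

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical helper: fetch one page and build a spider-style record,
    # i.e. the page's URL plus a text-based summary of its content.
    def summarize_page(url):
        response = requests.get(url, timeout=10)
        soup = BeautifulSoup(response.text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        # Collapse the page to plain text, the kind of content a spider indexes.
        text = " ".join(soup.get_text().split())
        return {"url": url, "title": title, "text": text[:500]}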

Crawling - Crawling takes place when a crawler successfully fetches unique URIs, which it discovers by tracing valid links from other web pages.
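
The core of crawling is a fetch-and-follow loop. Below is a minimal sketch under the same Python/requests/bs4 assumptions; crawl, seed_url, and max_pages are hypothetical names, and a real crawler would also honor robots.txt and rate limits.

    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical crawl loop: fetch each unique URL only once and trace the
    # valid links on each fetched page to discover new URLs.
    def crawl(seed_url, max_pages=10):
        seen, queue = set(), [seed_url]
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue  # skip URLs that fail to fetch
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                queue.append(urljoin(url, a["href"]))  # resolve relative links
        return seen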

Bots - A bot is an automated computer program that visits websites, guided by search engine algorithms. It can combine the tasks of a crawler and a spider, which helps search engines index web pages.
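
Combining the two sketches above shows how a bot can play both roles: the crawl loop discovers pages, and the summarizing step indexes each one. Again, bot is a hypothetical name used only for illustration.

    # Hypothetical bot: crawl pages (crawler role) and summarize each one
    # (spider role), producing a small index of the visited pages.
    def bot(seed_url, max_pages=5):
        return [summarize_page(url) for url in crawl(seed_url, max_pages)]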
 