What are Spiders, Robots and Crawlers and what are their functions?

These three terms mean the same thing: they are automated programs that search engines use to discover and crawl web pages for indexing.

Search engines need to keep their databases up to date, so they run these programs continuously. A spider starts from known pages, follows links from site to site to find new pages and content, and collects information about each page (what the page is about) so the search engine can store it in its index.
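The crawl-and-index loop described above can be sketched in a few lines. This is a minimal illustration, not a real search engine spider: the `SITE` dictionary is a hypothetical in-memory stand-in for HTTP fetches, and `crawl` does a breadth-first walk that extracts links and records what it finds, just as the answer describes.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags — the 'finding new links' step."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical in-memory "web": URL -> HTML, standing in for real HTTP requests.
SITE = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": '<a href="/">Home</a>',
    "/blog": '<a href="/blog/post-1">Post 1</a>',
    "/blog/post-1": '<a href="/">Home</a>',
}

def crawl(start):
    """Breadth-first crawl: fetch a page, index it, queue unseen links."""
    index = {}                    # the search engine's "database"
    seen = {start}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        html = SITE.get(url)      # a real spider would do an HTTP GET here
        if html is None:
            continue
        parser = LinkExtractor()
        parser.feed(html)
        index[url] = parser.links # record what we learned about the page
        for link in parser.links:
            if link not in seen:  # avoid re-crawling the same page
                seen.add(link)
                queue.append(link)
    return index

index = crawl("/")
print(sorted(index))  # → ['/', '/about', '/blog', '/blog/post-1']
```

Real crawlers add politeness on top of this loop, such as honoring each site's robots.txt and rate-limiting requests, but the core idea is the same queue of discovered links.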
 