Spiders, Robots and Crawlers

Search engine spiders, "robots," or "web crawlers" are small, automated software programs used by every search engine company. Think of the internet as a worldwide spiderweb, just like one in a dusty corner. Visualize the "spiders" that "crawl" across all the interconnected links, moving quietly, disturbing nothing, but visiting each and every corner of that web. (Spiders, the World Wide Web, crawling, robots... now you see how these "technical" terms evolved. Computer geeks are great at explanatory analogies.) Whatever they are called, their assigned job is to constantly roam the World Wide Web searching for new or updated web pages, and thousands of new pages are added every day.
A search engine spider is only a service tool.
Its function is to help search engines index, or "catalog," every website correctly. The content it comes across is stored in the engine's massive search index, where the indexed webpages are grouped according to their authority, trust, content quality, and number of backlinks.
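To make the backlink idea concrete, here is a minimal sketch in Python of counting backlinks over a toy link graph. The URLs and the graph itself are invented for illustration; real engines weigh many more signals than raw link counts.

```python
# A toy illustration of ranking pages by backlink count.
# The link graph below is hypothetical; real search engines
# combine many signals (trust, content quality, and so on).
link_graph = {
    "example.com/a": ["example.com/b", "example.com/c"],
    "example.com/b": ["example.com/c"],
    "example.com/c": [],
}

# Count backlinks: how many pages link *to* each page.
backlinks = {page: 0 for page in link_graph}
for source, targets in link_graph.items():
    for target in targets:
        backlinks[target] = backlinks.get(target, 0) + 1

# Rank pages by backlink count, most-linked first.
for page, count in sorted(backlinks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {count} backlink(s)")
```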
 
"Spider," "robot," and "crawler" are simply different names for the same thing: an automated program that reads through a webpage's source code and reports what it finds back to the search engine, helping it crawl and index the page.
 
Bots are programs commonly used on the web for quickly indexing and collecting data and information. Broken down to its core element, a bot is software programmed to perform legitimate or malicious automated tasks more quickly and efficiently than humans can. Studies suggest bots now dominate web traffic, with roughly three out of five visitors to your site being bots.

A search engine crawler is a program or automated script that browses the World Wide Web in a methodical manner in order to provide up-to-date data to its search engine. While search engine crawlers go by many different names, such as web spiders and automatic indexers, the job of the search engine crawler is always the same.
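That "methodical manner" usually means working through a frontier of URLs one by one, queuing up each new link as it is discovered. Below is a minimal breadth-first crawling sketch in Python; it assumes the third-party `requests` and `beautifulsoup4` packages, and the seed URL is just a placeholder. A real crawler would also honor robots.txt and crawl-delay rules.

```python
# A minimal breadth-first crawler sketch. It uses the standard
# library plus the widely used third-party `requests` and
# `beautifulsoup4` packages. The seed URL is a placeholder; a
# production crawler would also respect robots.txt, rate limits,
# and politeness delays.
from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=10):
    frontier = deque([seed_url])  # URLs waiting to be visited
    seen = {seed_url}             # never queue the same URL twice
    visited = 0
    while frontier and visited < max_pages:
        url = frontier.popleft()
        try:
            response = requests.get(url, timeout=5)
        except requests.RequestException:
            continue  # skip unreachable pages
        visited += 1
        print("Visited:", url)
        # Pull out every link on the page and queue the new ones.
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if link not in seen:
                seen.add(link)
                frontier.append(link)

crawl("https://example.com")
```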

A search engine spider is a program that works behind the scenes to deliver up-to-date web search results to users through a search engine. The spider can work continuously or respond to user events. Typically, it uses specific, ordered methods to scan web text and index pages for search engine rankings.
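In practice, "indexing pages" means building an inverted index: a map from each word to the pages that contain it, which is what makes lookups fast. Here is a toy sketch in Python; the page texts are invented, and real indexers add stemming, stop-word handling, and ranking on top of this basic structure.

```python
# A toy inverted index: the core data structure a spider feeds.
# The page texts here are made up for illustration.
from collections import defaultdict

pages = {
    "example.com/a": "search engine spiders crawl the web",
    "example.com/b": "spiders index pages for the search engine",
}

# Map each word to the set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

# A query returns the pages containing every query term.
def search(query):
    terms = query.lower().split()
    if not terms:
        return []
    return sorted(set.intersection(*(index[t] for t in terms)))

print(search("search spiders"))  # -> ['example.com/a', 'example.com/b']
```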
 