What is a Web Server, and What is Crawling?

Googlebot (or any search engine spider) crawls the web to discover and process information. Until Google can absorb the web by osmosis, this discovery phase will remain essential. Based on data gathered during crawling, Google sorts and analyzes URLs to make indexing decisions.
 
A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine's index. The crawler for the AltaVista search engine, for example, was called Scooter.
A web server is a computer that hosts websites: it runs software that serves web pages as they are requested.
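To make "serves web pages as they are requested" concrete, here is a minimal sketch of a web server using Python's standard `http.server` module. The handler class name and the hello-page body are illustrative choices, not part of any real site; the script starts the server on a free local port, makes one request against it, and shuts it down.

```python
from http.server import HTTPServer, BaseHTTPRequestHandler
import threading
import urllib.request

class HelloHandler(BaseHTTPRequestHandler):
    """Answers every GET request with a small HTML page."""
    def do_GET(self):
        body = b"<h1>Hello from a tiny web server</h1>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port, so the demo never collides.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_address[1]
response = urllib.request.urlopen(url)
page = response.read()
print(response.status)  # → 200
server.shutdown()
```

This is exactly the request/response loop a crawler drives from the other side: it asks the server for a page, and the server hands back the HTML.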
 
A crawler is also known as a spider, or, in Google's case, Googlebot. It visits web pages and checks that everything on the site is in order; if it finds duplicate content, it may drop that page or site from the index. The search engine crawler visits each web page and identifies all the hyperlinks on the page, adding them to its list of pages to crawl.
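The last step, finding every hyperlink on a page and adding the new ones to the list of pages to crawl, can be sketched with Python's standard library. The page HTML and URLs below are made-up examples standing in for a real HTTP fetch, and `crawl_step` is a hypothetical helper name, not a real search engine API.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_step(page_url, page_html, frontier, seen):
    """Parse one fetched page; queue links we have not seen before."""
    parser = LinkExtractor()
    parser.feed(page_html)
    for href in parser.links:
        absolute = urljoin(page_url, href)  # resolve relative links
        if absolute not in seen:
            seen.add(absolute)
            frontier.append(absolute)

# Made-up page content standing in for a real fetch of example.com.
html = '<a href="/about">About</a> <a href="https://example.org/">Elsewhere</a>'
frontier, seen = deque(), set()
crawl_step("https://example.com/", html, frontier, seen)
print(list(frontier))  # → ['https://example.com/about', 'https://example.org/']
```

A real crawler repeats this step in a loop, popping the next URL off the frontier, fetching it from the web server, and feeding the HTML back through `crawl_step`; the `seen` set keeps it from visiting the same page twice.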
 