Googlebot is an automated Google program that reads through a webpage's source code and passes that information back to the search engine. It is used to cache and index webpages for Google Search.
Googlebot is Google's web crawling bot (sometimes also called a "spider"). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web.
A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine index. Every major search engine on the web has such a program, also known as a "spider" or a "bot."
In short, Googlebot is a software program designed to crawl, index, and rank websites so they can be returned in search results, and it performs several important tasks on the web, as the sketch below illustrates.
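To make the crawl-and-index loop concrete, here is a minimal sketch of what a crawler does, using only the Python standard library. The seed URL, the page limit, and the in-memory dictionary standing in for the "search engine index" are all illustrative assumptions; a real crawler like Googlebot is vastly more sophisticated (robots.txt handling, politeness delays, deduplication, ranking, and massive distribution across machines).

```python
# A toy crawler: fetch pages, extract text and links, build a tiny index.
# This is an illustrative sketch, not how Googlebot actually works.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkAndTextParser(HTMLParser):
    """Collects hyperlinks and visible text from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

def crawl(seed_url, max_pages=5):
    """Fetch pages breadth-first and build a URL -> page-text index."""
    index = {}                 # the "index": URL mapped to extracted text
    frontier = [seed_url]      # URLs discovered but not yet fetched
    seen = {seed_url}

    while frontier and len(index) < max_pages:
        url = frontier.pop(0)
        try:
            # Identify the crawler honestly via the User-Agent header.
            req = Request(url, headers={"User-Agent": "toy-crawler/0.1"})
            html = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue           # skip pages that fail to fetch

        parser = LinkAndTextParser()
        parser.feed(html)
        index[url] = " ".join(parser.text_parts)

        # Queue newly discovered links: this is how a crawler finds
        # new and updated pages to add to the index.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return index

if __name__ == "__main__":
    for url, text in crawl("https://example.com").items():
        print(url, "->", text[:60])
```

The breadth-first frontier is the key idea: each fetched page both contributes an index entry and feeds new URLs back into the queue, which is what lets a crawler discover pages it was never explicitly told about.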