Google, in its own words, uses a huge set of computers to crawl billions of pages on the web. Its crawler, called Googlebot, begins with a list of web page URLs generated from previous crawls and then augments that list with sitemap data submitted by site owners through Google Search Console.
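The seeding step described above can be sketched in a few lines. This is a minimal illustration, not Google's implementation: the sitemap snippet, the URLs, and the `build_frontier` helper are all hypothetical, though the XML namespace is the standard one defined by the sitemaps.org protocol.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap; real sitemaps follow this same XML schema.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/new-page</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Extract the <loc> entries from a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

def build_frontier(previous_crawl, sitemap_xml):
    """Seed the crawl frontier with URLs from earlier crawls,
    then augment it with sitemap URLs, skipping duplicates."""
    frontier = list(previous_crawl)
    seen = set(frontier)
    for url in sitemap_urls(sitemap_xml):
        if url not in seen:
            seen.add(url)
            frontier.append(url)
    return frontier

previous = ["https://example.com/", "https://example.com/old-page"]
print(build_frontier(previous, SITEMAP_XML))
# → ['https://example.com/', 'https://example.com/old-page', 'https://example.com/new-page']
```

The key idea is simply that the sitemap does not replace the existing URL list; it extends it, so pages discovered in earlier crawls are still revisited even if they are absent from the sitemap.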