The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs in the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
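A sitemap is an ordinary XML document served from the site. A minimal sketch follows (example.com is a placeholder; the element names and namespace come from the protocol specification at sitemaps.org):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- <loc> is the only required child: the page's full URL -->
        <loc>https://www.example.com/</loc>
        <!-- The remaining fields are optional hints for crawlers -->
        <lastmod>2024-01-15</lastmod>    <!-- placeholder date, W3C Datetime format -->
        <changefreq>weekly</changefreq>  <!-- always, hourly, daily, weekly, monthly, yearly, or never -->
        <priority>0.8</priority>         <!-- 0.0 to 1.0, relative to the site's other URLs; default 0.5 -->
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
      </url>
    </urlset>

Each url entry maps one URL to its metadata; search engines treat the optional fields as advisory hints rather than binding instructions.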
Sitemaps are particularly beneficial on websites where:
some areas of the website are not available through the browsable interface,
webmasters use rich Ajax, Silverlight, or Flash content that is not normally processed by search engines,
the site is very large, so web crawlers may overlook some new or recently updated content,
pages are isolated or not well linked to one another, or
the site has few external links pointing to it.
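Because Sitemaps complement rather than replace robots.txt, the protocol also defines a way to advertise a sitemap from that file, so crawlers can discover it without the webmaster submitting it to each search engine individually. A minimal robots.txt entry doing so (the URL is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml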