Google Sitemaps?

alexjorge

New member
Google Sitemaps is an experiment in Web crawling that uses Sitemaps to inform and direct Google's search crawlers. Webmasters can place a Sitemap-formatted file on their Web server, which enables Google's crawlers to find out what pages are present and which have recently changed, and to crawl the site accordingly.
 
This sounds interesting. Does anyone here have any experience with this? It apparently helps with the Google bots, and I am wondering if it will also help with other crawlers such as Yahoo's or Bing's. The question is whether Google will only crawl the pages listed in the sitemap, or whether it will continue to search for others not listed there.
 
Google sitemaps?
> Google Sitemaps are XML files that list the URLs available on a site.
> You should try to make all pages on your site easily accessible to search engines even without the use of Google Sitemaps.
> The aim is to help site owners notify search engines about the URLs on a website that are available for indexing.
 
The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A Sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs in the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
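For illustration, a minimal sitemap in the sitemaps.org XML format looks like the sketch below; the URLs and metadata values are made-up placeholders, but loc, lastmod, changefreq, and priority are the tags the protocol defines.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page the crawler should know about -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-05-01</lastmod>       <!-- when the page was last updated -->
    <changefreq>weekly</changefreq>     <!-- how often it tends to change -->
    <priority>0.8</priority>            <!-- importance relative to other URLs on the site -->
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>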

Sitemaps are particularly beneficial on websites where:

some areas of the website are not available through the browsable interface,
webmasters use rich Ajax, Silverlight, or Flash content that is not normally processed by search engines,
the site is very large and there is a chance the web crawlers will overlook some of the new or recently updated content,
the site has a huge number of pages that are isolated or not well linked together, or
the site has few external links.
 
A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content. Search engine web crawlers like Googlebot read this file to more intelligently crawl your site.

Also, your sitemap can provide valuable metadata associated with the pages you list in that sitemap. Metadata is information about a webpage, such as when the page was last updated, how often the page is changed, and the importance of the page relative to other URLs in the site.
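If the site is large or changes often, the file is usually produced by a script rather than written by hand. Here is a minimal sketch in Python using only the standard library; the page list, dates, and output file name are made-up placeholders.

# Minimal sitemap generator sketch (Python standard library only).
# The URLs, dates, change frequencies, and priorities are illustrative placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

pages = [
    # (URL, last modified, change frequency, priority)
    ("https://www.example.com/",      "2012-05-01", "weekly",  "0.8"),
    ("https://www.example.com/about", "2012-01-15", "monthly", "0.5"),
]

urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
for loc, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

# Write the file that crawlers will fetch, e.g. at https://www.example.com/sitemap.xml
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)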
 
Hello alexjorge,

Thank you for sharing this valuable information.
I think search engines are quite smart about finding a sitemap. Whenever you publish new content, a ping can be sent to Google and Bing to inform them about changes in your sitemap. However, I would still recommend submitting your sitemap to Google through Google Webmaster Tools. The benefit of this is that Google will then show you any errors, the pages indexed, and other relevant stats that will be helpful to you as a webmaster.
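To make the discovery part concrete: the sitemaps.org protocol allows a line in robots.txt that points crawlers at the file, and Google and Bing have historically also accepted a simple ping request. The sitemap URL in both sketches below is a made-up placeholder.

# in robots.txt
Sitemap: https://www.example.com/sitemap.xml

# ping requests (an HTTP GET to either URL)
http://www.google.com/ping?sitemap=https://www.example.com/sitemap.xml
http://www.bing.com/ping?sitemap=https://www.example.com/sitemap.xml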
 
 