Getting Going With Google Sitemaps

Everyone knows about sitemaps.

Traditionally, a sitemap is a separate page where you include links to every public page on your website.

Sometimes they include brief descriptions of the different pages and the content they contain.

Sometimes they are nothing more than a long and somewhat generic list of page links. Some people create sitemaps with the sole purpose of giving their viewers a comprehensive web page directory. Some people create sitemaps simply to make certain the search engine crawlers find each and every available page on their website. And then came Google Sitemaps…

Like all search engine crawlers, GoogleBot is out there with the express purpose of gathering valuable data that can be added to its searchable index.

The sooner it can return with new and updated information, the better, both for Google and for the people who use its search engine. With that in mind, the Google Sitemaps service offers a twofold solution. First, it lightens GoogleBot’s burden of constantly crawling the same places over and over again looking for new and updated content. With a system that tells the bot when and where to crawl, a great deal of time is saved.

Spend Time Wisely

That time can be spent much more efficiently. Rather than wasting time on pages that have not been (and might never be) updated or changed, the bot can zero in on pages with valuable, current content that can be added to the search database. For webmasters, Google Sitemaps offers a way to send immediate notification whenever a change or addition takes place on their websites. This not only increases the chance of getting pages indexed faster; it also ensures that GoogleBot can easily locate the pages that are available and bypass any pages that aren’t meant to be public. As for the sitemap files themselves, there are two different types you can implement.

The first type is your typical list of individual pages (just like any other sitemap would display). The second type is used as an index, a file that points to the locations of your other sitemap files rather than to individual pages. Within a regular sitemap, each page entry can also carry a few optional tags:

Last Modified. Allows you to specify the exact time and date a page was last changed or updated. The value should conform to the ISO 8601 format (you can read the specification at http://www.w3.org/TR/NOTE-datetime). If you choose not to include the time, the format for the date alone is YYYY-MM-DD. March 9, 2006, for example, would be written as 2006-03-09.

Change Frequency. Allows you to specify how often a page will change or be updated. Valid values are always, hourly, daily, weekly, monthly, yearly, and never. Be aware, however, that the value is merely used as a guide and not a command; it’s possible for any given page to be crawled more or less frequently than the specified value.

Priority. Allows you to specify a number that tells how important you feel any page is in relation to all the other pages on your website. Valid values range from an absolute low of 0.0 to a maximum high of 1.0 (the default priority of a page is 0.5). Keep in mind that the priority you set has no bearing on what position your page achieves in the search results (if any). It merely tells GoogleBot which pages should be given the most importance when crawling your website.
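Putting those tags together, a minimal single-entry sitemap might look like the following sketch. The URL shown is a placeholder, and the 0.84 schema namespace reflects the protocol version Google documented at the time:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <!-- Required: the full URL of the page -->
    <loc>http://www.example.com/index.html</loc>
    <!-- Optional: ISO 8601 date (the time portion may be omitted) -->
    <lastmod>2006-03-09</lastmod>
    <!-- Optional: always, hourly, daily, weekly, monthly, yearly, or never -->
    <changefreq>weekly</changefreq>
    <!-- Optional: 0.0 to 1.0 (default is 0.5) -->
    <priority>0.8</priority>
  </url>
</urlset>
```

An index file follows the same idea, but uses `<sitemapindex>` and `<sitemap>` elements whose `<loc>` entries point at your individual sitemap files instead of at pages.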

To reduce bandwidth, you have the option of compressing your sitemap files using gzip. Uncompressed sitemap files cannot exceed ten megabytes. Naturally, if you have a relatively small website, managing your sitemap won’t be difficult or overly time consuming, though a program that automates updating and delivering the sitemap would still be beneficial. Of course, you probably don’t have just one small website. You most likely have (or will have at some point) numerous websites with hundreds if not thousands of pages each, and under those circumstances an automated system would definitely be an asset.
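As a rough sketch of the compression step, here is how it could be done in Python. The function name and the size check are illustrative, not part of any Google tool; only the gzip format and the ten-megabyte limit come from the sitemap documentation:

```python
import gzip
import os

# Google's limit on an uncompressed sitemap file (ten megabytes).
MAX_UNCOMPRESSED_BYTES = 10 * 1024 * 1024

def compress_sitemap(path):
    """Gzip a sitemap file (sitemap.xml -> sitemap.xml.gz), refusing oversized input."""
    if os.path.getsize(path) > MAX_UNCOMPRESSED_BYTES:
        raise ValueError(
            "Sitemap exceeds the 10 MB uncompressed limit; "
            "split it into multiple files and list them in a sitemap index."
        )
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        dst.write(src.read())
    return gz_path
```

The size check runs against the uncompressed file, since that is what the limit applies to; the compressed copy can then be uploaded in place of the plain XML.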

Sitemap Equalizer (http://www.sitemapequalizer.com) is the best program for doing that, especially if you want to make absolutely certain everything has been taken care of accurately and properly. It provides a powerful web spider that crawls your entire site beforehand, making certain there are no dead ends or traps where a search engine spider could get stuck in a loop, unable to access all of your pages.

For more information about Google’s sitemap service, check out the following pages on their website:

Google Sitemaps
http://www.google.com/webmasters/sitemaps/

Google Sitemaps Overview
http://www.google.com/webmasters/sitemaps/docs/en/navigation.html