No discussion of SEO strategy would be complete without mentioning how basic design elements can affect indexing and page rank. And in this area, what works best for Google generally applies to all search engines.
The first thing you need to understand is this…
When it comes to good optimization, the only audience that really matters – the only one you need to satisfy – is the search engine crawler.
Naturally, a clean design and proper navigation are important to your viewers. But great website presentation and performance aren’t much good if they don’t comply with search engine standards and requirements.
Unlike viewers, who experience your website from the outside, search engine crawlers only get to experience it from the inside, by following the source code from top to bottom.
Take a look, for example, at how Google sees your site.
And crawlers are on a specific mission… to locate information that will help index any given page.
Everything in its place
If everything is laid out properly, the crawler will have no problem locating keywords that have been deliberately and properly placed within its path.
That allows the crawler to accurately index your web page, which, of course, is what you ultimately want: web pages that are indexed according to the keywords that will provide you with the greatest benefit.
If the design is jumbled (or causes the source code to contain a large volume of unnecessary elements), there’s a good chance the crawler will never come up with a viable indexing choice. And since the crawler is always in a hurry, it’s not about to stick around for any extended length of time on your behalf.
If, on the other hand, the important information – the keywords you’ve carefully and painstakingly chosen – is located in all the right places and used in the proper context, a crawler won’t have a bit of difficulty determining exactly how that particular page should be indexed.
Quick Intro to On-Page SEO
Primarily, those crawler-friendly locations include the page title, clearly visible and high-placement heading (<h1>–<h6>) tags, and the first paragraphs and/or sentences of the main text content.
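As a rough sketch, here’s how those crawler-friendly locations might look in a page’s source. The keyword phrase (“organic coffee beans”) and all the text are hypothetical placeholders – the point is simply where the keywords sit:

```html
<html>
<head>
  <!-- Page title: one of the first things the crawler reads -->
  <title>Organic Coffee Beans - Fresh-Roasted and Fair Trade</title>
</head>
<body>
  <!-- High-placement heading tag reinforcing the same keyword phrase -->
  <h1>Why Our Organic Coffee Beans Taste Better</h1>

  <!-- The opening sentence of the main text uses the keyword in context -->
  <p>Our organic coffee beans are roasted in small batches the same
  day they ship, so every cup tastes fresh.</p>
</body>
</html>
```

Reading top to bottom, the crawler meets the same phrase in the title, a heading, and the opening paragraph – exactly the path described above.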
Should you ever consider incorporating the flashiest and most innovative techniques on your website, think again. Doing so is never going to impress or win favor from search engine crawlers. (It probably won’t even impress your human visitors.)
Following is a basic list of what most search engine crawlers can’t process (extract information from)…
• Image text
• Multimedia (such as flash and streaming video)
• Pages that require login or cookies
• PDF files
• Java applets / Flash files / Embedded content (but no one uses these nowadays, right?)
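Image text is a good illustration of the problem. Text baked into a picture is invisible to a crawler reading the source code; all it sees is the tag. A descriptive alt attribute (the file name and wording below are hypothetical) at least gives it something to index:

```html
<!-- What the crawler sees for an image: no readable text, just a tag -->
<img src="sale-banner.jpg">

<!-- A descriptive alt attribute gives the crawler indexable text -->
<img src="sale-banner.jpg" alt="Spring sale on organic coffee beans">
```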
In addition, most search engine crawlers have a hard time with things like frames and dynamically generated content (for example, URLs that include “?”, such as a hypothetical product.php?id=123).
If the crawlers can’t navigate your site (and remember, they’re navigating through the source code rather than the outside elements), they can’t properly index your website.
Worst-case scenario, they’ll leave prematurely and never wind up fully indexing your website.
In order to optimize your pages in such a way that you satisfy both human visitors and search engine crawlers, you need to do the following:
• utilize the best keywords for your topic or niche
• place keywords where they are most effective and advantageous
• use keywords in their proper context
• include the right number of keywords throughout all locations
As long as you accomplish that, you’ll have a website that’s not only people-friendly, but search-engine-friendly as well.
• The goal is to make your website more popular, more visible, more important than the competition.
• Although it’s not necessary to reach the number one search results position, you need to aim there in order to land anywhere near the top.
• If your description more closely matches what a viewer is searching for, they’ll go to your website first regardless of what your results position happens to be.
• It’s not exclusively about position. It’s about targeting a specific keyword and then making certain your website 1) ranks high for that keyword and 2) can deliver what the viewer is searching for.
• In order to compete with websites in top results positions, you need to find out what they’re doing and then do the same thing, only better.
• If you limit your optimization efforts to the top search engines and directories, you can cover the most important SEO bases simultaneously.
• Take advantage of all the free SEO webmaster tools that Google and other websites have available.
• Use Google Sitemaps to make certain the crawler finds all available pages, and to help get your pages indexed faster.
• Submit XML sitemaps so you can take advantage of the notification options such as the date a page was last modified and the frequency you anticipate a page will be changed or updated.
• Indicating priority only tells how important a page is in relation to all the other pages on your website. It has no bearing on what position your page will hold in search engine results.
• Use Sitemap Equalizer ( http://www.sitemapequalizer.com ) to create and manage all of your sitemaps.
• Don’t design your web pages for viewers only. Design them to help search crawlers easily and quickly locate the specific information and keywords that you want your page indexed for.
• Crawler-friendly locations include the page title, high-placement heading (<h1>–<h6>) tags, and the first paragraphs or sentences of the main text content.
• Most search engine crawlers can’t extract information from image text, multimedia such as Flash and streaming video, pages that require login, PDF files, XML, and Java applets.
• Most search engine crawlers have a difficult time with things like frames and dynamically generated content and pages.
• To satisfy both humans and crawlers, you need to utilize the best keywords, place keywords where they are most effective, use keywords in their proper context, and include the right number of keywords throughout.
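The sitemap options mentioned above – the last-modified date, the anticipated change frequency, and the relative priority – all live in a standard XML sitemap. Here’s a minimal sketch; the domain, dates, and values are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap: example.com URLs and values are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-06-01</lastmod>        <!-- date the page was last modified -->
    <changefreq>weekly</changefreq>      <!-- how often you anticipate changes -->
    <priority>1.0</priority>             <!-- importance relative to your own pages only -->
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2009-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Remember: that priority value only ranks your pages against each other – it has no bearing on your position in the search results.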