The Best SEO Strategies in 2009

Search engine optimization, better known as SEO, is the single most important investment any website owner can make for their ongoing success. Every year the strategies that go into SEO become slightly more specific and slightly more nuanced. Search engines themselves are always changing, becoming more accurate and more effective at responding to specific user queries, and more selective about sites that may be spam. That’s why website owners everywhere need to reevaluate their search engine optimization strategies every year to make sure they are still employing the best strategies available.

Once upon a time, in the ’90s, SEO was very simple. You built your website. You added some meta tags. You chose keywords, probably repeating them over and over, hoping to draw traffic. Ten years ago and more, spider technology was relatively new, and users were easier to please. The Internet was sufficiently new that users were satisfied with simple matches that were just “kind of close” to what they were seeking.

In 2009, however, search engines have evolved to make search far more accurate. Google is the prime example. Ten years ago Google’s algorithm counted back-links and prioritized pages accordingly. You got more credit for a back-link from a desirable site, and, as the system developed, you were penalized for back-links from spam sites. Putting keyword anchor text into links would optimize your site for those keywords, even if they did not appear in your content.
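To make the back-link idea concrete, here is a minimal sketch of link-based ranking in the spirit of PageRank. The pagerank function, the damping factor, and the toy graph are illustrative assumptions for this article, not Google’s actual algorithm or its parameters.

```python
# A minimal sketch of link-based ranking in the spirit of PageRank.
# All names and numbers here are illustrative assumptions, not
# Google's real algorithm. Dangling pages are ignored for simplicity.

def pagerank(links, damping=0.85, iterations=20):
    """Rank pages by incoming links. `links` maps each page to the
    list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, targets in links.items():
            if targets:  # spread this page's rank across its out-links
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A back-link from a well-linked ("desirable") site passes on more value.
graph = {
    "hub.example": ["yoursite.example", "other.example"],
    "other.example": ["yoursite.example"],
    "yoursite.example": ["hub.example"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

Run on the toy graph, the page with the most incoming links from well-linked sites floats to the top, which is the intuition behind trading in keyword-rich back-links.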

Keyword-rich links from high-value external sites are still the most effective way to raise your page rankings on Google. Nowadays, however, content also counts. Google and the other search engines have been working hard to distinguish quality content from content that may not be spam, but isn’t original, informative, or really worth a visitor’s time. To a certain extent, your website still has to win a popularity contest, because back-links continue to count. But the new natural language filters don’t just differentiate between natural language and spam. They give you more credit as your content becomes more readable and original.

Certain language patterns appear in literature, in news reports, and in countless other sources online. Google and other engines have started distilling these word patterns and integrating them into their algorithms to rule out websites that are clearly keyword stuffing. This makes quality content far more valuable than it once was. Website owners could once optimize a page by stuffing it to a 7% keyword density. Nowadays, a 3 to 4% keyword density is optimal. More is not better. If the keyword density looks unnatural to a search engine, it will reject the page, even if it is not spam.
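As a rough illustration of that arithmetic, the sketch below counts keyword occurrences per total words and flags copy outside the 3 to 4% band recommended above. The function name and the naive tokenizer are assumptions for illustration only; real engines analyze language far more deeply than a word count.

```python
# A rough keyword-density check, assuming density is simply keyword
# occurrences divided by total word count. The 3-4% target comes from
# the article; the tokenizer is a deliberate simplification.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

copy = ("Our widgets are handmade. Each widget ships worldwide, "
        "and every widget carries a lifetime guarantee.")
density = keyword_density(copy, "widget")
print(f"Keyword density: {density:.1%}")
if not 0.03 <= density <= 0.04:
    print("Outside the 3-4% band: revise before publishing.")
```

Running this on the sample copy reports a density well above 4%, the kind of unnatural repetition the new filters are built to catch.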

The top strategy for SEO in 2009 is to write natural copy with just enough keyword density. Keyword-rich back-links from high page-rank sites are still important, and you still need 3 to 4% keyword density on every page. Stay within these limits and you will find your 2009 search engine results much improved.

Justin Harrison is a leading Internet Marketing consultant responsible for the Internet Marketing strategies behind some of the biggest online brands including Amazon, BBC, MasterCard and many others.
