Finally: SEO Best Practices

Jeremy Pratte
September 26, 2008
Company News, Digital Marketing

Ever since search engines were invented, webmasters have been clamoring to get their sites listed among the first entries in search results, using techniques of variable credibility to make it happen. Remember hundreds of “keywords” stuffed between comment tags? Or invisible text, or the ridiculous repetition of certain words in the content?

Those were the old days of techniques that have since been termed “Search Engine Optimization,” or SEO.

At first, SEO really was about what was in the meta tags: what you had in your keywords and description. Filling those out was the only legitimate and reliable method, i.e. NOT spamming, which search engines have despised ever since those questionable techniques were recognized for what they are. For a while, in the late 1990s and early 2000s, the rule of thumb about SEO was that there was no rule of thumb: once a method proved effective and truly bumped websites to the top of the results, everybody else found out about it and adopted it, and it became ineffective simply by virtue of everybody doing it, too. This made it difficult to say with certainty what SEO best practices actually were.

But eventually the world of SEO stabilized, largely because the search engine field narrowed significantly. MSN, Yahoo!, and especially Google, arguably the king of search engines, have emerged as the industry leaders. Once a webmaster understands how each of those works, especially Google, the scope of SEO thankfully narrows: if you’re optimizing for them, it’s a pretty safe bet that you’re good to go on any other search engine. As a result, certain methods have become timeless, even though (unless a website pays for premium listings) who is at the top of the list can change every day. Heck, even every hour.

Today, SEO has become almost as important a step in a website’s development as the design and production themselves. Actually, it has become interwoven into the production process, not really a separate step. That’s the way it should be anyway. It’s so important that entire web production conferences have been held to discuss it; I actually attended one in December 2007. The techniques have become sophisticated and, at the same time, quite simple. SEO is still a paradox at times, but now in a different way. It’s almost an art, and to master it, you’ve got to know the new techniques to optimize your site for Google. Er, I mean, search engines.

The tried and true techniques are still around – the meta tags – but they have taken a back seat (waaaay back), even though it’s still a good idea to fill them out. Also important is what’s in the title tag. Not only can it influence how your website is spidered by search engine bots, but its content will usually become the website’s main reference in search results. And of course you don’t want to SPAM, the anti-SEO method, which will get your site blacklisted.
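As a quick sketch of those basics (the tag names are standard HTML; the site name and wording are made up for illustration):

```html
<head>
  <!-- The title tag often becomes the clickable headline in search results -->
  <title>Acme Bicycles | Hand-Built Touring Bikes</title>

  <!-- Meta tags have taken a back seat, but it's still good practice to fill them out -->
  <meta name="description" content="Hand-built touring bicycles, parts, and repair guides.">
  <meta name="keywords" content="bicycles, touring bikes, bike repair">
</head>
```

Keep the description short and honest; it frequently shows up as the snippet under your listing.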

But these old methods are joined by new ones. Some of them have actually been around since the beginning, but were only recently recognized as effective ways to beef up SEO.

So what are these new (or sometimes old) techniques, you might ask?

Let’s start with hyperlinks. Ever since there have been webpages, there have been hyperlinks. Linking between internal pages and between different websites has been crucial to the development of the web (where would search engines be without them?). Now, your site’s “linkability” is intrinsically tied to its SEO. This is one reason blogs rank so high in search results. One of the speakers at the conference I attended – Web Builder 2.0 (http://webbuilderconference.com) – said his blog about a bicycle actually ranked higher in the results than the company’s own website! One of the reasons was that blogs, and that one was no exception, place heavy emphasis on linking. There are hyperlinks on the right and left rails for various reasons (other similar things readers might be interested in, etc.), and there’s also tagging and links-a-plenty in the actual content of the posts, and even possibly in the comments.

But, why is that so important?

Google’s bots track which sites link to your pages (and from whence those links came) and which sites your pages link out to. And that boosts your SEO. Why? Google’s bot thinks this way: if your webpage links to other popular sites of value, that adds value to your page, especially if those sites link back to you. It’s like popularity by word-of-mouth. If other sites are talking about your site, that makes yours important, too. Another way the bots look at it: in terms of academia, your website is like a paper, and inbound links are like citations; as you might know, the more often a paper is cited, the more important it is considered. Or, at least, the more important it looks. So the lesson here is: as long as the links are relevant (again, NO SPAMMING!), link away!

What other techniques are there? There’s another thing that’s been around a long time but has only recently been recognized as important to SEO: alt-text on images.

Image alt-text – you know, those boxes of text that appear in some browsers when your mouse hovers over an image – used to be regarded by some as annoying, and sometimes it was even avoided (I used to work for a firm where alt-text on images was denounced). In the past, if it was used at all, alt-text was mostly seen as a way to improve a site’s accessibility, i.e. for blind users, who rely on programs that read the text aloud since they cannot see the site. Google and Yahoo!, as you might know, have image searches, but that is only one of the reasons alt-text is important. Why, you might ask? Well, think about it: search engine bots are blind, too! They need alt-text for the same reason blind users do. It essentially acts as a description of the image, and that means even more descriptive content on your webpages. Using alt-text is even one of Google’s Webmaster guidelines. Just remember, something as small as adding alt=”[text describing image]” to all your image tags can go a long way toward enhancing your website’s SEO.

Oh, but make sure the alt-text is clear and concise. If you stuff it with keyword spam, blind web surfers, who have to listen to their programs rattle off all of those words, will hunt you down and cane you. And of course the search engines will denounce your site, too.
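A quick sketch of good versus bad alt-text (the filename and wording are made up for illustration):

```html
<!-- Good: a clear, concise description of what's in the image -->
<img src="red-touring-bike.jpg" alt="Red touring bicycle leaning against a brick wall">

<!-- Bad: keyword stuffing, the kind of thing that gets a site blacklisted -->
<img src="red-touring-bike.jpg" alt="bike bikes bicycle bicycles cheap bikes best bikes buy bikes">
```

If you read the first one aloud, it sounds like a caption; if you read the second one aloud, it sounds like spam. That’s roughly the test a screen reader user, and a bot, will apply.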

The main point about alt-text is that it enhances your website’s content, and anything you can do to enhance that will boost SEO. And that brings us to what is arguably the most important thing:

Your site’s CONTENT.

Ah, now we come to the simplest and, ironically, maybe the most overlooked way to ratchet up your website’s SEO: good content. Good in its quantity but, more importantly, good in its quality. This is another area where blogs go on search-engine-ranking hot streaks. Their content often has more value than the marketing fluff you might find on a lot of corporate websites, for two reasons: one, it tells the user more about the product or service – the good and the bad – than the marketing-speak you normally find; two, it changes often.

Search engines loooove dynamic content. If you’re copywriting for a corporation’s website, there may not be much you can do about the marketing-ese. But you can make the site you produce more dynamic.

The bicycle blog I mentioned earlier also ranked high because its content was more valuable and dynamic than the monolithic, never-changing corporate website. It’s strange to think about, but bots get bored with sites whose text doesn’t change for long periods of time. That changed, though, for the bicycle company’s site: it is now more dynamic, its content is more manageable, and it actually ranks higher than the blog (at least last time I checked). Having some sort of content management for websites nowadays is essential. Static sites are going the way of the dinosaurs. At Roundedcube, we deal almost exclusively in producing websites whose content can be managed by our clients. We prefer the Sitecore Content Management System, of which we are a Certified Solutions Partner, one of the top in North America I might add.

Being able to easily manage content makes it easier to change it often, so websites are less likely to sit around collecting dust. Bots like having something new to look at, and a site with fresh content will be placed higher. More importantly, it could be ranked higher than the sites of your competitors, if they weren’t as smart as you in choosing a CMS, like Sitecore.

Or if they did not choose a web design and production company like Roundedcube who knows Content Management Systems and SEO well.

There are various other little things you can do to help SEO.

The “revisit-after” meta tag is an oft-overlooked tag intended to tell a search engine bot when to return to your site. The “robots” meta tag can be used if, for some reason, you do not want a page to be spidered, which can help the site’s SEO as a whole, depending on what’s on the page.

Use CSS to style the classic header tags (H1, H2) to your liking, so that you can use them for a page’s headers instead of images. Those tags tell search engines what the important content is and where it lives.

Even though it can be tricky, depending on how pretty it needs to be, building a website’s main navigation as an unordered list (ul and li elements) helps SEO, because you’re using actual text – as opposed to images – and lists are one of the things bots look for.

Use current HTML, like the strong and em tags for bold and italics, respectively, and ditch the old b and i tags. Search engine bots also use those tags to gauge keyword relevance in your content (although it may not affect your page rank overall), so it’s probably a good idea to bold and italicize with those HTML tags whenever possible instead of doing it in CSS, since search engine bots generally don’t look at CSS files.

Finally, having a custom 404 error page for your website is nice, in case a user accidentally gets lost or wanders into your site via a link on another website to a page that no longer exists.
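A sketch of the list-based navigation and styled headers described above (the class names, link targets, and site name are made up for illustration):

```html
<style>
  /* Style real header tags to your liking instead of swapping in images */
  h1 { font-family: Georgia, serif; color: #336699; }

  /* Turn a plain unordered list into a horizontal navigation bar */
  ul.nav { list-style: none; margin: 0; padding: 0; }
  ul.nav li { display: inline; margin-right: 1em; }
</style>

<h1>Acme Bicycles</h1>

<!-- Real text links in a real list: bots can read every word of this -->
<ul class="nav">
  <li><a href="/">Home</a></li>
  <li><a href="/products/">Products</a></li>
  <li><a href="/blog/">Blog</a></li>
</ul>

<p>Our <strong>hand-built</strong> touring bikes are <em>made to last</em>.</p>
```

With images turned off, this page still reads perfectly, which is a decent stand-in for how a bot sees it.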

Things to avoid in terms of SEO: all-Flash sites. If they’re mind-blowingly awesome enough, you’ll get plenty of traffic. But even though Adobe is working on rectifying the problem, an all-Flash site can still hurt SEO. Until recently, bots had no way of reading any of the content in a Flash movie. Alternate content, like the alt-text for images, can help with this. Here’s a helpful webpage in terms of Flash SEO: http://www.hochmanconsultants.com/articles/seo-friendly-flash.shtml
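One common way to provide that alternate content (a sketch; the movie file and text are made up for illustration) is to nest plain HTML inside the object tag. Browsers without Flash fall back to it, and search engine bots can read it even when they can’t read the movie:

```html
<object type="application/x-shockwave-flash" data="intro.swf" width="550" height="400">
  <param name="movie" value="intro.swf">
  <!-- Alternate content: shown when the Flash movie can't load, and readable by bots -->
  <h2>Acme Bicycles</h2>
  <p>Hand-built touring bicycles, parts, and repair guides.</p>
</object>
```

The alternate content should summarize what the movie actually shows; stuffing it with keywords that aren’t in the movie is just spam by another name.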

Of course, as I’ve already mentioned many times, don’t SPAM. Don’t put a million keywords into your meta tags or image alt-text. Don’t have commented-out or hidden keywords, either! Search engines have gotten smart and will smite your website from the search results if they suspect you deal in any of these techniques!

Make sure your title tags don’t say “Untitled Document,” or any other default title tag content. Duh. Another no-brainer: broken image tags and broken links. Make sure there isn’t anything dumb like that, and if you link to external websites, check them every once in a while to make sure they’re still valid. Users hate getting 404 pages when they click on links on your site. As do search engine bots.

Change your content almost as often as you change your car’s tires. In other words, get rid of any references to being “Y2K Compliant!” (I can’t believe I still see that sometimes.)

For more information on SEO, check out Google’s Webmaster help center, http://www.google.com/support/webmasters/, which has tips on SEO and many other useful subjects. Or just Google “best practices SEO.”

Happy Googling! And, good luck!
