Last week, as I was writing “Beyond the Top 10 SEO Factors,” I realized it was either going to have to be a novel, or I was going to have to follow up quickly with a second post – which is what this is.
Of the suggestions below, some are based on proven facts, some on a little theory, and some are still under debate, but it’s my belief that they are all accurate, worth the effort, and solidly white hat – and you can count on this page being updated if anything changes.
21. Canonical Issues – The www vs. non-www
Google sees subdomains as separate domain names, and the www prefix is just another subdomain. Therefore, it’s important to remove any possible confusion on their part by preventing one of the two versions from displaying at all. My own preference is to always use the www.
Inside Google Webmaster Tools, you can go to your domain account, then to Tools and “Set preferred domain,” which lets Google know that you would prefer to use the www.
This step is not enough on its own, however (some people are even skeptical that it works at all), and it’s best to take care of it at the source, on your server, by simply not allowing the duplicate version to show in the first place.
On an Apache server using mod_rewrite, I have placed the following code in MY .htaccess file to eliminate the appearance of the non-www version of any page. You can do the same (but using your OWN DOMAIN NAME):
RewriteEngine On
RewriteBase /
# If the request arrived on the bare domain (no www)...
RewriteCond %{HTTP_HOST} ^pdxtc\.com [NC]
# ...send a permanent (301) redirect to the www version of the same URL
RewriteRule ^(.*)$ http://www.pdxtc.com/$1 [L,R=301]
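Once that’s in place, it’s worth sanity-checking the redirect. Here’s one quick way, using curl from a command line (the page name is just a placeholder – any URL on the bare domain should behave the same way):

curl -I http://pdxtc.com/somepage.html
# You should get back "HTTP/1.1 301 Moved Permanently" with a
# Location: header pointing at http://www.pdxtc.com/somepage.html

If the non-www version answers with a 200 instead of a 301, the rule isn’t firing.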
If you’re running your domain on a Windows server, see the next item, because just like everything else on a Windows server, it’s a far bigger pain in the neck.
22. Windows Servers – Canonical Issues & Page Versions
Besides needing to be rebooted all the time, and being a general pain in the neck to work on, Windows servers have their own unique issues when it comes to canonicalization and content duplication.
Since there’s no such thing as a .htaccess file, there is no quick and easy fix for the www vs. non-www issue, and a whole host of other problems can come up as well.
In many cases I have seen websites with multiple versions of the same page, depending on whether or not capitalization was used in the URL, or on which designer happened to create a given link on the site, based on their personal style.
Some designers will use a capital letter for the first letter in a file name, some will capitalize every word in the file name, and some will make things simple by never capitalizing filenames at all (which is my preference).
For example, these may all be meant to display the same page, but to the search engines they are completely different. They can be a source of duplicate content, and I’ve even seen them end up with different PageRank and inbound link counts –
http://www.domain.com/PageName.asp
http://www.domain.com/Pagename.asp
http://www.domain.com/pagename.asp
If you’re running your site on a Windows server, you can resolve these problems by becoming best friends with a program called ISAPI Rewrite 2.0 – URL Rewriting for Windows Server IIS.
That program will allow you to create and implement the proper rules to collapse capitalized page-name variants onto a single URL, and to remove the www vs. non-www issue.
Learning exactly how to use it is your problem though, and I find it far easier to simply avoid hosting on a Windows server altogether.
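That said, for anyone stuck on IIS, here is roughly what the www fix from item 21 looks like in the httpd.ini file that ISAPI Rewrite 2.0 reads. I’m sketching this from memory of the version 2 syntax – [I] means case-insensitive matching and [RP] means a permanent (301) redirect – so treat the program’s own documentation as the final word:

[ISAPI_Rewrite]
# Send any request arriving on the bare domain to the www version
RewriteCond Host: ^pdxtc\.com
RewriteRule (.*) http\://www\.pdxtc\.com$1 [I,RP]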
23. Bloated Code and Poor Load Times
Besides the obvious (like not putting large, high-resolution photos on your page), there are other things that can slow down the loading of a webpage, and just like most people, search engines absolutely hate slow webpages.
With all of the cool stuff we’re adding to our sites and blogs these days – multimedia, tracking software, even certain forms of advertising – the code on some websites can become so bloated that it overwhelms the actual content, not only in load times but in sheer volume too.
I’ve looked at the source code of webpages and literally had to scroll more than halfway down before reaching the first real word of actual text on the page. That’s a genuinely bad thing as far as the search engines are concerned.
I’m certainly not claiming to know specifics – like some “magic” code-to-text ratio of 17% that Google considers “optimal” – but I do know that having too much code is a serious problem.
Be sure to make use of include files to call your scripts wherever possible, and just clean up or take out all of your unnecessary code.
Scroll line by line through your templates or pages, and see what can be removed or can be called from an external file. You would really be surprised what you can live without, and how it might speed load times.
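As a trivial (made-up) illustration, instead of pasting all of your CSS and JavaScript directly into every page header like this –

<head>
<style type="text/css"> /* ...hundreds of lines of CSS... */ </style>
<script type="text/javascript"> /* ...hundreds of lines of tracking code... */ </script>
</head>

– move it into external files, so the spider reaches your actual content almost immediately:

<head>
<link rel="stylesheet" type="text/css" href="/css/style.css" />
<script type="text/javascript" src="/js/site.js"></script>
</head>

The file names here are invented, but the point stands: the bloat gets downloaded (and cached) separately, instead of padding every single page.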
A great tool for evaluating your site is a Firefox extension called YSlow, which is used in conjunction with Firebug. Its detailed analysis can be quite helpful in improving your site’s performance without having to guess at various solutions.
24. Intelligent use of rel="nofollow" – a.k.a. Sculpting
One of the primary ways Google works is by passing PageRank from one page to another through links, and it assigns relevancy to the landing page based on the words used in the anchor text, and even the words surrounding that anchor text.
The more pages on the Web that link to a specific page with a certain phrase, the more Google believes that page is “important” and should rank higher for that phrase.
This is standard operating procedure for Google, and you’ll know the day that ever changes, because the download page for Adobe Acrobat Reader will no longer rank #1 for the phrase “click here.”
Put simply, some pages were never meant to rank – your site’s privacy policy, security policy, member login pages, and possibly lots of other things that exist only for the user’s benefit, and are definitely not something you want to waste your PageRank on.
Therefore, an effective tactic to prevent wasting your “link juice” is the rel="nofollow" attribute. Google does not pass PageRank through nofollowed links, which is why people call them “link condoms.”
Proper implementation looks like this –
<a href="http://www.domain.com/" rel="nofollow">anchor text</a>
I’ve always believed you’re far better off ensuring that the only link that gets followed is the one with the appropriate anchor text. That’s why, on the homepage of my own blog, the permalinked title of each post is a normal link, but the “read more” link is nofollowed.
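In practice, the two links to the same post on my homepage look something like this (the URL and title are invented for the illustration):

<!-- The link I want followed, with the post title as the anchor text -->
<a href="http://www.searchcommander.com/some-post/">Some Post Title</a>

<!-- The redundant link, nofollowed so it doesn't compete -->
<a href="http://www.searchcommander.com/some-post/" rel="nofollow">read more</a>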
Some might say this is completely unnecessary, and in fact it was at Searchfest 2008 that I first heard “officially” that the search engines will only follow the first link they come to on a page.
I heard it from Rand Fishkin, and until that point it was always my assumption that unless it was no followed, all links passed juice.
However, even though Rand made it clear that only the first instance of any link on a page seems to be followed, I still don’t want to risk wasting my PageRank through the wrong anchor text – just in case Rand is wrong, or Google changes the rules of the game – so I still use it.
*Update – June 2009 – Matt Cutts of Google announced officially that the PageRank a nofollowed link would have received is *NOT* redistributed to the other links on the page – it simply evaporates. Bad news for the heavy link sculptors, and especially for us – I had to kill off our cool WordPress project, the SEO Automatic NoFollowizer.
25. Image Links all nofollowed
Where do your image links go? Are they represented equally by text? You’re going to be far better off using rel="nofollow" on any image links in your header, your menus, or pretty much anywhere else throughout the site, and instead ensuring that the page you’re linking to is represented elsewhere by good anchor text.
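For example (the markup here is invented to illustrate the idea), a header button might be nofollowed while a text link further down the page carries the anchor text:

<!-- Image link in the header – it has no anchor text to give, so nofollow it -->
<a href="http://www.domain.com/services/" rel="nofollow"><img src="/images/services-button.gif" alt="Services" /></a>

<!-- The same page linked again elsewhere, this time with real anchor text -->
<a href="http://www.domain.com/services/">Portland plumbing services</a>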
This isn’t “cheating” or “black hat” – it’s just good common sense. If intelligent use of the nofollow attribute is one of the few advanced SEO tactics still available, then why wouldn’t someone make use of it? Matt Cutts has already stated publicly that there’s no such thing as an “over-optimization” penalty at Google, so it just seems logical to me.
Also, if Rand Fishkin is correct (see item 24 above), then this is suddenly even more important, since you won’t get any anchor text credit at all for a page whose image link appears ahead of its text link.
26. Keep Your Content Updated
On a large website, once a page is created, it can sit for months or even years before anything changes on it at all, and I think this is a mistake.
Adding a dynamic component to your website that will add or change content on a regular basis is a good way to freshen your pages, and to appear to the search engines as if the information is more current.
The easiest way I know of to do this is through the use of RSS feeds, which will allow you to have news or strategic items of interest appear automatically, each time the source is updated.
In an ideal situation, your own original content would be used to change the information on these pages. The lazy way out, however, is to just use subject-relevant industry news feeds from other sources.
A basic example would be the top right section of this blog, where my most recent blog posts freshen the content of every single page on the entire blog.
Another example of this in action, this time delivering more targeted and page relevant content, would be on my Search marketing speaker page, where I have the RSS feed of my “Public Speaking” category feeding below the contact form.
Every time I add a new post to that particular category of my blog, the content on that speaking page changes, even though I never have to touch it manually.
Depending on how your website is designed there are multiple ways to add RSS feeds, but we created a tool about a year ago that is versatile enough to allow you to add RSS feeds to any website.
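If your pages are plain HTML on an Apache server, one low-tech approach (assuming mod_include is enabled, and assuming something – a cron job, a plugin, or a tool like ours – regenerates the snippet file from the RSS source) is a server-side include:

<!-- Anywhere in the page template -->
<div class="recent-news">
<!--#include virtual="/includes/recent-news.html" -->
</div>

Every time the included file is rewritten from the feed, every page carrying that include gets fresher content, without anyone touching it manually.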
27. Leverage your older links
I guarantee that you would be surprised at how much control you might have over some of the links you have obtained in the past, simply by contacting the webmaster or business in question, and asking them to make a change for you.
Let’s say you have a link from another local business that points to your homepage, but it just looks like this: http://www.domain.com. Since you already have a relationship with them, and they have recommended you with a link, do you really think it would be that difficult to get them to give you decent anchor text, or perhaps link to a more relevant (deeper) page on your site?
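To make it concrete, you’re asking them to turn this –

<a href="http://www.domain.com">http://www.domain.com</a>

– into something like this (the anchor text and landing page are invented for the example):

<a href="http://www.domain.com/portland-plumbing/">Portland plumbing repair</a>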
What about getting them to allow you to provide them with a unique article, giving you perhaps 2 or 3 different inbound links to deeper landing pages with the anchor text of your choice?
It’s far easier to contact the webmaster of a site that is already linking to you and get them to improve upon your existing link, than it is to try to get a new link.
28. Build links to your links
Have you ever done some link building for your own backlinks? If you have an opportunity to strengthen a page that already has an inbound link to you, then you can make the most of it by linking back to it from your domain and other sites too.
For example, if you’re a local plumber, and you’ve gained inbound links from your local Chamber of Commerce and your state Plumbers Association, then take advantage of the opportunity to strengthen those particular pages by sending them inbound links from not only your own domain but from anywhere else you might have the opportunity.
If you’re writing a profile about yourself or your company to display somewhere else on the web, be sure to link not only to your own site, but also to mention the relevant associations you belong to, and link to your own profile pages there.
Boosting the visibility (and PageRank) of any page that already links to you, will then benefit you directly through their links back to you. This is precisely why, on the left side of my blog, in most cases, those “association” links go to my individual profile pages, rather than just the homepage of whatever association I’m linking to.
29. Harvest your own Low Hanging Fruit
Many websites have desirable rankings lurking on page 2 of the search results, and may not even be aware that they exist. So few people ever click through to page 2, and many basic stats programs only show the most popular referring key phrases, so this is a huge area of missed opportunity.
Determining which phrases your website ranks for on page 2 is made easy with a free tool called SEODigger, which lets you enter a domain name and quickly see all of the phrases for which that website ranks, along with the Wordtracker/Overture data for each phrase. Typically the results are based on data 3 to 5 weeks old, but it can still be very useful for identifying that low hanging fruit.
How do you move a phrase from page 2 to page 1? Well, aside from building more links to those pages with proper anchor text, take a look at the next item for an easy bump.
30. Drink your own Link Juice
Looking back at your old pages, your old blog posts, and even blog categories or archived months, you can typically find pages with decent PageRank that you can exploit to your own benefit.
To spell it out more clearly, I’m suggesting that you go into some of your older pages, and add or edit some links with good anchor text pointing to critical areas of your own site that you wish to improve.
If you want to know exactly which pages on your website are best suited for adding an internal link to another section, simply use Google to do a site: search for that particular phrase – the order of the results tells you which of your pages Google considers strongest for it.
For example, if I do this search at Google – site:www.searchcommander.com ppc management – I can see that I have five pages that rank for PPC management (yuck).
By going back in and editing some text links into the content of the other pages, all pointing at my most desirable search result, it’s highly likely that I can improve my own ranking for that phrase. (But I won’t, because I hate managing PPC).
(It’s important to note that if you search your own domain on Google for a phrase, and get no results, then you have a bigger problem.)
Once again, please keep in mind that the items on this list are all just pieces of the pie, and no single tactic or strategy is going to make or break you.
By combining these items with what’s in my first two articles, Top 10 SEO Factors and Beyond the Top 10 SEO Factors, you should have quite an arsenal to put you ahead of the competition. At that point, creating great content and generating links can pretty much become your focus.
21, 23, 24 – these are very important ones to pay attention to…
Thanks Susan…
Oops – it looks like my #25 above might be unnecessary, assuming that Google US behaves the same way as Google UK – this test seems to indicate that image links are ignored when followed by a text link elsewhere on the page…
The problem I see with #28 is that, if you spend time and effort building a link to a partner site, then the incoming link juice gets spread among ALL the outbound links on the page… not just your own. And those other outbound links may even be your competitors!
That’s true Paul, but in addition to creating good content and getting good links, the search engines demand that the people linking to you have links too. If they don’t, then their links are worth less to you, and to whomever else is on that page…
But, would I build PageRank and links for a site that also links to all my competitors? Nope, probably not, and that’s a good observation, thanks.
Scott, I’d agree with you. Building links to pages that link to you is inevitably the same as building equity in someone else’s property.
Sometimes the effort you put into someone else’s property can help you – if I have a panoramic view of the city, but my neighbor directly in front of me has a giant bushy tree blocking part of my view, I would gladly pay to have that tree removed.