This is an internal article that our team uses to diagnose issues with client websites. I'm posting it here in case you find some value in it. You don't have to be a programmer to understand it, and making the fixes is not difficult. If you like it, please reply and let me know your thoughts. Also, if you want your site reviewed, post it to the forums and we'll reply-- the caveat is that everyone will see the feedback. Anyway, enjoy!
SEO in 12 Easy Steps
1) Check for the canonical domain issue: do the www and non-www versions of the homepage resolve to the same url? If not, see www.mattcutts.com/blog/seo-advice-url-canonicalization/ for an explanation of how duplicate content should resolve to a single page. If it's a PHP site on Apache, use mod_rewrite; if ASP on IIS, use ISAPI_Rewrite. If you're giving an SEO presentation, talk about permanent redirects (say "301" to sound fancy). We need server access to fix this.
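As a sketch of the fix on an Apache site with mod_rewrite enabled, a 301 redirect from the non-www to the www version might look like this in .htaccess (mysite.com is a placeholder; swap in the client's domain):

```apache
# .htaccess -- force the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

The [R=301] flag is what makes it a permanent redirect; without it you'd get a temporary (302) redirect, which does not consolidate the duplicate pages in the engines' eyes.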
2) Dynamic-looking urls: Closely related to the url rewrite issue above, the site should not pass parameters in the url. For example, mysite.com/?page_id=34&section=2&session_id=123abc is no good. Question marks and equals signs in the url mean that parameters are being passed. Instead, the url should look like mysite.com/large_blue_widgets-- that is more descriptive from a user standpoint, plus it helps robots index more easily. That is a static-looking url, but it could still be dynamically generated.
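One way this is commonly done, sketched here with made-up page names and parameters, is an internal mod_rewrite rule that maps the static-looking url onto the dynamic script behind the scenes:

```apache
# Visitors and robots see mysite.com/large_blue_widgets;
# the server quietly serves the dynamic page (no [R] flag,
# so no redirect is exposed)
RewriteEngine On
RewriteRule ^large_blue_widgets/?$ /index.php?page_id=34 [L]
```

Session ids should not be in the url at all-- move those to a cookie so every visit doesn't generate a "new" page for the robots.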
4) Unique page titles: Starting from the homepage, do they have page titles that are just a few words long (65 characters or less, if possible) and begin with their most important terms? The important terms are what we deem to be both relevant and high search volume. Do not start the page title with the name of the site unless it's a strong brand. Make sure each page title is reflective of what that particular page is about.
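For example, borrowing the keyword from the sitemap step below (the brand name here is invented), a title that leads with the important terms rather than the site name:

```html
<!-- important terms first, brand last, under 65 characters -->
<title>PeopleSoft Testing Software - Acme Tools</title>
```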
5) Sitemaps: There should be a link to a "sitemap" on the home page-- and that sitemap should have links whose anchor text (the clickable, linked words) uses key search terms. For example, "our products" is terrible, while "peoplesoft testing software" is relevant. You should also be able to grab mydomain.com/sitemap.xml. See http://www.sitemaps.org/ for the sitemap protocol, and use Google Webmaster Central to tell Google what pages you want crawled, plus check for errors.
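A minimal sitemap.xml in the sitemaps.org format looks like this (the urls are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mydomain.com/</loc>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.mydomain.com/peoplesoft-testing-software</loc>
  </url>
</urlset>
```

Note this XML file is for the robots; the human-readable "sitemap" page with keyword-rich anchor text is a separate thing, and a good site has both.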
6) Run SEO reports: Put in the client's url at seomoz.com/page-strength. This site gives you a couple dozen SEO metrics, which you can paste into a report and highlight in red (bad) and green (good). If you can afford the $79 per month, SEOmoz has an unbelievable tool to measure how your site is performing in the search engines-- how trustworthy it is, the value of the links into your site, and so forth.
7) Run PPC competitor reports: Go to Spyfu.com and do a few searches. Put in a few keywords that you think are important to the site, plus the site url itself. If you look up a keyword, it will come back with who is buying it on PPC, what it costs, how many clicks are available, and what terms are related. If you look up a url, it will tell you what terms (if any) they are buying on Google, what terms they rank on naturally, and who the related competitors are. Remember that this is just Google-- this company specializes in writing scrapers, robots that run millions of searches and record who comes up in the results.
8) Source code: Do a View > Page Source from your browser, starting on the home page. Do a find (Ctrl+F) for "H1" and see if any come up. If so, are they using keywords that the client wants to be found on? H1 tags carry more weight than regular text, but are not as strong as the page title. Do you see frames on the site (search for "frame")? Perhaps 1 out of 15 sites will be just one page with the rest of the content inside a frame, so search engines see only one page. At the top of the page, do you see a meta description with good keywords woven into a sentence? The meta description should read like ad copy you'd run in AdWords.
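Putting those pieces together, here is a sketch of what you'd hope to find in the source (the widget keywords and copy are invented for illustration):

```html
<html>
<head>
  <title>Large Blue Widgets - Acme Widget Co</title>
  <!-- reads like AdWords ad copy, with keywords woven into a sentence -->
  <meta name="description"
        content="Shop durable large blue widgets with free shipping and a 10-year warranty.">
</head>
<body>
  <!-- H1 carries more weight than body text, less than the title -->
  <h1>Large Blue Widgets</h1>
  <p>Regular page copy goes here...</p>
</body>
</html>
```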
9) Analytics code: While still in that view of the code (from step 8), find instances of "google", "analytics", or "urchin". You may also see other tracking software, usually towards the bottom of the body. The majority of sites will have Google Analytics installed-- for small businesses, we recommend it as a cheap, effective solution. Large sites may use Omniture SiteCatalyst, ClickTracks, or WebSideStory-- easy enough to spot.
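For reference, the classic Google Analytics (urchin.js) snippet you're scanning for looks roughly like this; the UA- account number is a placeholder:

```html
<!-- Google Analytics tracking snippet, typically just before </body> -->
<script src="http://www.google-analytics.com/urchin.js"
        type="text/javascript"></script>
<script type="text/javascript">
  _uacct = "UA-XXXXXXX-1";  // placeholder account id
  urchinTracker();
</script>
```

If the find for "urchin" or "analytics" turns up nothing like this, the site has no visibility into its traffic, which is itself a finding for the report.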
10) Validation: Go to validator.w3.org and put in the url. Like the other tools, it will usually return a few dozen errors with explanations. Most often you'll see missing attributes, which can mostly be ignored.
11) seoquake.com: Another great Firefox plug-in, showing Google PR, pages indexed by Google, pages indexed by Yahoo, backlinks (how many sites link to you), age of site, and Alexa rank (popularity). As you browse sites, it's handy to see these stats. Glaring problems: fewer than a few dozen pages in the Google index, fewer than a few dozen backlinks, a Google PR of n/a on many pages (especially bad if true on the homepage), or an Alexa rank greater than 500,000 (lower is better), which means almost no traffic.
12) Run an SEOmoz report: Go to seomoz.org/trifecta via our login and put in the prospect's url. It will bring back ranking factors and suggestions-- it is perhaps the easiest and most powerful of the 12 steps here. The overall percentage score means little.
Now take action to make these fixes! BlitzLocal.com or your local web expert can help you. Some of them you can even do yourself-- don't be afraid to try.