Step-A : Visit the home page, www.domain.com.
- Does it redirect to some other URL? If so, that’s a red flag; the home page should resolve at the root URL itself.
- Review the Page Title. Does it use relevant, primary keywords? Is it formatted correctly?
- Review site navigation:
- Page URLs: look at URL structure, path names, and file names. How long are the URLs? How far from the root are they? Are words separated by dashes or underscores? (Dashes are generally the safer choice.)
- Are keywords used appropriately in text links or image alt tags?
- Review home page content:
- Adequate and appropriate amount of text?
- Appropriate keyword usage?
- Is there a sitemap?
- Do a select-all (“command-A” / Ctrl-A) on the page to reveal any hidden text, i.e., text styled to match the background.
- Check PageRank via SearchStatus plugin for Firefox
- View source code:
- Check meta description (length, keyword usage, relevance).
- Check meta keywords (relevance, stuffing).
- Sometimes I cut and paste the code into Dreamweaver to get a better look at the code-to-page relationship.
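If you audit a lot of pages, the “view source” and URL-structure checks above can be scripted. Here’s a minimal sketch using only Python’s standard library; the function names and the 160-character description guideline are illustrative choices of mine, not part of the original checklist:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class HeadAudit(HTMLParser):
    """Collects the <title> text and <meta name=...> tags from page source."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_head(html):
    """Return the items checked in the 'view source' step above."""
    parser = HeadAudit()
    parser.feed(html)
    description = parser.meta.get("description", "")
    return {
        "title": parser.title.strip(),
        "description": description,
        "description_too_long": len(description) > 160,  # rough guideline only
        "keywords": parser.meta.get("keywords", ""),
    }

def url_report(url):
    """Return the URL-structure items checked in the navigation step above."""
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    return {
        "length": len(url),
        "depth_from_root": len(segments),
        "uses_underscores": "_" in path,  # dashes are the safer word separator
    }
```

Run `audit_head()` on each page’s source and `url_report()` on each URL you visit, and the manual checklist becomes a table you can scan.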
Step-B : Analyze robots.txt file. See what’s being blocked and what’s not. Make sure it’s written correctly.
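The robots.txt review can be spot-checked programmatically with Python’s standard-library `urllib.robotparser`. A sketch, using a made-up robots file and paths:

```python
import urllib.robotparser

def check_robots(robots_txt, paths, agent="*"):
    """Return {path: crawlable?} for a robots.txt body and a list of URL paths."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {path: rp.can_fetch(agent, path) for path in paths}
```

Feed it the live robots.txt body plus the URLs of the important category and product pages; anything unexpectedly blocked shows up immediately.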
Step-C : Check for www and non-www domains — i.e., canonicalization issues. Only one should resolve; the other should redirect.
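One way to express this rule as a check: given the HTTP status and Location header for each root URL (fetched with any HTTP client; the dict format here is a hypothetical convention of mine), exactly one host should answer 200 and the other should 301-redirect to it. A sketch:

```python
from urllib.parse import urlparse

def canonicalization_ok(responses):
    """responses maps each host to (status_code, location_header) for its
    root URL. Exactly one host should resolve with 200; every other host
    should 301-redirect to that canonical host."""
    resolved = [host for host, (status, _) in responses.items() if status == 200]
    if len(resolved) != 1:
        return False  # either both resolve (duplicate content) or neither does
    canonical = resolved[0]
    for host, (status, location) in responses.items():
        if host == canonical:
            continue
        if status != 301 or urlparse(location or "").netloc != canonical:
            return False  # wrong redirect type or wrong target
    return True
```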
Step-D : Look at the sitemap (if one exists).
- Check keyword usage in anchor text.
- How many links?
- Are all important (category, sub-category, etc.) pages listed?
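If the site also publishes an XML sitemap (sitemap.xml), the link count and page-coverage questions above can be answered with Python’s standard-library XML parser. A sketch, assuming the standard sitemaps.org namespace:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in an XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def missing_pages(xml_text, important_urls):
    """Which important (category, sub-category, etc.) pages are absent?"""
    listed = set(sitemap_urls(xml_text))
    return [url for url in important_urls if url not in listed]
```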
Step-E : Visit two category/1st-level pages.
Repeat the Step-A checks (title, navigation, content, sitemap, source code) – this will be quicker since many elements (header, footer, menus) will be the same. In particular, look for unique page text, unique meta tags, and correct use of H1s and H2s to structure content.
Check for appropriate PageRank flow. Also look at how these pages link back to the home page. Is index.html or default.php appended to the link? It shouldn’t be; linking to the bare root URL keeps link equity from being split across duplicate home page URLs.
Step-F : Visit two product/2nd-level pages.
Same steps as E.
Also, if the site sells common products, find two or three other sites selling the exact same items and compare product pages. Are all of the sites using the same product descriptions? Unique content is best.
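A quick way to quantify “same product descriptions” is a rough word-overlap score. This Jaccard similarity over word sets is a crude illustration, not a substitute for actually reading the competing pages:

```python
def jaccard_similarity(text_a, text_b):
    """Word-set overlap: 0.0 = no shared words, 1.0 = identical vocabulary.
    Scores near 1.0 suggest boilerplate manufacturer copy."""
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)
```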
Step-G : Do a site:domain.com search in all three main engines (Google, Yahoo, and MSN).
Compare the number of pages indexed between the three. Is the count unusually high or low based on what you saw in the sitemap and site navigation? This may help identify crawlability issues. Is one engine showing substantially more or fewer pages than the others? Double-check the robots.txt file if needed.
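The indexed-count comparison reduces to a simple ratio check. The 0.5/2.0 thresholds below are arbitrary illustration values I chose, not established guidelines:

```python
def index_count_flags(indexed, sitemap_count, low=0.5, high=2.0):
    """Flag engines whose site: count is far from the sitemap's page count.
    indexed: {engine_name: pages_reported}. Thresholds are illustrative."""
    flags = {}
    for engine, count in indexed.items():
        ratio = count / sitemap_count
        if ratio < low:
            flags[engine] = "possibly under-indexed (crawlability issue?)"
        elif ratio > high:
            flags[engine] = "possibly over-indexed (duplicate URLs?)"
    return flags
```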
Step-H : Do site:domain.com *** -jdkhfdj search in Google to see supplemental pages.
All sites will have some pages in the supplemental index. Compare this number with overall number of pages indexed. A very high percentage of pages in the supplemental index = not good.
(Note: The above is no longer a way to view supplemental results in Google, and Google has said it no longer distinguishes between a main set of results and a supplemental set.)
Step-I : Use Aaron’s SEO for Firefox extension to look at link counts in Yahoo and MSN. If not in a rush, do the actual link count searches manually on Yahoo Site Explorer and MSN to confirm.
That’s what I do when making a quick SEO site analysis. Important: this is for identifying problems, not fixing them, and it doesn’t replace a real and complete SEO analysis. (This approach has several shortcomings. Here’s one: Steps E and F assume that all category pages across the site are alike, and that all product pages are alike. That’s not always the case, so you may miss problems/issues that a real, deeper analysis would reveal.)