In search engine optimization, auditing a website is a critical first step to understanding where the site stands today and what improvements will move it forward.
In this post I’m going to walk you through many of the most critical elements of a basic audit. Note that there is much more that you can do, so don’t treat these 15 items as a hard limit on how far you choose to go with your audits!
When we start an audit of a client’s website, I’m fond of telling them that I hope their site is in horrible shape. It may be counterintuitive, but the worse shape the site is currently in, the better off they are.
After all, it means that the audit will offer more upside to their business. At Stone Temple, we’ve done audits that have led to more than doubling the traffic of a client’s site.
An SEO audit can happen at any time in the lifecycle of a website. Many choose to do one during critical phases, like prior to a new website launch or when they’re planning to redesign or migrate an existing website.
However, audits are an often-overlooked piece of a website’s strategy, and many site owners don’t realize just how much the technical back end of a website impacts their SEO efforts moving forward.
What Are the Fundamental Components of an SEO Audit?
In a nutshell, here are the basic elements of any SEO audit:
- Information Architecture
- Basic Health Checks
- Keyword Health Checks
- Content Review
- URL Names
- URL Redirects
- Meta Tags Review
- Sitemaps and Robots.txt
- Image Alt Attributes
- Mobile Friendliness
- Site Speed
- Links
- Subdomains
- Local and International SEO
- Code Quality
The SEO Audit – in Detail
Now, let’s look at the crucial elements of auditing a website from an SEO perspective in a lot more detail …
1. Information Architecture
Part of having an accessible website for search engines and users is the information architecture on a site—how the content and “files” are organized. This helps search engines make connections between concepts and helps users find what they are looking for with ease.
To think about how to do this well, it’s helpful to compare it to how you deal with paper files in your office:
A well-organized site hierarchy also helps the search engines better understand the semantic relationships between the sections of the site. This gets reinforced by other key site elements like XML Sitemaps, HTML site maps and breadcrumbs, all of which can help neatly tie the overall site structure together.
2. Basic Health Checks
Basic health checks can provide quick red flags when a problem emerges, so it’s good to do these on a regular basis (even more often than you do a full audit). Here are four steps you can take to get a diagnosis of how a website is doing in the search engine results:
- Ensure Google Search Console and Bing Webmaster Tools accounts have been verified for the domain (and any subdomains, for mobile or other content areas). Google and Bing also offer site owner validation that allows you to see how the search engines view a site. Then, check these on a regular basis to see if you’ve received any messages from the search engine. If the site has been hit by a penalty from Google, you’ll see a message, and you’ll want to get to that as soon as possible. They’ll also let you know if the site has been hacked.
- Find out how many of a website’s pages appear to be in the search index, which you can check in Google Search Console. Has this number changed in an unexpected way since you last saw it? Sudden changes could indicate a problem. Also, does it seem to match up approximately with the number of pages you think exist?
I wouldn’t worry about it being 20 percent smaller or larger than you think, but if it’s double, triple or more, or only about 20 percent of the site, you probably want to understand why.
- Go into Google Search Console to make sure the cached versions of a website’s pages look the same as the live versions. Below you can see an example of this using a page on the Stone Temple website.
- Test searches of the website’s branded terms to make sure the site is ranking for them. If not, it could indicate a penalty. Check the Google Search Console/Bing Webmaster Tools accounts to see if there are any identifiable penalties.
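The indexed-page sanity check above can be sketched as a small helper. This is just an illustration of the rule of thumb from this section (within roughly 20 percent is fine; double or more, or only about 20 percent of the site, deserves investigation); the function name and thresholds are mine, not anything Google publishes.

```python
def index_count_flag(indexed: int, expected: int) -> str:
    """Classify an indexed-page count against the number of pages you
    believe the site has. Thresholds encode the rough rule of thumb:
    within ~20% is fine; 2x+ or <=20% of the site means investigate."""
    if expected <= 0:
        raise ValueError("expected page count must be positive")
    ratio = indexed / expected
    if ratio >= 2.0 or ratio <= 0.2:
        return "investigate"
    if 0.8 <= ratio <= 1.2:
        return "ok"
    return "watch"
```

For example, 1,150 pages indexed against an expected 1,000 is fine, while 3,000 indexed against 1,000 is worth digging into.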
3. Keyword Health Checks
You’ll want to perform an analysis of the keywords you’re targeting on the site, which many of the available SEO tools can help with. One thing to look for in general is whether more than one page is targeting, or showing up in the search results for, the same keyword (aka “keyword cannibalization”).
You can also use Search Console to see which keywords are driving traffic to the site. If critical keywords that used to receive traffic are no longer doing so (the rankings dropped), that could be a sign of a problem.
On the positive side of the ledger, look for “striking distance” keywords: those that rank in positions five to 20 or so. These might be keywords where some level of optimization could move them up in the rankings. If you can move from position five to three, or 15 to eight, on a major keyword, that could result in valuable extra traffic and provide reasonably high ROI for the effort involved.
4. Content Review
Here, we’re looking for a couple things:
- Content depth, quality and optimization: Do the pages have enough quality information to satisfy a searcher? You want to make sure the number of pages with little or “thin” content is small compared to those with substantial content. There are many ways to generate thin content. One example is a site that has image galleries with separate URLs for each image. Another is a site with city pages for hundreds, or thousands, of locations where the business doesn’t operate, and where there is no real local aspect to the products or services offered on the site. Google has no interest in indexing all those versions, so you shouldn’t be asking it to do so! This is often one of the most underappreciated aspects of SEO. At Stone Temple, we’ve taken existing content on pages and rewritten it, and seen substantial traffic lifts. In more than one case, we’ve done this on more than 100 pages of a site and seen traffic gains of more than 150 percent!
- Duplicate content: A lot of websites have duplicate content without even realizing it. One of the first things to check is that the “www” version of the site and the “non-www” version do not exist at the same time (do they both resolve?). This can also happen with “http” and “https” versions of a site. Pick one version and 301 redirect the other to it. You can also set the preferred domain in Google Search Console (but still do the redirects even if you do this).
- Ad Density: Review the pages of your site to assess if you’re overdoing it with your advertising efforts. Google doesn’t like sites that have too many ads above the fold. A best practice to keep in mind is that the user should be able to get a substantial amount of the content they were looking for above the fold.
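The www/non-www and http/https consolidation described above can be sketched as a canonicalization function: every variant of a URL should map (via 301 redirect) onto one chosen form. This is a sketch with a placeholder host name, not a drop-in implementation.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, canonical_host: str = "www.example.com") -> str:
    """Map www/non-www and http/https variants of a URL onto one chosen
    canonical form (https plus the host you picked); every other variant
    should 301-redirect to it. The host name here is a placeholder."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # Treat the bare and www-prefixed hosts as the same site.
    bare = canonical_host[4:] if canonical_host.startswith("www.") else canonical_host
    if host in (canonical_host, bare):
        host = canonical_host
    return urlunsplit(("https", host, parts.path or "/", parts.query, ""))
```

During an audit, run each crawled URL through a mapping like this and verify the server actually redirects the non-canonical variants.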
5. URL Names
Website URLs should be “clean”: short, descriptive of the main idea of the page, and indicative of where a person is within the website. So, make sure this is part of the SEO audit. Well-constructed URLs help both website users and search engines orient themselves.
For example: http://www.site.com/outerwear/mens/hats
It’s a good idea to include the main keyword for the web page in the URL, but never try to keyword-stuff (for example, http://www.site.com/outerwear/mens/hat-hats-hats-for-men).
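As an illustration of what “clean” means in practice, here is a minimal slug generator: lowercase, hyphen-separated words, no punctuation, one occurrence of the keyword. The function is a sketch, not a standard library utility.

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, descriptive URL slug:
    lowercase, words separated by single hyphens, no punctuation."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")
```

So a category page titled “Men's Hats & Caps” would get a slug like men-s-hats-caps rather than a keyword-stuffed string.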
Another consideration is URLs that have tracking parameters on them. Please don’t ever do this on a website! There are many ways to implement tracking on a site, and using parameters in the URLs is the worst of them.
If a website is doing this today, you’ll want to go through a project to remove the tracking parameters from the URLs, and switch to some other method for tracking.
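A cleanup project like that usually starts by computing the parameter-free version of each URL. Here is one way to sketch that step; the lists of tracking parameters (utm_ prefixes, gclid, fbclid) are common examples, not an exhaustive set.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PREFIXES = ("utm_",)          # common analytics parameters; adjust to taste
TRACKING_NAMES = {"gclid", "fbclid"}   # example click-ID parameters

def strip_tracking(url: str) -> str:
    """Remove tracking parameters from a URL while keeping any
    functional query parameters the page actually needs."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES) and k not in TRACKING_NAMES]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Running this over a crawl export shows how many distinct "pages" collapse into one once tracking noise is removed.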
On the other hand, perhaps the URLs are only moderately suboptimal, such as this one:
In cases like this, I don’t think that changing the URLs is that urgent. I’d wait until you’re in the midst of another larger site project at the same time (like a redesign).
6. URL Redirects
It’s a common best practice to ensure that a web page that no longer needs to exist on a website be redirected to the next most relevant live web page using a 301 redirect. There are other redirect types that exist as well, so be sure to understand the various types and how they function before using any of them.
Google recommends that you use 301 redirects because they indicate a page has permanently moved from one location to another, and other redirects, such as a 302, are used to signal that the page relocation is only temporary. If you use the wrong type of redirect, Google may keep the wrong page in their index.
It used to be the case that much less than 100 percent of the PageRank transferred to the new page through a redirect. In 2016, however, Google came out with a statement that there would be no PageRank value lost using any of the 3XX redirects.
To help check redirects, you can use tools like Redirect Check or RedirectChecker.org.
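Conceptually, what such a checker does is walk the redirect chain and flag anything that is not a permanent redirect, or a chain that runs too long. Here is a rough sketch; in a real audit the redirect map would come from HEAD requests against the live site, but here it is supplied directly so the logic is self-contained.

```python
def resolve_chain(start, redirects, limit=5):
    """Follow a redirect map ({url: (status, target)}) to the final URL,
    warning on temporary redirects and flagging overly long chains.
    301/308 are permanent; 302/303/307 are temporary."""
    chain, url = [start], start
    while url in redirects:
        status, target = redirects[url]
        if status not in (301, 308):
            print(f"warning: non-permanent redirect {status} at {url}")
        url = target
        chain.append(url)
        if len(chain) > limit:
            raise RuntimeError("redirect chain too long (possible loop)")
    return chain
```

Long chains waste crawl budget and slow users down, so collapsing them to a single 301 hop is usually worth the effort.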
7. Meta Tags Review
Each and every web page on a site should have a unique title tag and meta description tag: the tags that make up the meta information that helps the search engines understand what the page is about.
This gives the website the ability to suggest to the search engines what text to use as the description of its pages in the search results (versus search engines like Google generating an “autosnippet,” which may not be as optimal).
It may also help keep some pages of the website from being filtered out of the search results if search engines use the meta information to help detect duplicate content.
You’ll also want to take this opportunity to check for a robots meta tag on the pages of the site. If you find one, there could be trouble. For example, an unintentional “noindex” or “nofollow” value could adversely affect your SEO efforts.
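At crawl scale, these checks amount to extracting the title, meta description, and robots values from each page and flagging anything missing, duplicated, or set to noindex. A minimal sketch using Python’s standard HTML parser (the class name and sample page are mine):

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect <title>, meta description, and robots meta values from a
    page so missing/duplicate tags or a stray noindex can be flagged."""
    def __init__(self):
        super().__init__()
        self.title, self._in_title = "", False
        self.description, self.robots = None, None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.description = a.get("content")
        elif tag == "meta" and a.get("name") == "robots":
            self.robots = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = '<html><head><title>Hats</title><meta name="robots" content="noindex"></head></html>'
audit = MetaAudit()
audit.feed(page)
```

In this sample, audit.robots comes back as "noindex" with no meta description at all: exactly the kind of page an audit should surface.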
8. Sitemaps and robots.txt Verification
It’s important to check the XML Sitemap and robots.txt files to make sure they are in good order. Is the XML Sitemap up to date? Is the robots.txt file blocking the crawling of sections of a site that you don’t want it to? You can use a feature in the Google Search Console to test the robots.txt file. You can also test and add a Sitemap file there as well.
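You can also test robots.txt rules offline with Python’s standard library, which is handy for checking a proposed file before it goes live. The rules below are an example file, not any particular site’s.

```python
from urllib import robotparser

# An example robots.txt you might encounter during an audit.
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# /private/ pages are blocked for all crawlers; everything else is crawlable.
blocked = rp.can_fetch("*", "https://example.com/private/report.html")  # False
allowed = rp.can_fetch("*", "https://example.com/hats/")                # True
```

The key question in an audit is whether any Disallow line is blocking a section you actually want indexed.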
9. Image Alt Attributes
Alt attributes for the images on a website help describe what the image is about. This is helpful for two reasons:
- Search engines cannot “see” image files the way a human would, so they need extra data to understand the content of the image.
- Web users with disabilities, such as those who are blind, often use screen-reading software that describes the elements on a web page, images among them, and these programs make use of the alt attributes.
It doesn’t hurt to use keyword-rich descriptions in the attributes and file names when it’s relevant to the actual image, but you should never keyword-stuff.
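Finding images with missing or empty alt attributes is easy to automate during a crawl. A small sketch with Python’s standard HTML parser (class name and sample markup are illustrative):

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Record the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing_alt.append(a.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="hat.jpg" alt="red wool hat"><img src="logo.png">')
```

Here the logo image gets flagged because it ships with no alt text at all.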
10. Mobile Friendliness
The number of people searching and purchasing on mobile devices grows each year. At Stone Temple, we have clients who get more than 70 percent of their traffic from mobile devices. Google has seen this coming for a long time, and has been pushing websites to become mobile friendly for years.
Because the mobile device is such a key player in search today, at the time of writing, Google has declared it will have a mobile-first index. What that means is that it will rank search results based on the mobile version of a website first, even for desktop users.
One key aspect of a mobile-first strategy from Google is that its primary crawl will be of the mobile version of a website, and that means Google will be using the mobile crawl to discover pages on a site.
Most companies have built their desktop site to aid Google in discovering content, and their mobile site purely from a UX perspective. As a result, the crawl of a mobile site might be quite poor from a content discovery perspective.
Make sure to include a crawl of the mobile site as a key part of any audit of a site. Then compare the mobile crawl results with the crawl of the desktop site.
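The comparison step is just a set difference over the two crawls’ URL lists: anything the desktop crawl can reach but the mobile crawl cannot is at risk under mobile-first indexing. A sketch (the crawl data would come from whatever crawler you use):

```python
def mobile_gap(desktop_urls: set, mobile_urls: set) -> set:
    """Pages discovered in the desktop crawl but not reachable in the
    mobile crawl -- candidates to lose visibility under mobile-first
    indexing."""
    return desktop_urls - mobile_urls
```

A non-empty result usually points at navigation or internal links that exist only in the desktop templates.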
If a website doesn’t have a mobile version, Google has said it will still crawl and rank the desktop version; however, not having mobile-friendly content means a website may not rank as well in the search results.
While there are a few different technical approaches to creating a mobile-friendly website, Google has recommended that websites use responsive design. There’s plenty of documentation on how to do that coming directly from Google, as well as tools that can help gauge a website’s mobile experience, like Google’s Mobile-Friendly Test.
It’s worth mentioning Google’s accelerated mobile pages (AMP) here as well. This effort by Google is to give website publishers the ability to make their web content even faster to users.
While Google has said, at the time of writing, that AMP pages won’t receive a ranking boost, page speed is a signal. The complexity of the technical implementation of AMP is one of the reasons some may choose not to explore it.
Another way to create mobile experiences is via progressive web apps, which is an up-and-coming way to provide mobile app-like experiences on the web via the browser (without having to download an app).
The main benefit is the ability to access specific parts of a website in a way similar to what traditional apps can.
11. Site Speed
Site speed is one of the signals in Google’s ranking algorithm. Slow load times can cause the crawling and indexing of a site to be slower, and can increase bounce rates on a website.
Historically, this has only been a ranking factor when site speeds were very slow, but Google has been making noise that it will become more important over time. Google’s John Mueller has also indicated that a site that is too slow, even if nominally mobile-friendly, may be deemed non-mobile-friendly. Currently, however, mobile page speed is not treated by Google as a ranking factor.
In fact, site speed has become such an important element of the overall user experience, especially in mobile, that Google has said it wants above-the-fold content for mobile users to render in one second or less.
To help people get more visibility into site speed, Google offers tools such as the PageSpeed Insights tool and the site speed reports found in Google Analytics.
12. Links
Here, we’re looking at links in a couple different ways: internal links (those on the website itself) and external links (other sites linking to the website).
First, look for pages that have excessive links. You may want to minimize those. Second, make sure the web pages use anchor text intelligently without abusing it, or it could look spammy to search engines. For example, if you have a link to the home page in the global navigation, call it “Home” instead of picking your juiciest keyword.
Internal links are what define the overall hierarchy of a site. The site might, for example, look like this:
The site above obviously has a well-defined structure, and that’s good. But in practice, sites rarely look like this, and some level of deviation from this is perfectly fine.
A home page may link directly to some of the company’s top products, as shown in Level 4 of the image, and that’s fine. However, it’s a problem if the site has a highly convoluted structure that has many pages that can only be reached after a large number of clicks if you try to navigate to them from the home page, or if each page is linking to too many other pages.
Look for these types of issues and try to resolve them to create something with a cleaner hierarchy.
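One concrete way to surface convoluted structure is to compute each page’s click depth from the home page with a breadth-first search over the internal link graph. A sketch (the site map below is a toy example):

```python
from collections import deque

def click_depth(links: dict, home: str = "/") -> dict:
    """Breadth-first search over the internal link graph, returning how
    many clicks each page is from the home page. Pages at depth > 3-4,
    or missing from the result entirely (unreachable), are hierarchy
    problems worth fixing."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

site = {"/": ["/outerwear", "/about"], "/outerwear": ["/outerwear/mens/hats"]}
```

On this toy site, the hats page sits two clicks from home, which is healthy; a crawl of a real site will often turn up pages six or more clicks deep.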
For external links (also known as inbound links or backlinks), you’ll want to perform an analysis to ensure there aren’t any problems there, like a history of purchased links, irrelevant links and links that look spammy.
You can use tools like Open Site Explorer, Majestic SEO, Ahrefs Site Explorer, SEMRush, and the Google Search Console/Bing Webmaster Tools accounts to collect data about links.
Personally, I like to use all of these sources, collect all of their output data, dedupe it and build one master list. None of the tools provides a complete list, so using them all will get you the best possible picture.
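The merge-and-dedupe step can be sketched as follows: normalize each exported URL (host casing, trailing slashes, scheme differences) and keep the first occurrence. This is a simplification; real exports also carry anchor text and source metadata you’d want to keep.

```python
from urllib.parse import urlsplit

def merge_backlinks(*tool_exports):
    """Combine backlink URL lists exported from several tools into one
    master list, deduping on a normalized (host, path, query) key so
    http/https and trailing-slash variants collapse together."""
    seen, master = set(), []
    for export in tool_exports:
        for url in export:
            parts = urlsplit(url.strip())
            key = (parts.netloc.lower(), parts.path.rstrip("/") or "/", parts.query)
            if key not in seen:
                seen.add(key)
                master.append(url.strip())
    return master
```

Without normalization, the same link reported by two tools with slightly different formatting would inflate the master list.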
Look for patterns in the anchor text, like if too many of the links have a critical keyword for the site in them. Unless the critical keyword happens to also be the name of the company, this is a sure sign of trouble.
Also check the share of links pointing to pages other than the home page; an unnaturally high proportion of such deep links is another sign of trouble in the backlink profile. Lastly, check how the backlink profile for the site compares to the backlink profiles of its major competitors.
Make sure that there are enough external links to the site, and that there are enough high-quality links in the mix.
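The anchor-text pattern check above reduces to measuring what fraction of backlinks use an exact keyword anchor. A sketch (what counts as “too many” is a judgment call, not a Google-published number):

```python
from collections import Counter

def anchor_share(anchors, keyword: str) -> float:
    """Fraction of backlinks whose anchor text exactly matches a critical
    keyword, after trimming and case-folding. A high share, outside of
    brand terms, is the over-optimization pattern to worry about."""
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[keyword.lower()] / len(anchors)
```

If half of a site’s backlinks say “cheap hats” and the company isn’t named Cheap Hats, that profile deserves a closer look.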
13. Subdomains
Historically, it’s been believed that subdomains do not benefit from the primary domain’s full trust and link authority. This was largely due to the fact that a subdomain could be under the control of a different party, and therefore in the search engine’s eyes, it needed to be separately evaluated.
For an example of a domain that allows third parties to operate subdomains of their site, consider Blogger.com, which allows people to set up their own blogs and operate them as subdomains of Blogspot.com.
For the most part, this is not really true today, and search engines are extremely good at recognizing whether or not the subdomain really is a part of the main domain, or if it’s independently operated.
I still recommend using a subfolder over a subdomain as the default approach to adding new categories of content to a site. However, if you already have content on a subdomain, I would not move it to a subfolder unless you have clear evidence of a problem, as there is a cost to site moves, and the upside of the move is not enough to pay that cost.
For purposes of an audit, make sure you include subdomains. As part of this, make sure your crawl covers them, and check analytics data for any clear evidence of a problem, such as a subdomain getting very little traffic, or recent material traffic drops.
For more on subdomains and their effect on SEO, see Everything You Need to Know About Subfolders, Subdomains, and Microsites for SEO.
14. Local and International SEO
For websites that aim to rank locally (for example, a chiropractor established in San Francisco who wants to be found for “San Francisco chiropractor”), you’ll want to consider things like making sure the business address is on every page of the site, and claiming and verifying the Google Places listings.
Beyond local businesses, websites that target specific countries or multiple countries with multiple languages have a whole host of best practice considerations to contend with.
These include things like understanding how to use hreflang tags properly, and attracting attention (such as links) from within each country where products and services are sold by the business.
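To make the hreflang mechanics concrete, each localized page carries a reciprocal set of link tags pointing at all of its language/region alternates. Here is a small generator for those tags; the locale codes and URLs are illustrative.

```python
def hreflang_tags(alternates: dict) -> str:
    """Render the reciprocal hreflang link tags for one page from a
    {language-region: URL} map. Every page listed must carry the same
    full set of tags, ideally including an x-default entry."""
    lines = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
             for lang, url in sorted(alternates.items())]
    return "\n".join(lines)

print(hreflang_tags({
    "en-us": "https://site.com/en/",
    "fr-fr": "https://site.com/fr/",
    "x-default": "https://site.com/",
}))
```

The most common hreflang audit finding is missing reciprocity: page A lists page B as an alternate, but page B doesn’t list page A back.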
15. Code Quality
A website with clean code that allows the search engines to crawl it with ease enhances the experience for the crawlers. W3C validation is the “gold standard” for performing a checkup on the website’s code, but is not really required from an SEO perspective (if search engines punished sites for poor coding practices, there might not be much left to show in the search results). Nonetheless, clean coding improves maintainability of a site, and reduces the chances of errors (including SEO errors) creeping into the site.
An SEO audit can occur at any stage of the lifecycle of a website, and can even be performed on a periodic basis, like quarterly or annually, to ensure everything is on the up and up.
While there are different approaches to performing an SEO audit, the steps listed in this article serve as a solid foundation to getting to know the site better and how it can improve, so your SEO efforts get the most ROI.
This article is adapted from the book The Art of SEO: Mastering Search Engine Optimization (3rd Edition), by lead co-author Eric Enge.