Components of an Informative SEO Audit


In search engine optimization, auditing a website is a critical first step to understanding where the site stands today and which improvements will matter most.

In this post I’m going to walk you through many of the most critical elements of a basic audit. Note that there is much more that you can do, so don’t treat these 15 items as a hard limit on how far you choose to go with your audits!

When we start an audit of a client's website, I'm fond of telling them that I hope their site is in horrible shape. It may be counterintuitive, but the worse shape the site is currently in, the better off they are.

After all, it means that the audit will offer more upside to their business. At Stone Temple, we’ve done audits that have led to more than doubling the traffic of a client’s site.

Implementing the recommendations of a good SEO audit is often enough to significantly raise traffic.

An SEO audit can happen at any time in the lifecycle of a website. Many choose to do one during critical phases, such as prior to a new website launch, or when planning to redesign or migrate an existing website.

However, audits are an often-overlooked piece of a website's strategy, and many site owners don't realize just how much the technical back end of a website impacts their SEO efforts going forward.

What Are the Fundamental Components of an SEO Audit?

In a nutshell, here are the basic elements of any SEO audit:

  1. Discoverability
  2. Basic Health Checks
  3. Keyword Health Checks
  4. Content Review
  5. URL Names
  6. URL Redirects
  7. Meta Tags Review
  8. Sitemaps and Robots.txt
  9. Image Alt Attributes
  10. Mobile Friendliness
  11. Site Speed
  12. Links
  13. Subdomains
  14. Geolocation
  15. Code Quality

The SEO Audit – in Detail

Now, let’s look at the crucial elements of auditing a website from an SEO perspective in a lot more detail …

1. Discoverability

You want to make sure the site is easily accessible to search engine crawlers. This means that the site’s content is available in HTML form, or in JavaScript that is relatively easy to interpret. Adobe Flash files, for example, are difficult for Google to extract information from, though Google has said that it can extract some information from them.

Part of having an accessible website for search engines and users is the information architecture on a site—how the content and “files” are organized. This helps search engines make connections between concepts and helps users find what they are looking for with ease.

To think about how to do this well, it’s helpful to compare it to how you deal with paper files in your office:

Website architecture is like a good office filing system

A well-organized site hierarchy also helps the search engines better understand the semantic relationships between the sections of the site. This gets reinforced by other key site elements like XML Sitemaps, HTML site maps and breadcrumbs, all of which can help neatly tie the overall site structure together.


2. Basic Health Checks

Basic health checks can provide quick red flags when a problem emerges, so it’s good to do these on a regular basis (even more often than you do a full audit). Here are four steps you can take to get a diagnosis of how a website is doing in the search engine results:

  1. Ensure Google Search Console and Bing Webmaster Tools accounts have been verified for the domain (and any subdomains used for mobile or other content areas). Site owner verification also lets you see how the search engines view the site. Then, check these accounts on a regular basis for messages from the search engines. If the site has been hit by a penalty from Google, you’ll see a message, and you’ll want to address it as soon as possible. They’ll also let you know if the site has been hacked.
  2. Find out how many of a website’s pages appear to be in the search index. You can check this in Google Search Console. Has this number changed in an unexpected way since you last saw it? Sudden changes could indicate a problem. Also, does it seem to match up approximately with the number of pages you think exist?

    I wouldn’t worry about it being 20 percent smaller or larger than you think, but if it’s double, triple or more, or only about 20 percent of the site, you probably want to understand why.

  3. Go into Google Search Console (for example, using the fetch and render feature) to make sure the versions of the website’s pages that Google sees look the same as the live versions.
  4. Test searches of the website’s branded terms to make sure the site is ranking for them. If not, it could indicate a penalty. Check the Google Search Console/Bing Webmaster Tools accounts to see if there are any identifiable penalties.


3. Keyword Health Checks

You’ll want to perform an analysis of the keywords you’re targeting on the site. This can be accomplished with many of the SEO tools available. One thing to look for is whether more than one page is targeting, or showing up in the search results for, the same keyword (aka “keyword cannibalization”).

You can also use Search Console to see which keywords are driving traffic to the site. If critical keywords that used to receive traffic are no longer working (i.e., their rankings dropped), that could be a sign of a problem.

On the positive side of the ledger, look for “striking distance” keywords: those that rank in positions five to 20 or so. These might be keywords where some level of optimization could move them up in the rankings. If you can move from position five to three, or 15 to eight, on a major keyword, that could result in valuable extra traffic and provide a reasonably high ROI for the effort involved.
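To make the striking-distance idea concrete, here is a minimal Python sketch that filters a keyword-ranking export for terms in positions five to 20. The keyword-to-position mapping is invented; in practice it would come from whatever rank-tracking tool you use.

```python
# Sketch: filter a keyword-ranking export for "striking distance" terms.
# The keywords and positions below are illustrative; adapt the data
# loading to your rank tracker's actual export format.

def striking_distance(rankings, low=5, high=20):
    """Return keywords ranking between positions `low` and `high`,
    ordered best-ranking first."""
    return sorted(
        (kw for kw, pos in rankings.items() if low <= pos <= high),
        key=lambda kw: rankings[kw],
    )

rankings = {
    "seo audit": 3,             # already ranking well; leave it alone
    "site audit checklist": 7,  # striking distance
    "crawl budget": 18,         # striking distance
    "robots.txt": 45,           # too far back for a quick win
}
print(striking_distance(rankings))
```

Running this over a full export gives you a prioritized shortlist of pages worth a targeted optimization pass.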


4. Content Review

Here, we’re looking for a few things:

  1. Content depth, quality and optimization: Do the pages have enough quality information to satisfy a searcher? You want to make sure the number of pages with little or “thin” content is small compared to those with substantial content. There are many ways to generate thin content. One example is a site that has image galleries with separate URLs for each image. Another is a site with city pages for hundreds, or thousands, of locations where the business doesn’t operate, and where there is no real local aspect to the products or services offered. Google has no interest in indexing all those versions, so you shouldn’t be asking it to do so! This is often one of the most underappreciated aspects of SEO. At Stone Temple, we’ve taken existing content on pages and rewritten it, and seen substantial traffic lifts. In more than one case, we’ve done this on more than 100 pages of a site and seen traffic gains of more than 150 percent!
  2. Duplicate content: A lot of websites have duplicate content without even realizing it. One of the first things to check is that the “www” version of the site and the “non-www” version do not exist at the same time (do they both resolve?). This can also happen with “http” and “https” versions of a site. Pick one version and 301 redirect the other to it. You can also set the preferred domain in Google Search Console (but still do the redirects even if you do this).
  3. Ad Density: Review the pages of your site to assess if you’re overdoing it with your advertising efforts. Google doesn’t like sites that have too many ads above the fold. A best practice to keep in mind is that the user should be able to get a substantial amount of the content they were looking for above the fold.
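The www/non-www and http/https check in the duplicate-content item above can be scripted. The sketch below is a minimal illustration: the canonical URL and the observed responses are invented, and in practice you would gather each variant's status code and post-redirect destination with any HTTP client.

```python
# Sketch: flag host/scheme variants of the home page that serve content
# directly (HTTP 200) instead of 301-redirecting to the one canonical
# version. The observations below are made-up example data.

CANONICAL = "https://www.example.com/"

def duplicate_variants(observed):
    """observed maps a variant URL -> (status_code, final_url).
    Returns variants that do not 301 to the canonical URL."""
    return [
        url for url, (status, final) in observed.items()
        if url != CANONICAL and not (status == 301 and final == CANONICAL)
    ]

observed = {
    "https://www.example.com/": (200, "https://www.example.com/"),  # canonical
    "https://example.com/":     (301, "https://www.example.com/"),  # good
    "http://example.com/":      (200, "http://example.com/"),       # duplicate!
}
print(duplicate_variants(observed))
```

Any URL this flags is a candidate for a 301 redirect to the canonical version.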


5. URL Names

Website URLs should be “clean”: short, descriptive of the main idea of the page, and indicative of where a person is within the website. So, make sure this is part of the SEO audit. Well-constructed URLs help both website users and search engines orient themselves.


It’s a good idea to include the main keyword for the web page in the URL, but never try to keyword-stuff.

Another consideration is URLs that have tracking parameters on them. Please don’t ever do this on a website! There are many ways to implement tracking on a site, and putting parameters in the URLs is the worst of them.

If a website is doing this today, you’ll want to go through a project to remove the tracking parameters from the URLs, and switch to some other method for tracking.
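If you need to gauge the scale of the problem, a small script can normalize URLs by stripping tracking parameters. This sketch uses only Python's standard library; the utm_* prefix is a real analytics convention, but the exact set of parameters to strip is an assumption you should adapt to your own setup.

```python
# Sketch: strip assumed tracking parameters from URLs. utm_* is the
# common analytics family; gclid/fbclid are common ad-click IDs.
# Extend these sets to match whatever your site actually appends.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PREFIXES = ("utm_",)
TRACKING_KEYS = {"gclid", "fbclid"}

def strip_tracking(url):
    parts = urlsplit(url)
    kept = [
        (k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
        if k not in TRACKING_KEYS and not k.startswith(TRACKING_PREFIXES)
    ]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/page?id=7&utm_source=news&gclid=abc"))
```

Comparing each crawled URL against its stripped form tells you how many URLs carry tracking baggage.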

On the other hand, perhaps the URLs are only moderately suboptimal.

In cases like this, I don’t think that changing the URLs is urgent. I’d wait until you’re in the midst of another, larger site project (like a redesign).

6. URL Redirects

It’s a common best practice to ensure that a web page that no longer needs to exist on a website be redirected to the next most relevant live web page using a 301 redirect. There are other redirect types that exist as well, so be sure to understand the various types and how they function before using any of them.

Google recommends that you use 301 redirects because they indicate a page has permanently moved from one location to another; other redirects, such as a 302, signal that the relocation is only temporary. If you use the wrong type of redirect, Google may keep the wrong page in its index.

It used to be the case that somewhat less than 100 percent of PageRank transferred to the new page through a redirect. In 2016, however, Google stated that no PageRank value is lost using any of the 3XX redirects.
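Beyond redirect type, audits should also catch redirect chains and loops. Here is a minimal sketch that resolves a redirect map (old URL to new URL) to each final destination; the map itself would come from your server rules or a crawl, and the entries here are invented.

```python
# Sketch: follow a redirect map to its final target, flagging loops and
# over-long chains. Chains waste crawl budget; collapse them to one hop.

def resolve(url, redirects, max_hops=10):
    """Follow `redirects` from `url`; return (final_url, hop_count).
    Raises ValueError on a loop or a chain longer than max_hops."""
    seen, hops = {url}, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError(f"redirect loop or chain too long at {url}")
        seen.add(url)
    return url, hops

redirects = {"/old": "/interim", "/interim": "/new"}
print(resolve("/old", redirects))  # two hops: consider a direct /old -> /new 301
```

Any result with more than one hop is worth collapsing into a single direct 301.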

To help check redirects, you can use tools like Redirect Check.

7. Meta Tags Review

Each and every web page on a site should have a unique title tag and meta description tag—the tags that make up the meta information that helps the search engines understand what the page is about.

This gives the website the ability to suggest to the search engines what text to use as the description of its pages in the search results (versus search engines like Google generating an “autosnippet,” which may not be as optimal).

It may also help prevent some pages of the website from being filtered out of the search results, since search engines can use the meta information to help detect duplicate content.

You’ll also want to take this opportunity to check for a robots meta tag on the pages of the site. If you find one, there could be trouble: an unintentional “noindex” or “nofollow” value could adversely affect your SEO efforts.
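These checks are easy to script over a crawl. The sketch below uses only Python's standard library to pull the title, meta description, and robots meta tag out of a page's HTML; the sample HTML is invented, and a real audit would run this over every crawled page looking for duplicates, missing tags, and stray noindex/nofollow values.

```python
# Sketch: extract title, meta description, and robots meta tag from HTML
# using only the standard library's HTMLParser.
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title, self.description, self.robots = None, None, None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name == "description":
                self.description = attrs.get("content")
            elif name == "robots":
                self.robots = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

html = '<html><head><title>Widgets</title><meta name="robots" content="noindex"></head></html>'
p = MetaAudit()
p.feed(html)
print(p.title, p.description, p.robots)  # missing description + noindex = two red flags
```

Collect the results into a table and duplicate titles or descriptions across URLs jump right out.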

8. Sitemaps and robots.txt Verification

It’s important to check the XML Sitemap and robots.txt files to make sure they are in good order. Is the XML Sitemap up to date? Is the robots.txt file blocking the crawling of sections of a site that you don’t want it to? You can use a feature in the Google Search Console to test the robots.txt file. You can also test and add a Sitemap file there as well.
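Python's standard library can evaluate robots.txt rules directly, which makes it easy to script a check that key URLs aren't accidentally blocked. The rules below are an invented example.

```python
# Sketch: verify that important URLs are crawlable under a robots.txt
# ruleset, using the standard library's robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # allowed
print(rp.can_fetch("Googlebot", "https://example.com/private/report"))   # blocked
```

Feeding your live robots.txt and a list of your most important URLs through this loop is a quick safety net before (and after) any deploy.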

9. Image Alt Attributes

Alt attributes for the images on a website help describe what the image is about. This is helpful for two reasons:

  1. Search engines cannot “see” image files the way a human would, so they need extra data to understand the content of the image.
  2. Web users with disabilities, such as those who are blind, often use screen-reading software that describes the elements on a web page, images among them, and these programs make use of the alt attributes.

It doesn’t hurt to use keyword-rich descriptions in the attributes and file names when it’s relevant to the actual image, but you should never keyword-stuff.
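Finding images that lack alt attributes is another check worth scripting. This sketch scans HTML with the standard library's parser; the sample markup is invented, and in a real audit you would feed it each crawled page.

```python
# Sketch: collect the src of every <img> that is missing an alt
# attribute (or has an empty one), using only the standard library.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing or empty alt
                self.missing.append(attrs.get("src", "?"))

html = '<img src="logo.png" alt="Acme logo"><img src="chart.png">'
c = AltChecker()
c.feed(html)
print(c.missing)
```

Note that a deliberately empty alt="" on purely decorative images is legitimate for accessibility, so treat the output as a review list rather than a list of definite errors.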

10. Mobile Friendliness

The number of people searching and purchasing on their mobile devices is growing each year. At Stone Temple, we have clients who get more than 70 percent of their traffic from mobile devices. Google has seen this coming for a long time, and has been pushing websites to become mobile friendly for years.

Because mobile devices are such key players in search today, Google has, at the time of writing, announced a move to a mobile-first index. What that means is that it will rank search results based on the mobile version of a website first, even for desktop users.

One key aspect of a mobile-first strategy from Google is that its primary crawl will be of the mobile version of a website, and that means Google will be using the mobile crawl to discover pages on a site.

Most companies have built their desktop site to aid Google in discovering content, but their mobile site purely from a UX perspective. As a result, a crawl of the mobile site might be quite poor from a content discovery perspective.

Make sure to include a crawl of the mobile site as a key part of any audit of a site. Then compare the mobile crawl results with the crawl of the desktop site.

If a website doesn’t have a mobile version, Google has said it will still crawl and rank the desktop version; however, not having mobile-friendly content means a website may not rank as well in the search results.

While there are a few different technical approaches to creating a mobile-friendly website, Google has recommended that websites use responsive design. There’s plenty of documentation on how to do that coming directly from Google, as well as tools that can help gauge a website’s mobile experience.

It’s worth mentioning Google’s Accelerated Mobile Pages (AMP) here as well. This effort by Google gives website publishers the ability to make their web content even faster for users.

While Google has said that, at the time of writing, AMP pages don’t receive a ranking boost, page speed itself is a signal. The complexity of the technical implementation of AMP is one of the reasons some may choose not to explore it.

Another way to create mobile experiences is via progressive web apps, an up-and-coming way to provide mobile app-like experiences on the web via the browser (without having to download an app).

The main benefit is the ability to access specific parts of a website in a way similar to what traditional apps offer.

11. Site Speed

Site speed is one of the signals in Google’s ranking algorithm. Slow load times can cause the crawling and indexing of a site to be slower, and can increase bounce rates on a website.

Historically, this has only been a ranking factor when site speeds were very slow, but Google has been signaling that it will become more important over time. Google’s John Mueller has also indicated that a site that is too slow, even if nominally mobile-friendly, may be deemed non-mobile-friendly. However, at the time of writing, mobile page speed is not treated by Google as a ranking factor.

In fact, site speed has become such an important element of the overall user experience, especially on mobile, that Google has said it wants above-the-fold content for mobile users to render in one second or less.

To help people get more visibility into site speed, Google offers tools such as the PageSpeed Insights tool and the site speed reports found in Google Analytics.
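However you collect load-time samples (synthetic tests, real-user beacons, tool exports), a few summary statistics make the data actionable against that one-second target. This sketch uses only the standard library, and the sample numbers are invented.

```python
# Sketch: summarize page-load samples (in seconds) against a one-second
# above-the-fold target. The samples here are made-up example data.
import statistics

def speed_summary(samples, target=1.0):
    return {
        "median": statistics.median(samples),
        "p90": statistics.quantiles(samples, n=10)[-1],  # ~90th percentile
        "slow_share": sum(s > target for s in samples) / len(samples),
    }

samples = [0.6, 0.8, 0.9, 1.2, 2.5, 0.7, 1.1, 0.5, 0.9, 3.0]
print(speed_summary(samples))
```

Tracking the 90th percentile, not just the median, matters: a site can have a fast typical load while a large minority of visitors still see multi-second pages.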

12. Links

Here, we’re looking at links in a couple different ways: internal links (those on the website itself) and external links (other sites linking to the website).

Internal Links
First, look for pages that have excessive links. You may want to minimize those. Second, make sure the web pages use anchor text intelligently without abusing it or it could look spammy to search engines. For example, if you have a link to the home page in the global navigation, call it “Home” instead of picking your juiciest keyword.

Internal links are what define the overall hierarchy of a site. The site might, for example, look like this:

perfectly structured site

The site above obviously has a well-defined structure, and that’s good. But in practice, sites rarely look like this, and some level of deviation from this is perfectly fine.

A home page may link directly to some of the company’s top products, as shown in Level 4 of the image, and that’s fine. However, it’s a problem if the site has a highly convoluted structure that has many pages that can only be reached after a large number of clicks if you try to navigate to them from the home page, or if each page is linking to too many other pages.

Look for these types of issues and try to resolve them to create something with a cleaner hierarchy.
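Click depth from the home page is one concrete way to measure this. The sketch below runs a breadth-first search over an internal-link graph (page to the pages it links to); a crawler would build the real graph, and the one here is invented.

```python
# Sketch: compute each page's click depth from the home page via BFS.
# Pages absent from the result are unreachable from the home page
# (orphans); pages many clicks deep deserve a closer look.
from collections import deque

def click_depths(links, home="/"):
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget"],
    "/orphan-source": ["/orphan"],  # never reached from the home page
}
print(click_depths(links))  # note /orphan and /orphan-source are absent
```

Sorting the output by depth surfaces the buried pages, and diffing the depth map against your full URL list surfaces the orphans.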

External Links
Also known as inbound links or backlinks, these warrant an analysis to ensure there aren’t any problems, like a history of purchased links, irrelevant links, or links that look spammy.

You can use tools like Open Site Explorer, Majestic SEO, Ahrefs Site Explorer, SEMRush, and the Google Search Console/Bing Webmaster Tools accounts to collect data about links.

Personally, I like to use all of these sources, collect all of their output data, dedupe it and build one master list. None of the tools provides a complete list, so using them all will get you the best possible picture.

Look for patterns in the anchor text, like if too many of the links have a critical keyword for the site in them. Unless the critical keyword happens to also be the name of the company, this is a sure sign of trouble.

Also check that there are links to pages other than the home page; a profile where nearly all links point at a single page is another warning sign. Lastly, check how the backlink profile for the site compares to the backlink profiles of its major competitors.

Make sure that there are enough external links to the site, and that there are enough high-quality links in the mix.
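Once you have the deduplicated master list, the anchor-text pattern check is easy to automate. In the sketch below, the 30 percent threshold is an illustrative rule of thumb, not a number from Google, and the anchor data is invented.

```python
# Sketch: flag anchor texts that make up a suspiciously large share of a
# backlink profile, ignoring brand-name anchors (which are natural).
from collections import Counter

def suspicious_anchors(anchors, brand, threshold=0.30):
    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    return {
        anchor: n / total
        for anchor, n in counts.items()
        if n / total >= threshold and brand.lower() not in anchor
    }

anchors = ["Acme", "acme", "cheap widgets", "cheap widgets",
           "cheap widgets", "click here"]
print(suspicious_anchors(anchors, brand="Acme"))  # flags "cheap widgets"
```

Anything this flags, on a profile of real size, is where a manual review of the linking pages should start.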

13. Subdomains

Historically, it’s been believed that subdomains do not benefit from the primary domain’s full trust and link authority. This was largely because a subdomain could be under the control of a different party, and therefore, in the search engines’ eyes, it needed to be evaluated separately.

For an example of a domain that allows third parties to operate subdomains, consider the many blog-hosting services that let people set up their own blogs and run them as subdomains of the host’s main domain.

For the most part, this concern no longer applies today: search engines have become extremely good at recognizing whether a subdomain really is part of the main domain, or is independently operated.

I still recommend using a subfolder over a subdomain as the default approach to adding new categories of content to a site. However, if you already have content on a subdomain, I would not move it to a subfolder unless you have clear evidence of a problem; there is a cost to site moves, and the upside of the move is usually not enough to pay that cost.

For purposes of an audit, make sure you include subdomains. As part of this, make sure your crawl covers them, and check analytics data for any clear evidence of a problem, such as a subdomain getting very little traffic, or recent material traffic drops.

For more on subdomains and their effect on SEO, see Everything You Need to Know About Subfolders, Subdomains, and Microsites for SEO.

14. Geolocation

For websites that aim to rank locally (for example, a chiropractor established in San Francisco that wants to be found for “San Francisco chiropractor”), you’ll want to consider things like making sure the business address is on every page of the site, and claiming and verifying the validity of the Google Places listings.

Beyond local businesses, websites that target specific countries or multiple countries with multiple languages have a whole host of best practice considerations to contend with.

These include things like understanding how to use hreflang tags properly, and attracting attention (such as links) from within each country where products and services are sold by the business.

15. Code Quality

A website with clean code that the search engines can crawl with ease makes life easier for the crawlers. W3C validation is the “gold standard” for checking a website’s code, but it is not really required from an SEO perspective (if search engines punished sites for poor coding practices, there might not be much left to show in the search results). Nonetheless, clean code improves the maintainability of a site, and reduces the chances of errors (including SEO errors) creeping in.


An SEO audit can occur at any stage of the lifecycle of a website, and can even be performed on a periodic basis, like quarterly or annually, to ensure everything is on the up and up.

While there are different approaches to performing an SEO audit, the steps listed in this article serve as a solid foundation to getting to know the site better and how it can improve, so your SEO efforts get the most ROI.

This article is adapted from the book The Art of SEO: Mastering Search Engine Optimization (3rd Edition), Eric Enge lead co-author.


ASP.NET authorisation


I have seen so many people asking again and again how to allow access to a particular page for a specific user or role, so I thought it would be good to put this in one place. I will discuss how to configure web.config depending on the scenario.

We will start with a web.config without any authorization and modify it on a case-by-case basis.

No Authorization

We will start with the root web.config without any authorization.

<configuration>
  <system.web>
    <authentication mode="Forms" />
  </system.web>
</configuration>

Deny anonymous users access to the entire website

This is the case when you want everybody to log in before they can start browsing your website, i.e., the first thing they will see is a login page.
<system.web>
  <authentication mode="Forms" />
  <authorization>
    <deny users="?" /> <!-- deny anonymous users -->
  </authorization>
</system.web>

The above setup is good when users don't register themselves; instead, their user accounts are created by an administrator.

Allow everyone access to a particular page

Sometimes you want to allow public access to your registration page while restricting the rest of the site to logged-in / authenticated users, i.e., not allowing anonymous access. Say your registration page is called register.aspx and lives in your site's root folder. In the web.config of your website's root folder you need the following setup.


<system.web>
  <authentication mode="Forms" />
  <authorization>
    <deny users="?" /> <!-- restrict anonymous access -->
  </authorization>
</system.web>

<!-- path is the path to your register.aspx page, e.g. it could be ~/publicpages/register.aspx -->
<location path="register.aspx">
  <system.web>
    <authorization>
      <allow users="*" /> <!-- allow everyone access to register.aspx -->
    </authorization>
  </system.web>
</location>


So far we have seen how to allow either all users or only authenticated users. But there may be cases where we want to allow particular users access to certain pages and deny everyone else (authenticated as well as anonymous).

Allow access to a particular user only and deny everyone else

Say you want to give user "John" access to a particular page, e.g. userpersonal.aspx, and deny all others. The location tag should look like this:

<location path="userpersonal.aspx">
  <system.web>
    <authorization>
      <allow users="John" /> <!-- allow John; you can list multiple users separated by commas, e.g. John,Mary -->
      <deny users="*" /> <!-- deny everyone else -->
    </authorization>
  </system.web>
</location>

Allow only users in a particular role
I will not show how to set up roles here; I assume you already have role management in place for your users. We will now see what needs to be done in web.config to configure authorization for a particular role. E.g., you have two roles, Customers and Admin, and two folders, CustomerFolder and AdminFolder. Users in the Admin role can access both folders. Users in the Customers role can access only CustomerFolder, not AdminFolder. You will have to add a location tag for each folder path, as shown below:
<location path="AdminFolder">
  <system.web>
    <authorization>
      <allow roles="Admin" /> <!-- allow users in the Admin role -->
      <deny users="*" /> <!-- deny everyone else -->
    </authorization>
  </system.web>
</location>


<location path="CustomerFolder">
  <system.web>
    <authorization>
      <allow roles="Admin, Customers" /> <!-- allow users in the Admin and Customers roles -->
      <deny users="*" /> <!-- deny everyone else -->
    </authorization>
  </system.web>
</location>


Alternate way – using an individual web.config for each folder
As an alternative to the location-tag method above, you can add a web.config to each folder and configure authorization there, in much the same way but without the location tag. Taking the same example as above, add a web.config to both folders, AdminFolder and CustomerFolder.

Web.config in AdminFolder should look like:


<configuration>
  <system.web>
    <authorization>
      <allow roles="Admin" /> <!-- allow users in the Admin role -->
      <deny users="*" /> <!-- deny everyone else -->
    </authorization>
  </system.web>
</configuration>


Web.config in CustomerFolder should look like: 

<configuration>
  <system.web>
    <authorization>
      <allow roles="Admin, Customers" /> <!-- allow users in the Admin and Customers roles -->
      <deny users="*" /> <!-- deny everyone else -->
    </authorization>
  </system.web>
</configuration>

Images and CSS files

Say you have all your images and CSS in a separate folder called images, and you are denying anonymous access to your website. In that case you might find that your login page cannot load its images (if any) or the CSS (if any) applied to its controls.

In that case you can add a web.config to the images and CSS folder and allow everyone access to that folder. So your web.config in the images folder should look as below:

<configuration>
  <system.web>
    <authorization>
      <allow users="*" /> <!-- allow everyone -->
    </authorization>
  </system.web>
</configuration>


Common Mistakes

I have seen people complain that they have set up their roles correctly and made the entry in their web.config, but their authorization still doesn't work: even though they have allowed access to a role, its users cannot access a particular page or folder. The common reason is placing the deny rule before the allow rule.

Say the web.config from AdminFolder as we have seen before is something like this:
<!-- This web.config will NOT allow access, even to users in the Admin role -->
<configuration>
  <system.web>
    <authorization>
      <deny users="*" /> <!-- deny everyone -->
      <allow roles="Admin" /> <!-- never reached -->
    </authorization>
  </system.web>
</configuration>

Since authorization rules are evaluated from top to bottom until a match is found, the <deny users="*" /> rule here matches first, so the allow rule is never checked and access is denied even to users in the Admin role.

So PUT all allows BEFORE ANY deny.

NOTE: deny works the same way as allow. You can deny particular roles or users as per your requirement.

Update: Issue with IIS 7

With IIS 7 you will have to give the IUSR anonymous user account access to the folder that contains your CSS or image files.

I hope this answers some of the questions regarding how to authorize pages / folders (directories).

Comments welcome.