Weekly Wisdom with Ross Tavendale: Technical SEO Framework

Ross Tavendale

Sep 24, 2019 · 7 min read
Technical SEO Framework

Modified Transcript

Hello everyone and welcome to another Weekly Wisdom. In today's video, we are going to be exploring a topic that is very close to my heart. We are going to be looking at how to structure and build a strategy around a technical SEO audit.

I'm sure you've all taken my site audit course on the SEMrush Academy already. But now that you know how to use all the tools, we need to work out the strategy behind the implementation, and in what order to do it. Today I'm going to dive into exactly how we go about building technical SEO implementation strategies, the "Type A" way. This is our agency's way of doing things; other agencies will have different approaches. This is just the one that suits us best. So without further ado, let's get into it.

Framework Overview

Let's go over the technical framework. We deliberately split it into a bunch of different areas:

Hygiene: fixing the really basic things the crawler needs to physically access your website.
Organization: are your content silos properly organized?
Page power: looking at internal linking and the flow of PageRank.
Website presentation: do you have the correct meta information? Is your schema up to date?
Languages: are you actually showing up in the right SERPs? If you've got multiple languages, are you showing up in the German SERP, in the UK, and in the USA?
Security and code: any mixed content, HTTP/HTTPS issues.
Performance: things like speed.

Audit Framework

And that is the order that we typically do things in. If we fix stuff like hygiene, organization and page power early, we are allowing Google to go through the entire site, see everything that is correct and actually start to understand what is physically on the page— and then we can get into more meaty stuff.


Hygiene

When it comes to hygiene, what does that mean? First and foremost, I want to do a little bit of data collection. I want to see all our server errors and all of our redirect chains, and I also want to have a little look at the robots.txt file just to make sure that nothing is disallowed and nothing weird is going on there. Inside SEMrush Site Audit, this is visible under the "Issues" tab, and if you look at all of your errors, this is where you are going to find a lot of things. For example, here we have a bunch of 404s, a bunch of daisy-chain redirects, things of that nature. Typically they come under errors, and these are high-priority things we need to fix.
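To make the redirect-chain part of that data collection concrete, here is a minimal sketch in Python. The input format (a simple source-to-target map of 3xx redirects) is an assumption for illustration; a real version would be fed from whatever your crawler exports.

```python
# Sketch: flag daisy-chain redirects from crawl data.
# The {source: target} input format is hypothetical.

def redirect_chains(redirects, min_hops=2):
    """Given a {source_url: target_url} map of redirects,
    return every chain of min_hops or more hops."""
    chains = []
    for start in redirects:
        hop, seen, path = start, set(), [start]
        # Follow hops until we leave the map or revisit a URL (loop guard).
        while hop in redirects and hop not in seen:
            seen.add(hop)
            hop = redirects[hop]
            path.append(hop)
        if len(path) - 1 >= min_hops:
            chains.append(path)
    return chains

redirects = {
    "/old-page": "/interim-page",    # hop 1
    "/interim-page": "/final-page",  # hop 2 -- a daisy chain
    "/promo": "/final-page",         # single hop, fine
}
print(redirect_chains(redirects))
# -> [['/old-page', '/interim-page', '/final-page']]
```

Single-hop redirects are left alone; anything with two or more hops is worth collapsing into a direct redirect.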

SEMrush Site Audit



Organization

Think of organization as the way in which the website is put together. Even if you have a technically perfect website, if it has a completely flat structure with no content silos, it is going to be very hard for you to start ranking for a bunch of long-tail terms. Let's say you sell microphones. If every single product, such as "blue microphone" and "condenser microphone", and every brand just sits at example.com/product, that is not really good. But if it is something like example.com/type/brand/product, we are getting them into nice silos. It is good for the user, because if they see the domain name, then the brand, then the product, that is quite a natural thing to click into in search.

The same goes for the search volumes themselves: you should not give every term the same precedence and closeness to the root of the domain. It is weird to have one big massive term that gets 60,000 searches a month and a tiny little term that gets a hundred searches a month both hanging off the root. That is what we mean by organization.

Orphan Pages

You will also need to look at things like orphan pages. An orphan page is essentially a page that is not linked to from anywhere on the website. You can find these in your sitemaps and in Google Analytics; the pages exist, but it is physically impossible to click through to them, so the only way the bot is going to find them is through external links. Not ideal; we want to fix that. Also, pay attention to canonical problems. If you have got an eCommerce site, these are really common.
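The orphan-page check described above boils down to a set difference between the URLs you know about and the URLs that actually receive internal links. A minimal sketch, with illustrative URLs:

```python
# Sketch: orphan pages = URLs in the sitemap (or analytics) that no
# internal link points to. The URL lists here are made up for illustration.

def find_orphans(sitemap_urls, internally_linked_urls):
    """Return sitemap URLs that receive no internal links."""
    return sorted(set(sitemap_urls) - set(internally_linked_urls))

sitemap = ["/", "/mics/condenser", "/mics/dynamic", "/landing/old-campaign"]
linked = ["/", "/mics/condenser", "/mics/dynamic"]
print(find_orphans(sitemap, linked))
# -> ['/landing/old-campaign']
```

In practice the first list comes from your XML sitemaps or analytics export, and the second from a crawl of the site.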


Pagination

I appreciate that rel="next" and rel="prev" aren't really a thing anymore. What I mean by pagination is a series of products where you need to click through to the next pages in order to reach them. If Google has to cycle through all of that and it is loaded with JavaScript, a lot of the products may be invisible. I want to see how your pagination physically works and make sure that it is actually surfacing everything correctly and that we can see all of the pages on the site.
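As a rough illustration of crawler-friendly pagination, the safest pattern is plain anchor links the bot can follow without executing JavaScript. The URLs below are hypothetical:

```html
<!-- Illustrative pagination markup: plain <a href> links that a crawler
     can follow without running JavaScript. URLs are placeholders. -->
<nav aria-label="Product pages">
  <a href="/mics?page=1">1</a>
  <a href="/mics?page=2">2</a>
  <a href="/mics?page=3">3</a>
  <a href="/mics?page=2">Next</a>
</nav>
```

If the "Next" control is a button that fetches products via JavaScript with no underlying href, products on deeper pages may never be discovered.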

Page Power

This is a really basic one, but it gets overlooked a lot of the time. It means looking at things like remapping all of your internal links. Let's take the SEMrush website, for example, for the term "SEO".

If the pages on SEMrush that rank for that term are not linking to the main SEO landing page, then we want to go and add an internal link, because we can see that these are the pages Google deems most powerful and most relevant to that keyword, so we should probably use them for internal linking. It is a bit of a basic way to do internal linking, but it is certainly a good gauge.
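That remapping exercise can be sketched as a simple filter: find pages that mention the target keyword but do not yet link to its landing page. The page data structure and URLs here are hypothetical; a real version would read from crawl data.

```python
# Sketch: internal link opportunities = pages that mention a keyword
# but do not link to its main landing page. Data below is illustrative.

def internal_link_opportunities(pages, keyword, target_url):
    """pages: list of dicts with 'url', 'text', and 'links' (outgoing hrefs)."""
    return [
        page["url"]
        for page in pages
        if keyword.lower() in page["text"].lower()
        and target_url not in page["links"]
    ]

pages = [
    {"url": "/blog/what-is-seo", "text": "A guide to SEO basics.", "links": []},
    {"url": "/blog/ppc-tips", "text": "Paid search tips.", "links": []},
    {"url": "/blog/seo-tools", "text": "Top SEO tools.", "links": ["/seo"]},
]
print(internal_link_opportunities(pages, "SEO", "/seo"))
# -> ['/blog/what-is-seo']
```

Each URL the function returns is a candidate for a new internal link pointing at the landing page.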

UX and Hierarchy

We also want to fix broken links and just look at your UX and your hierarchy. Now, when I say UX, I don't mean "go annoy your designers and your UX people and get them to change everything". It is similar to what we were talking about before with content silos. Having a flat structure is really not good from a search point of view, so making sure that everything is properly organized is a big one.

It can happen in some popular out-of-the-box content management systems like WordPress, Magento, or WooCommerce. Such sites have a blog section and a blog landing page, but when you click through to an article, it is just example.com/article-name, which is obviously wrong. We want it to be properly categorized.


Website Presentation

I think a lot of you will be very, very familiar with this. This is essentially looking at your metadata; not just your titles and descriptions, but also your schema. Go to schema.org to check all the different types of schema that you can put on your website. If you are using something like WordPress, Yoast does a lot of this out of the box. You can also inject it through Tag Manager using JSON-LD, which is nice and handy. And use Google's Structured Data Testing Tool to make sure that your schema is actually valid and visible on the website.
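For reference, a JSON-LD block like the one below is the kind of thing you might inject via Tag Manager. The values are placeholders; swap in your own organization details:

```html
<!-- A minimal JSON-LD Organization snippet. Values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

Once deployed, paste the page URL into the Structured Data Testing Tool to confirm the snippet parses and the type is recognized.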

Structured Data Testing Tool


Languages

Next up would be your languages. It's really basic: if you have a multi-language, multi-country site, audit any hreflang problems, and make sure you use the correct country and language codes, because a lot of people get tripped up there. For example, with British English, a lot of people use en-UK as their subfolder to control all of their UK content; that is actually wrong. The correct code is en-GB. Double-check that you are using the correct code for your language and country variant.
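A correct set of hreflang annotations for that scenario looks like this; the domain and paths are placeholders:

```html
<!-- hreflang annotations for a UK/US/German page set. Note en-GB, not
     en-UK: the language code is ISO 639-1, the region is ISO 3166-1 alpha-2. -->
<link rel="alternate" hreflang="en-GB" href="https://example.com/en-gb/" />
<link rel="alternate" hreflang="en-US" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="de-DE" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each page in the set should carry the full group of annotations, including a self-reference, and every alternate should point back (reciprocal tags), or Google may ignore them.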


Performance

Speed is something that we usually leave right to the very end, because it is such a big deal to make any change within an organization. We brief it right at the start, but we don't expect anything to be done about it for at least nine months, because it is such a fundamental change they need to make to their website.

We are not as bothered with AMP and things like that unless the client is a publisher, because for publishers AMP is obviously massive; speed tends to get left to the end. We run Google Lighthouse, which you can do inside Google Chrome itself: open a page, press F12, bring up the Lighthouse panel, and click "Run audits". It runs a basic speed audit and starts to show you when the first content paints, at what point the site actually becomes usable, and time to interactive; it also shows your critical rendering path. SEMrush did really well on this, and you can start using it to understand whether you are doing well or poorly.

Lighthouse Audits

Security and Code

Last, but definitely not least, is security and code. This is when we look at the HTML, the CSS, and the JavaScript. Typically, if your site is on WordPress, uses a template, or is powered by a lot of plugins, there is going to be tons of conflicting code and tons of CSS and JavaScript that just doesn't get used from page to page.

The one I see most commonly is Google Maps plugins: for some reason they load at the template level, on every single page, regardless of whether the page has a map on it or not. Double-check that you're not firing code needlessly; that is going to really slow down the site and really damage performance.

Thank you for watching this week's Weekly Wisdom. If you have your own tech audit strategy, I'd love to hear about it in the comments below, but until next time, we will see you later.

Ross is the Managing Director at Type A Media, an independent search agency that works with FTSE 250 companies and mid-sized brands to help them find the optimal way to talk to more people online. When not obsessing over his clients' rankings, he hosts the Canonical Chronicle, a weekly web show watched by 100k people every month. If you want to ask him a direct question, you can find him @rtavs on Twitter.