Competitive Backlink Analysis: A Quantitative Approach

Last updated: November 3, 2021

It’s been 18 months since my original backlink analysis guide was released, and while a lot has remained the same, I have tweaked many aspects of the methods I wrote about then.

So who is this guide for? 

Usually I try to pinpoint exactly who will benefit from what I am writing about, but in this case I think anyone who is spending on link building could benefit from this guide.

I have received DMs from everyone from major Amazon affiliates who used it to estimate how much it would cost to enter particular niches, through to in-house SEOs at eight-figure corporations who used my templates to justify their link spend to their managers.

Part of the beauty of SEO nowadays is that we have so many tools and so much data available that we can reverse engineer what our competitors are doing; we don’t have to enter a niche blind. We can see whether there is a realistic chance of ranking in a certain niche.

Note before reading the article:

None of this is to say that links are the only ranking factor that matters, or that you can’t rank for terms without hitting these requirements, but it will help you judge whether your chance of ranking is realistic. This article covers just one aspect of off-page SEO.

If you are sending links without seeing any results, maybe you need to consider whether it is really links that are holding your site back, or other aspects of SEO.

Throughout this article I am going to use Ahrefs; it’s my tool of choice.

You can try to convert this over to whatever metric you want to use, whether that’s Moz’s DA, Majestic’s TF or SEMrush’s Authority Score. All of these metrics can be useful, but they all have flaws. Ahrefs’ DR is by no means perfect, but it’s what I am used to working with.

Since we’re speaking about Ahrefs a lot in this article, it’s important to mention that Ahrefs’ DR works on a logarithmic scale.

What this means is that, unlike on a linear scale, 2x DR45 backlinks do not equal 1x DR90 backlink; the gap between a DR30 and a DR40 backlink is nowhere near the gap between a DR80 and a DR90 backlink.
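If it helps to see that as numbers: Ahrefs doesn’t publish the formula behind DR, so the mapping below is purely an illustrative sketch of what "logarithmic" implies, not how DR is actually computed.

```python
# Toy model only: the base 10 ** (dr / 20) is an invented stand-in for
# "a logarithmic metric"; Ahrefs' real DR formula is not public.
def linear_power(dr: float) -> float:
    """Map a DR score to a hypothetical linear 'power' value."""
    return 10 ** (dr / 20)

gap_30_40 = linear_power(40) - linear_power(30)  # a small absolute jump
gap_80_90 = linear_power(90) - linear_power(80)  # a far larger absolute jump

two_dr45 = 2 * linear_power(45)
one_dr90 = linear_power(90)
```

Under any exponential mapping like this, the DR80 to DR90 jump dwarfs the DR30 to DR40 jump, and two DR45 links fall far short of one DR90 link, which is the intuition to keep in mind when comparing prospects.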

The Entry Point To The SERP

This is the first thing I do when looking at a target keyword. I used to look at the top 20 results to get a general feel for what is on its way up, but I noticed that weaker but highly relevant sites would often sit in positions 11-16 and struggle to break onto page 1. So now I put the majority of my focus on just the top 5 positions, as these take around 60% of the organic clicks.

I identify the highest and lowest DR sites in the SERP and note their positions. 

I then ask myself these questions: 

  • Does there seem to be any DR correlation? Do the higher DR sites rank above the lower DR sites? 
  • Where is the lowest DR site? Is it sitting in the top 5 positions? When I see a low DR site in these positions I love it, because it confirms in my mind that ranking is possible.
  • How stable is this SERP? Is there a lot of volatility and movement in rankings? 

In an ideal world I will use a rank tracker like SERPWoo to track the rankings and movements for a while, but this isn’t always possible, so I make use of Ahrefs’ historic ranking graphs.

Why? It would suck to base your analysis around a site which has only just entered the SERP and is just as quick to leave. I tend not to spend too much time on sites which have had a huge traffic spike in a short period; they’re interesting for trying to see what caused the spike, but I prefer to look at sites that have had consistent growth over a long period of time.

  • What’s the average DR of the SERP?

Hitting this average isn’t a requirement; you can still be the new lowest DR site in the SERP, someone has to be, and if it can be you, awesome. But it’s an easy way to represent the domain power of the SERP in one number.

If you’re working with clients, I personally find it easier to just show them the average. Otherwise you explain that there is a DR10 site in the top 5 surrounded by DR60s, they decide they want to be the new DR10 site in the top 5, and they conclude you’re a terrible SEO if you can’t do it, because someone else clearly can. It’s easier to say the average DR is 51; it’s pulled down by the DR10 in the mix, but it’s a much more realistic target.
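As a quick sketch, here is that calculation with some invented DRs chosen to mirror the scenario above (a DR10 outlier among DR60-ish sites). The median is shown alongside purely as a sanity check, since it is less distorted by a single outlier:

```python
from statistics import mean, median

# Invented top-5 DRs for illustration: one DR10 site among DR60s.
top5_dr = [62, 60, 10, 58, 65]

avg = mean(top5_dr)    # 51: one number summarising the SERP's domain power
med = median(top5_dr)  # 60: barely moved by the DR10 outlier
```

The average of 51 understates the typical competitor here, which is exactly the caveat worth explaining to a client.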

Let’s run through an example; we’ll use this keyword for the next few sections so you can follow along. The keyword is best water filter, a competitive keyword with 23,000 monthly searches.

Usually I use Google search to make sure the results are accurate, export them, and then run them through the batch analysis tool, but for this example it’s easier to use Keywords Explorer.

To export the SERP I usually use a browser add-on: either SEO Quake to export to CSV, or SEO Ruler to get a list to copy and paste.

What Can We Learn From This SERP?

The first thing I notice is that the lowest DR site in the SERP is ranking 6th with a DR34, but this is a homepage.

Why does it being a homepage make a difference? Because the homepage is the most powerful page of a website, except in very rare cases.

So what does this mean? Seeing that this is the lowest DR site in the SERP, I could say the entry point is a minimum of DR34, but unless I am going to try to rank for the keyword with my homepage, it’s better to go by the next lowest DR site, which is DR51 in this case.

So the range is DR34-93 but the realistic range is DR51-93.

While there are strong sites in the SERP, it doesn’t seem to be completely dominated by power; there isn’t a linear correlation between DR and rankings by any measure.

The average DR of the SERP is 74.5, so together with the information above you can see that you’re likely going to need quite a lot of domain authority to enter this SERP.

To put this in perspective, here is some data from Ahrefs showing how many domains they have in each DR range and their average number of dofollow referring domains.

I had a look at the volatility of the SERP, and while the traffic bounces around quite a lot in the top 5 positions, the sites have been there a while; there has just been a battle between them for rankings. Because the keyword volume is quite high, dropping from position 1 to 2 can make their traffic graphs look quite volatile.

The top sites have been active in the SERP for a while, so they’re good to base your analysis on. CNET and The Prepared are quite new to the SERP (under 6 months), so if they were in positions 1 and 2 I would take that into consideration; rankings that come quickly often leave quickly too.

For example, if we look at just The Spruce Eats’ history here, it looks like it dropped out of the SERP between the 9th and 27th of August.



But if we go to the URL’s organic keywords history in Site Explorer, you can see that it doesn’t drop out as suggested above.



I have spoken with a member of the Ahrefs support team, and they said this is due to a database migration and that Keywords Explorer has the freshest data.

That said, this is a keyword I have been following while writing this article, and I didn’t see them drop out of the SERP during that time period.

This is why I would prefer to have an independent tracker like SERPWoo for this data. My preference for SERPWoo here is that it tracks the movement of the whole SERP, as opposed to just the movement of one domain within it.

Dofollow Domains To The Domain

Now we’re going to look at how powerful the domains in this SERP are. While DR gives a quick overview, looking at the number of referring domains often helps build the bigger picture. It can also show whether the sites you are competing with have a lot of editorial backlinks or are holding up their rankings with lower-powered guest posts or niche edits, which DR doesn’t tell you.

We will continue with our analysis of the keyword best water filter. 

All I’m going to do is open up the URLs of the top 5 ranking sites in Ahrefs, set the target to the domain with all its subdomains, go to the referring domains report, select dofollow from the dropdown, and do a full export.

Some people have asked why I only use dofollow, and ultimately it’s because these are the links passing link equity and power to your domain. It’s not that nofollows are useless for an overall SEO campaign, but this is my rationale when it comes to analysing the SERP. Feel free to include nofollow referring domains in your analysis if you want; this is just how I do it.

For this next step I have an Excel sheet to speed up the process; you can download it here.

To use this sheet, you just copy and paste your Ahrefs export into A1 and it will tell you the number of backlinks in each domain rating range.
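If you prefer code to a spreadsheet, here is a minimal Python sketch of the same counting. Note that the "Domain rating" column name is an assumption about the export format; check the headers of your own CSV.

```python
import csv
from collections import Counter
from io import StringIO

def dr_range_counts(rows, dr_column="Domain rating", width=10):
    """Bucket referring domains into DR ranges (DR0-9, DR10-19, ...)."""
    counts = Counter()
    for row in rows:
        low = int(float(row[dr_column]) // width) * width
        counts[f"DR{low}-{low + width - 1}"] += 1
    return counts

# Tiny stand-in for a real export; normally you'd pass
# csv.DictReader(open("referring-domains.csv", encoding="utf-8-sig")).
export = "Domain,Domain rating\nexample-a.com,34\nexample-b.com,71\nexample-c.com,75\n"
counts = dr_range_counts(csv.DictReader(StringIO(export)))
```

The resulting `Counter` gives you the same per-range table the sheet produces.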

Again, more data allows you to make a more informed decision, but for the example I will just use the sites in positions 1-5.

Note: I excluded the NY Times as I didn’t feel it would bring any value to my analysis. If I was doing this for a SERP which had Amazon, Tripadvisor, eBay, Yelp or a similar site, I would also exclude them from this part of the analysis. I don’t think it’s useful to see that these sites all have 500,000+ referring domains; they are often there on their domain power alone, and the huge numbers can really skew the averages. What you keep and remove comes down to your discretion.

I am finding the overall domain RD analysis less useful than when I wrote the first version of this article, because almost every Google update has rewarded authority and power. The SERPs are now dominated by sites which you realistically won’t be able to keep up with.

I still do this phase of competition analysis, but sometimes I also take the average of the weakest 3 sites in the SERP and use that as the initial target to break onto the first page.

As I’ve mentioned, I don’t believe this domain analysis is as useful as it once was, but I keep it as part of my process because the time investment is small and you can sometimes find interesting information, especially when dealing with less competitive SERPs.

When I look at this data I generally remove links below DR20 from my analysis. This isn’t to say these links are useless, but the barrier to reaching DR20 is too low, and the power from these types of links is often too little unless they are hyper relevant.

Here you can see quite clearly what I was talking about above regarding the power of the homepage versus inner pages. Quality Water Lab looks completely out of place in this SERP, but because they use their homepage as a money page rather than a branded hub, they are ranking 5th for a competitive keyword they otherwise wouldn’t be able to touch. This isn’t a tactic I would personally use, but it can be a good one on a limited budget; although if you are building for flip value, your valuation will likely take a hit, as investors see this as a negative.

If I add in the article in position 10, The Prepared, and treat this as an audit for them, the story becomes a bit more interesting.

If you look at their DR range counts, they actually outperform HealthyKitchen101 in every DR range above DR20.

In my head I am thinking:

Is it a relevance issue? Does Google consider this a kitchen appliance SERP for health or is it a survival item? Probably a combination of both. HealthyKitchen101 likely has a lot more relevance around kitchen appliances from other pages compared to The Prepared.

Is it a page power issue? The page of The Prepared might not have enough power, even though the domain does.

Is it a non-link-related issue? Links aren’t the be-all and end-all of ranking; I wish they were, but they’re not.

We will talk about this in later sections but it’s interesting to see that in theory they have enough domain power but they are in position 10 versus HealthyKitchen101 in position 2.

How Powerful Is The Page?

So now we know how powerful the domain is, but ranking doesn’t just come down to the power of the domain; it also comes down to the page level.

Luckily, it’s very easy to use the same method that we just used to analyse the domain to analyse the URL. 

All we have to do is do the same process as above but target the exact URL instead of the domain. 

Then repeat the above process, creating a new table for the page-level RDs.

You will end up with something like this. 

I have again added in The Prepared in position 10 which can help paint the bigger picture on what we were talking about earlier. 

It’s clear that the page power could be part of the issue when we look at this data.

Looking at this SERP now, if I was working with The Prepared I would build some backlinks to the page and see how the site reacts. The actual page has no power, so it’s quite impressive that they’re even ranking in the top 10 on domain power and relevance alone.

You can start to add in other filters to see what they show. For example, I quite often count the domains above DR20 and the percentage of the total links they make up. This usually makes the thought of having to match competitors’ link building a lot less intimidating.

As you can see, on average only 49% of the domains are above DR20, which effectively halves the budget required; I find you naturally begin to accumulate these lower-powered links as you move up the rankings for a high-volume term like this.
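That above-DR20 filter is trivial to compute yourself once you have each competitor’s referring-domain DRs. The DRs below are invented for illustration:

```python
def share_above_dr(drs, threshold=20):
    """Count and share of referring domains at or above a DR threshold."""
    above = sum(1 for d in drs if d >= threshold)
    return above, above / len(drs)

# Invented DR list for one competitor's referring domains.
competitor_drs = [8, 15, 19, 22, 25, 33, 41, 55, 12, 70]
count, share = share_above_dr(competitor_drs)
```

Run this per competitor, then average the shares to get the SERP-wide figure like the 49% quoted above.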

Looking at this data, I notice that more power isn’t necessarily better: Consumer Reports has the 2nd highest number of RDs above DR20 but ranks 4th, while the NY Times has the fewest RDs above DR20 and ranks 2nd. This could be down to the sheer authority of the domain, or a lot of other factors which we will talk about later.

If I was doing link building for The Prepared, I would think we’re already in a good position, ranking on power and relevance to the term alone, and would begin to build a handful of highly relevant, medium-power backlinks and see how the site reacts.

I love it when I see a site ranking with no page power as often a few backlinks really can go a long way.

I would be thinking that I might need to build over 50 referring domains above DR20, as the average for the SERP is 88. That sounds like a lot, but you might not need anywhere near that many, as the site is showing great potential without any live dofollow links to the page to begin with.

It can sound intimidating, but I would break down how many sales you think this would drive and how much that would earn you, compared to your average link acquisition cost. Then, if you consider your website an asset and think of the flip value as opposed to the monthly earnings, it can really change your perspective.

Or you might decide that this isn’t a good target page as you have other pages which are showing more potential for less investment.

Link Velocity

So, great: you now have a rough idea of how many backlinks you need to build and how powerful they have to be.

One problem: that’s just a snapshot of today. If you’re starting a brand new site, it might take a year to gain any real traction, and at that point the landscape will have changed.

So how can you see how quickly your competitors built the links to their domain and their page? For the reasons we spoke about in the referring domain analysis, it’s important to consider not only the domain but also the page separately.

Luckily, if you’re an Ahrefs user, this got a hell of a lot easier with their CSV export.

So I’m going to quickly show you how I use this for the keyword “Online Casino” in the UK. 

The graph is a nice visual for showing clients, but I find you can get a lot more out of the raw data set from Ahrefs. You can manipulate it to see how quick the link growth has been over the past month, quarter, year, or whatever time period you want.

The reason you need to look at these time periods is that SERPs aren’t static; if you’re aiming to catch up with a specific site, they aren’t just going to stay locked where they are.

So to give an example of this: 

Say you’re starting a brand new site from 0 RDs and trying to catch up with a site that has built 250 RDs over the past 2 years.

You could say your target for link acquisition is 11 RDs per month, as this would put you at roughly 264 RDs after 2 years.

But this doesn’t take into consideration that they are building around 10 RDs per month themselves, so in those 2 years they would reach 490 RDs and you would essentially be the same distance behind, with a 226 RD deficit.

If you had taken this into consideration at the beginning, you would have realised that you actually need to build 21 RDs per month, which after 2 years would put you at over 500 RDs, ahead of the competitor.

Here is what that looks like visually; you can see how the gap barely closes if you don’t have this increased velocity.
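The catch-up arithmetic can be sketched in a few lines. The figures match the example above: a competitor at 250 RDs who keeps building roughly 10 RDs per month, over a 24-month window:

```python
import math

def required_monthly_rds(competitor_rds, competitor_rate, months):
    """RDs you must build per month to match a competitor who keeps building."""
    moving_target = competitor_rds + competitor_rate * months
    return math.ceil(moving_target / months)

naive = math.ceil(250 / 24)                    # 11: ignores their ongoing growth
realistic = required_monthly_rds(250, 10, 24)  # 21: aims at the moving target
```

The naive plan chases where the competitor is today; the realistic one chases where they will be when you get there.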

This is a very simplistic view, as you don’t always need to match exactly what a competitor has; on the flip side, you sometimes have to surpass it.

A weakness of this method for link velocity is that you unfortunately cannot apply filters, such as excluding sites under DR10. The data would be much more useful if you could, as quite often over a third of a site’s RDs are either spam or not very powerful in the first place. This percentage varies massively though, so applying a blanket 30% discount would be a bad solution.

Other Things To Consider When Looking At Backlinks

Domain Relevance

If your site is niched down to one topic area and you are competing against more general sites, there is a good chance you will be able to enter the top 10 rankings with a less powerful domain. The reason is that all of the articles on your site are supporting pieces; you don’t suffer from relevance dilution.

What do I mean by relevance dilution? If we look at the search results for the keyword “best binoculars” you will see the two lowest DR sites in the SERP are Optics Addict and Best Binoculars Reviews.

As their names suggest, their sites are entirely about binoculars and optics. 

When you compare this to the likes of The Wirecutter (which is now hosted on the NY Times’ main domain but I only used the data from the /wirecutter) you can see that a tiny proportion of The Wirecutter is about binoculars compared to these other sites. 

While inurl:binoculars isn’t perfect, it gives a quick overview. If you browse Optics Addict you will realise that while only 7.5% of their articles had binoculars in the URL, a lot of their other articles are still supporting, as they cover monoculars and gun optics, which are closely related to binoculars.
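A rough version of this check is easy to script if you have a list of URLs from a crawl or sitemap. The URLs below are invented purely for illustration:

```python
def keyword_url_share(urls, keyword):
    """Rough proxy for inurl: - the share of URLs containing the keyword."""
    hits = sum(1 for u in urls if keyword in u.lower())
    return hits / len(urls)

# Invented URL sample; real data would come from a site crawl or sitemap export.
urls = [
    "https://example-optics-site.com/best-binoculars-under-100",
    "https://example-optics-site.com/best-monoculars",
    "https://example-optics-site.com/red-dot-sights",
    "https://example-optics-site.com/spotting-scopes",
]
share = keyword_url_share(urls, "binoculars")
```

As the example shows, a low share (here 25%) can still understate topical support, since the other URLs cover closely related optics topics.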

With that being said, Google isn’t making it easy for very niched-down sites, as you can see from the SERP I used in the example.

All of the updates over the past few years have ultimately benefited authority sites the most, and smaller niche sites are getting bullied out of the SERP by these broad, powerful authority sites. When I was writing the original article 18 months ago this wasn’t the case; many SERPs you looked at would have these smaller, niched-down sites ranking from sheer topic coverage.

I personally like a middle ground between authority and how niched I go. You won’t find me registering sites like bestpowerdrills.com anymore, as I think it’s too narrow and limiting, but I might make a site targeting power tools only, or tools in general.

Link Relevance

This is the elephant in the room: none of this analysis takes the relevance of the links into account, and that is a big oversight. I personally don’t think relevance can be measured by UR or Topical Trust Flow in a meaningful way, so it needs to be assessed manually.

This isn’t something you want to overlook; lower-DR but more relevant links can really move the needle.

You can see how I determine link relevance in this article. 

Is this to say that you wasted your time reading this? 

No, the power aspect of link building is still very important otherwise you could just build a huge network of really low powered, highly relevant sites and dominate the SERP. 

Tier 2 Backlinks

When you get backlinks, the pages they come from are generally new and don’t have any power. The overall domain has power, and the page has a little from the internal linking on the domain, but that’s it. Think back to the analysis we just did: we broke down the power of the domain and the power of the page, and both need to be accounted for.

This is the same with your backlinks being built for your site. 

This can happen naturally. Take, for example, if this was your post on Business Insider: you write an article for Business Insider on how to take a screenshot on Windows, and it links to your website, which sells software codes for Windows.

If this was your post it would have significantly more power, as the post has received 91 RDs of its own, and it in turn links to yours.

While you won’t receive all the link equity from these links, as it takes another step to reach your website, you will receive a certain amount.

This is known as tier 2 link building. Tier 1 is sites linking directly to your website, tier 2 is sites linking to sites which link to you, and so on.

This occurs naturally and can massively increase the power of the backlinks being sent to your site.
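To build intuition for why a powered-up tier 1 page passes more, here is a toy damping model. The 0.85 figure is the classic PageRank damping factor, used purely as an illustration; how Google actually attenuates link equity across tiers is not public.

```python
def equity_after_hops(source_equity, hops, damping=0.85):
    """Toy model: link equity decays by a damping factor at every hop.
    0.85 is the textbook PageRank damping factor, used here only as an
    illustrative assumption, not as Google's real behaviour."""
    return source_equity * damping ** hops

tier1 = equity_after_hops(100, hops=1)  # a link straight to your site
tier2 = equity_after_hops(100, hops=2)  # one extra step away
```

Under any model like this, each extra tier bleeds off a fixed fraction of the equity, which is why links pointed at your tier 1 pages still benefit you, just less than direct links would.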

SEOs have noticed this and try to manipulate it by building their own tier 2 backlinks. The part that worries me is that most people are using PBN links and cheap niche edits for this.

Some SEOs go many tiers deep, and typically the spamminess of the links increases with each tier, because there’s an added layer of separation between their website and the links.

The way I see it, this is a bit of a balancing act: in the short term this 100% works and you can see great results from it, but I worry about the long-term impact.

Often people see tier 2 link building as a way to get some benefit from links they usually wouldn’t send to their site directly. But the way I think of it, these sites are holding up the rankings of your site; you wouldn’t want to sit on a chair and have someone kick out the chair legs one by one.

If their domain crashes and burns, what’s to say yours won’t, especially if you’re doing this for the majority of your links? I often see people using one PBN vendor for all their tier 2 links, and the reason for selecting the vendor usually isn’t the safety of the PBN; it’s usually a price decision.

It all comes down to your risk tolerance. I’m not going to stand on my high horse and tell you what to do, but we mostly build links for brands which are here for the long term. I can definitely see the appeal if you are just trying to flip an affiliate site, because it definitely works when done correctly.

A counter-argument is: why not just build high-quality links as your tier 2 links? I would argue that you’re then just diluting the power of your links, and you should build them at tier 1 instead.

Internal Linking

The reason I mention internal links, even though I said I wasn’t going to cover on-page, is that a super powerful page internally linking to the page you are trying to move up the SERPs can pass a lot of power.

This is why sites like Gear Hungry have so many internal links from the homepage: the homepage is generally the most powerful page of the site, so linking from it spreads the link juice throughout the site efficiently.

A great example of this is The Future Gamer campaign. It’s a digital PR campaign that got some great traction.

These are the types of links you would be over the moon with if they went to your website. While it’s not clear from the Ahrefs RD chart above, the links started coming in on the 7th of April, and if you look at where these backlinks point, you will see they are all directed at the URL of The Future Gamer campaign.

So how do you get this new found link equity to be passed around your site effectively? Well in this case they linked to their homepage using the anchor “OnlineCasino.ca” as well as one internal page using the anchor “online gaming.”

And it worked. If you look at the campaign page itself, its organic traffic and rankings are pretty lackluster.

But when you look at the overall domain growth it’s pretty incredible.

Especially when you take into account that this is the online casino niche, and the movement includes a lot of money keywords.

In this case you can tell because the ranking page is the homepage, so the effect shows up when you look at the overall domain. But if the online casino page were an internal page and you didn’t take this internal linking into consideration, you would be ignoring the huge amount of link equity coming from these editorial sites.

I will often use a tool like Screaming Frog or Netpeak Spider to find internal links when I am doing more in-depth analysis, but for this style of analysis I will usually just use Ahrefs. Just make sure Ahrefs is set to the specific URL you want to analyse, then select internal links from the left-hand sidebar.

The link equity benefit also applies to the backlinks you are building. If you can get your backlinks internally linked from other powerful pages, they will pack more punch: the link equity from the external links pointing at those pages passes through the internal links to your article, and then on to your site. It can be hard to negotiate this, but it can be well worth your time. I don’t usually look at this during the initial analysis I’ve outlined; if we are working on a project and can’t figure out for the life of us why we aren’t ranking, then I would look into factors like this.
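If you ever want to script the internal/external link split yourself rather than using a crawler, a minimal sketch with Python’s standard library looks like this (toy HTML and a hypothetical domain, not real data):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect every href from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def split_internal_external(html, site_netloc):
    """Separate a page's links into internal and external lists."""
    collector = LinkCollector()
    collector.feed(html)
    internal, external = [], []
    for href in collector.hrefs:
        netloc = urlparse(href).netloc
        # Relative links and same-host links count as internal.
        (internal if netloc in ("", site_netloc) else external).append(href)
    return internal, external

page = '<a href="/best-water-filter">guide</a> <a href="https://example.org/study">source</a>'
internal, external = split_internal_external(page, "example.com")
```

Run over every page of a crawl, this gives you a quick map of which pages funnel internal links to your money pages; for anything serious, the crawlers mentioned above remain the practical choice.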

A Million Other Factors

This really depends on what camp you fall into, but there are so many factors at play that you can get stuck analysing forever, and ultimately no analysis is ever going to account for all the nuances.

Even if it did, the playing field is always changing with Google’s updates so by the time you get all the data together the algorithm will have likely changed.

The list of what people think makes a link a good link is endless: some say traffic, others referral traffic, some social shares, some tiered links.

You get the picture. 

All of these things probably do play a role in some way or another.

The beauty of this guide is you can take what is useful, reject what is useless and add what you think is missing. I’m not telling you to do it this way, I just wanted to show you what has been working for me and open a conversation.

So if you have anything to add to the discussion please leave a reply to this article, I try to get back to all of them.
