What To Do When Links Aren’t Moving The Needle

Last updated: December 6, 2021

Maybe this is a situation you’ve found yourself in: you’re building links like all the SEO experts recommend, but they’re just not having a visible effect on your rankings. 

It’s a pretty frustrating situation to find yourself in: you’re investing more into the site but not seeing any positive returns. It can be hard to keep the faith, but we all know that SEO isn’t the short-term game it used to be; often the actions we take now won’t affect the website until months down the line. 

But if you find yourself in this situation, there are some things you can do to give yourself the best chance of ranking and to make sure you’re not burning your money on links. 

We have to remember that while the signals passed from links are certainly a heavily weighted factor in Google’s algorithm, there are many other important factors that need to be taken into account: on-page SEO, topical coverage and authority, technical SEO, user metrics and more. 

I’ve always believed that, for the most part, your website is only as strong as its weakest link. If your website scores 10/10 on on-page and link profile but only 7/10 on technical, it will only perform like the 7/10. So your main focus should be on raising the overall quality of the website together rather than pouring all your effort into one part of the equation. 

One thing to bear in mind is that SERPs are treated on a SERP-by-SERP basis: the technical SEO of a web hosting SERP is dramatically different from that of a local services SERP. So you need to take this into consideration when scoring your website. Look at what the top competitors are scoring rather than relying on an arbitrary score recommended for all websites. 

This is something that Daniel Cuttridge and I spoke about when the Core Web Vitals update rolled around. While most SERPs seemed unaffected by the update overall, there were some SERPs where the factor appeared to carry more weight. 


Note: This is something I’ve realised more over the past 18 months or so. Some of our clients are so in tune with their niche, because they spend all day in it, that their observations are super valuable when it comes to dealing with their website, more so than generic advice. 

For example, with one of our clients we are way more aggressive with exact-match anchors because we both noticed all of their competitors were doing so with great success. When we started to replicate this, the results skyrocketed. If we did the same in other niches, I would expect the exact opposite. 

I think we will notice over the coming years that SEOs specialising in a smaller number of niches will outperform those spreading themselves across a lot of niches. A lot of the general advice I see in update reports contradicts what we’re seeing in specific niches; we spend a lot of time in greyer niches like casino and cannabis, which might be why. But I feel we would be a step behind if we just followed the general advice being put out there.

So what should you look at when links don’t seem to be moving the needle? 

It goes without saying that you need to make sure you’re building the right types of links. If this isn’t something you’ve been doing for a while, it can be quite a daunting task in itself. 

I would recommend starting with this article I wrote, which covers how third-party metrics and traffic are manipulated just to sell guest posts, and why these metrics alone can’t determine the value of a link. 

My personal favourite way is to evaluate the A.R.T. of the links. 

  • Authority
  • Relevance
  • Trust

Authority

Authority is essentially how much power the link carries in the eyes of Google. Since Google hasn’t publicly updated PageRank in a long time, we personally use third-party metrics to analyse this, such as Ahrefs’ Domain Rating and Moz’s Domain Authority. But you need to understand how these metrics work and the ways in which they can be manipulated, as I mentioned above. 

Relevance

This essentially boils down to how relevant the domain, and the individual page you’re receiving the link from, are to your page. The closer both are to the topic of the page being linked to, the stronger the link’s relevance. 

So, for example, a domain and page that are all about casinos linking to a casino review page is more relevant than a technology domain linking from a page about the technology that runs online casinos. But that technology link is still far more relevant than a link from a local butcher’s website with a niche insertion on their homepage pointing to the casino review page. 

While this is subjective, I think common sense applies: does it make sense for the link to be there? In the second example of the technology domain, while it might not be perfect, there are only so many domain-relevant links you can get before you have to move on to URL-relevant links (links where the relevance comes from the page level as opposed to the domain).

Trust

This is another subjective measure, as there’s no definitive way to determine the trust of a domain. But the way I “measure” trust is by looking at the backlink profile of the domain: does it have referring domains from authority sites that Google might have on its internal seed list? These would be large, trustworthy websites with a reputation for providing high-quality information over a long period of time, so links from websites such as Wikipedia, Healthline, The New York Times, TechRadar, the BBC and sites like these. 
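If you want to make this check a little more repeatable, here’s a minimal sketch in Python. It assumes an Ahrefs-style “Referring domains” CSV export for the prospect site (the filename, column name and encoding are assumptions about your export), and the seed list is just a hand-picked set of big, trusted sites, not Google’s actual seed list.

```python
import csv

# Hand-picked list of large, trusted sites to look for (illustrative only).
SEED_LIST = {
    "wikipedia.org", "healthline.com", "nytimes.com",
    "techradar.com", "bbc.co.uk", "bbc.com",
}

# Hypothetical Ahrefs-style export; adjust the filename, encoding and
# delimiter to match whatever your tool actually produces.
with open("referring-domains.csv", newline="", encoding="utf-8") as f:
    referring = {row["Referring domain"].strip().lower() for row in csv.DictReader(f)}

overlap = referring & SEED_LIST
print(f"Trusted domains linking in: {len(overlap)}")
for domain in sorted(overlap):
    print(" -", domain)
```

It’s a blunt instrument, but zero overlap with a list like this is at least a prompt to dig deeper before paying for a link.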

The other thing I like to look at is whether the website has a brand presence; this can be through organic search or even through social. Is there a good chance the website is receiving a decent amount of traffic through direct sources, rather than just pure SEO? If you look at the websites I mentioned above, they all have huge brand presences, and I don’t think that’s a coincidence. 

Are The Long Tails Moving?

This is something we see very often when speaking with other SEOs. They might be building links to a page whose main keyword is hyper-competitive, like “buy CBD oil”, and this is their only measuring stick for the success of their link building. They only track these bigger keywords. 

When you’re going after hyper-competitive keywords, you might not see progress on the main keyword for months at a time because the number of links required to move the needle can be so large. 

Take, for example, the current SERP for “buy CBD oil”. All of these domains have serious domain power, with the lowest DR in the SERP being 69, and for the most part they also have a lot of referring domains pointing to the ranking page (with the exception of EcoWatch, which only has 24). 

If we look at the ranking inner pages, excluding the homepages that are ranking, there are an average of 750 referring domains pointing to them. I’m sure a good number of these aren’t high-quality links and include web 2.0s, but if you’re a new site building 5 links per month to your inner page, that’s 150 months of link building just to reach that average, so it will take some time to see significant movement, and this doesn’t even take into account the overall domain strength of these sites. 

So I think it would be wise to start tracking more keyword variations, particularly longer-tail ones. There are a few ways to do this: you can use your intuition by looking at the page, but personally I prefer to look at Ahrefs and Google Search Console and see which keywords they’re picking up for the individual page you want to track. 

To do this in Google Search Console, I just use the page filter option, adding in the URL of the page and selecting lower-competition but still relevant keywords. 

Looking at Google Search Console, these might be some good keyword options to track. They’re much smaller than the main keyword “buy CBD oil”, but if you start seeing positive movement for “smoke CBD” or “best CBD products”, there’s a good chance you’re on the right track to see movement for the bigger keyword too. 
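If you’d rather pull these queries programmatically, here’s a rough sketch using the Search Console API via google-api-python-client. The property, page URL, date range and credentials path are placeholders, and you’d still pick which of the returned keywords to track by hand.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Hypothetical service account JSON with read access to the Search Console property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # your verified property
    body={
        "startDate": "2021-09-01",
        "endDate": "2021-12-01",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": "https://example.com/buy-cbd-oil/",  # the page you want to track
            }]
        }],
        "rowLimit": 100,
    },
).execute()

# Sort by impressions so the longer-tail queries with real demand surface first.
for row in sorted(response.get("rows", []), key=lambda r: r["impressions"], reverse=True):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))
```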

To do this in Ahrefs, I would just open the page in Site Explorer with the selector set to the exact URL you’re looking at.

Then look through the organic keywords for less competitive ones you can track the progress of. Again, if the smaller keywords are moving in the right direction, there’s a good chance the bigger ones will come with time. 

Are You A Topical Authority?

When you look at your competitors’ sites, do they cover the entire buyer’s journey in more depth than you? Do they cover the full funnel, from top to bottom, better than your website? 

Content and topical coverage are definitely important parts of ranking in Google nowadays. There is some weird stuff in the SERPs where very powerful domains such as news websites rank without any real supporting content, but if you’re not one of these powerful domains that probably doesn’t apply to you, and topical coverage is likely very important. 

One of my favourite ways to assess a website’s topical coverage is to look at its top pages in Ahrefs and, if the site is small and manageable enough to digest, its sitemap. 

I like to use Ahrefs’ Top Pages report as it gives you an idea of whether the pages are generating substantial traffic.

I will literally go down this list, checking whether my website covers each topic and, if it doesn’t, marking whether it should. 

The reason I also use the sitemap is that just because a page doesn’t receive traffic (according to Ahrefs) doesn’t mean it shouldn’t be covered. Sometimes a page might not target any search volume, but for the topic to be completely covered it should still be included. 

So once I have done this, I will export the sitemap using Steve Toth’s sitemap trick and remove the URLs I have already checked from Ahrefs. Sometimes you’re left with a lot of URLs; other times Ahrefs has picked up the majority. 
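If you prefer to script this step rather than doing it in a spreadsheet, here’s a minimal sketch. The sitemap URL, the export filename and the “URL” column name are assumptions about how your Ahrefs export is laid out; it also assumes a single sitemap file rather than a sitemap index.

```python
import csv
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://competitor.com/sitemap.xml"  # the competitor's sitemap
AHREFS_EXPORT = "top-pages.csv"                      # hypothetical Top Pages export

# Pull every <loc> entry out of the sitemap.
sitemap_xml = requests.get(SITEMAP_URL, timeout=30).text
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {
    loc.text.strip()
    for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", ns)
}

# URLs already reviewed via the Ahrefs export.
with open(AHREFS_EXPORT, newline="", encoding="utf-8") as f:
    checked_urls = {row["URL"].strip() for row in csv.DictReader(f)}

# Whatever is left in the sitemap still needs a manual "have I covered this?" pass.
for url in sorted(sitemap_urls - checked_urls):
    print(url)
```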

I will then do the same again: work my way through the URLs, asking myself whether I’ve covered this topic and, if the answer is no, whether I should have.

If you find yourself with a lot of topics you should have covered but haven’t, then maybe topical coverage is what’s holding your site back. 

How Are Your User Engagement Metrics?

User engagement metrics such as bounce rate and time on page can be good indicators of the quality of your page. There will certainly be some pages where time on page isn’t going to be much due to the nature of the page; for example, if you have a page that just answers how tall the Eiffel Tower is, and that’s all it answers, you can’t expect people to remain on the page for long. 

But if you have an in-depth guide that’s several thousand words long and the time on page is under a minute, you probably have a problem. 

It’s hard to set an exact guideline, but consider the intent of the page and the number of words it has. How long would you expect the user to be on this page? Does the actual figure seem alarmingly low? 
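As a rough way of flagging candidates, here’s a small sketch that compares an expected read time against the actual time on page. The CSV layout, column names, the 200-words-per-minute reading speed and the thresholds are all assumptions; tune them to your own data.

```python
import csv

# Hypothetical export with one row per page:
# url, word_count, avg_time_on_page_seconds, bounce_rate (0-1).
READING_SPEED_WPM = 200  # rough assumption

with open("engagement.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        expected_seconds = int(row["word_count"]) / READING_SPEED_WPM * 60
        actual_seconds = float(row["avg_time_on_page_seconds"])
        # Flag long pages where visitors stay under a quarter of the expected read time.
        if expected_seconds > 300 and actual_seconds < expected_seconds * 0.25:
            print(
                f"{row['url']}: expected ~{expected_seconds:.0f}s, "
                f"actual {actual_seconds:.0f}s, bounce {float(row['bounce_rate']):.0%}"
            )
```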

I’ve seen first-hand how bad user metrics can ruin a site over time. I had a site we were using for content testing with a very cheap content method. Initially it was great, but over time the poor-quality content caught up with the site. This can be a slow killer, as machine learning gradually recognises that the content isn’t rewarding the user’s search because people are bouncing off the page and not spending time on it. 

If you find yourself looking at a lot of pages with high bounce rates and low time on page, then maybe this is what’s holding your site back. It would probably be worth auditing your content strategy and looking at your writing and editing process, because end users aren’t finding the content valuable. 

Is Your On-Page SEO Good Enough?

It goes without saying that on-page SEO is still important. 

Are you structuring your pages in a way that lets both Google and the user tell exactly what the page is about? 

Make sure you’re on top of things like your:

  • H tags
  • Title
  • Schema
  • Content optimisation (e.g. true density / keyword density)
  • Internal linking
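For a quick spot check of those basics on a single page, here’s a minimal sketch using requests and BeautifulSoup. The URL is hypothetical, and it only checks that the elements exist, not whether they’re any good.

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/casino-review/"  # hypothetical page to check
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else None
h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
canonical = soup.find("link", rel="canonical")
schema_blocks = soup.find_all("script", type="application/ld+json")

print("Title:", title or "MISSING")
print("H1 count:", len(h1s), h1s)
print("Canonical:", canonical["href"] if canonical else "MISSING")
print("Schema blocks:", len(schema_blocks))
```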


I am a big believer in getting the basics of on-page right, applying them across all your pages and ensuring you stay on top of it. 

Don’t get me wrong, there are people out there like Daniel Cuttridge who will be able to get a lot more out of on-page than I ever will, but I think the basics go a long way. 

Is Your Technical SEO Up To Scratch?

This is probably one of the harder areas to write up, as there are so many things that could be going wrong on a technical level. 


Some issues can really slip under the radar, such as hosting problems and complications. Daniel Cuttridge identified that SiteGround, a large host many SEOs use, had been blocking Googlebot on Gary Wilson’s and Michael Landau-Spiers’ site; this is a very rare issue but had dire consequences for their site. Having spoken with both of them, I know Daniel was the only person who picked this up, and once it was resolved the site started flying. 


With technical SEO being such a broad topic, there are a few areas I would initially consider, though this is by no means exhaustive: 

  • Server performance
  • Indexation
  • Page speed
  • Page issues (i.e. things that can be picked up with an auditor)
  • Security issues
  • Website structure

Server Performance

Honestly, this is something that goes way over my head, but it would cover the type of issue I mentioned above with Gary and Michael’s site. As with most things, it’s best to know your strengths and weaknesses and reach out to people who are better than you on the topics that aren’t your strong points. 

I’ll be honest, this is an area not many people really seem to talk about. I’m lucky enough to be good friends with Daniel and could reach out to him if I had any inkling that I had a server issue, but I realise this is a unique position to be in, as he doesn’t really offer a service that caters to this. I would recommend checking out his guide to the best web hosting for SEOs to get a basic understanding of what to look for in your hosting performance. In it he discusses some basic checks, such as looking at Google Search Console’s Crawl Stats report, which can identify some of the issues you might be encountering. 

Indexation

I believe that if your site is losing pages from the index, or new pages are taking significantly longer to get indexed, that’s a cause for concern. 

I’m not sure everyone will agree with me on this, but I’ve seen it enough times: a site is performing well, indexing all of its content within 24 hours of posting, then suddenly indexation starts to slow, taking longer and longer to get new pages into the index, and then the overall site performance takes a harsh dip. 

I’m not sure exactly what causes this, but it’s something I believe should be monitored. That has been made harder by the indexation issues Google has been having for probably the past two years now, but for the most part, if you see pages dropping out of Google’s index and indexation slowing down, I think it’s a cause for concern, and it could be down to a multitude of factors.  
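One low-tech way to monitor this is to keep a simple log of when each page was published and when you first saw it indexed (for example via Search Console’s coverage report), then watch the average lag. A minimal sketch with made-up dates:

```python
from datetime import date
from statistics import mean

# Hypothetical log: publish date and first-seen-indexed date per URL.
log = [
    {"url": "/post-a/", "published": date(2021, 9, 1),   "indexed": date(2021, 9, 2)},
    {"url": "/post-b/", "published": date(2021, 10, 5),  "indexed": date(2021, 10, 9)},
    {"url": "/post-c/", "published": date(2021, 11, 2),  "indexed": date(2021, 11, 14)},
    {"url": "/post-d/", "published": date(2021, 11, 20), "indexed": None},  # still not indexed
]

lags = [(row["indexed"] - row["published"]).days for row in log if row["indexed"]]
pending = [row["url"] for row in log if row["indexed"] is None]

print(f"Average days to index: {mean(lags):.1f}")
print(f"Still waiting: {pending}")
# If the average keeps creeping up month on month, treat it as an early warning sign.
```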

Page Speed

Again, I really don’t think page speed is the be-all and end-all, but I do think you should at least be at the standard of your competitors and, if possible, a bit ahead of them. One of my favourite ways to check is to take some of my main pages’ keywords and look at the average for the SERP. 

If your site spans multiple niches, it’s important to check multiple keywords. For example, one of our client sites covers a broad range of affiliate topics: cryptocurrency, web hosting and agency software. In this case the page speed benchmark for each subtopic might be different, so if you only checked one, you might be underperforming compared to the other two subject areas. 

When it comes to checking page speed, I like to use this tool by Reddico.

All you do is enter your country, keyword and your page URL and it compares your page to the SERP. 

If you score significantly worse than your top competitors, I would take the time to improve this so you’re at least above the average. But as you can see from the rank vs. speed score above, in this SERP there is by no means a correlation between page speed scores and rankings; there are plenty of other, more heavily weighted factors at play. 
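If you want to run the same kind of comparison yourself rather than through the Reddico tool, here’s a rough sketch against Google’s PageSpeed Insights API. The competitor URLs are placeholders, and for regular use you’d want to add an API key to the request parameters.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

# Hypothetical set of pages: yours plus the competitors ranking for the same keyword.
pages = {
    "you":          "https://example.com/buy-cbd-oil/",
    "competitor-1": "https://competitor-one.com/cbd-oil/",
    "competitor-2": "https://competitor-two.com/shop/cbd-oil/",
}

for label, url in pages.items():
    data = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": "mobile"},
        timeout=120,
    ).json()
    # Lighthouse performance score comes back as 0-1; scale it to 0-100.
    score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
    print(f"{label}: {score:.0f}/100")
```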

Page Issues

When I talk about page issues, I mean the types of things that could typically be picked up by an auditing tool like Sitebulb, Screaming Frog or Ahrefs’ Site Audit. Common issues include 404 pages, broken internal links, canonical issues, missing titles and H1s, and so on. 

If there are a lot of issues like these, there’s probably a lack of overall care going into the site, and while some are more important than others, they can have a large effect. For example, when working in the US CBD niche we found one of our clients had canonicals pointing to pages that were 404ing rather than self-referencing canonicals. When we fixed this error, the main keywords jumped around 20 positions, from page 4 up to the middle of page 2. Considering how competitive these terms were, that was a huge improvement for such a simple change. 
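A quick way to catch that particular class of issue is to spot-check your key pages’ canonicals and make sure each one points at itself and resolves with a 200. A minimal sketch, with hypothetical URLs:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical list of key pages to spot-check.
urls = [
    "https://example.com/cbd-gummies/",
    "https://example.com/cbd-oil/",
]

for url in urls:
    resp = requests.get(url, timeout=30)
    canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    if not canonical:
        print(f"{url}: no canonical tag")
        continue
    target = canonical["href"]
    # A canonical that isn't self-referencing, or that 404s/redirects, is worth a look.
    target_status = requests.get(target, timeout=30, allow_redirects=False).status_code
    flag = "" if (target == url and target_status == 200) else "  <-- check this"
    print(f"{url} -> {target} ({target_status}){flag}")
```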

Security Issues

This is something I have again seen go unnoticed. If your site is riddled with security issues, there’s a good chance Google is going to hold it back; why would it want to send its users through to potentially harmful pages? 

On one of my own sites, my partner on the project noticed that random profanities were being inserted into the content. This was likely due to a server-level issue, but either way there was a security problem with the site and we felt it could be holding it back. 

If you have pages being created and indexed that you didn’t make, or things are changing on your site without you doing so, then there’s an issue that needs to be fixed. Just because there are no warnings about these security issues doesn’t mean Google isn’t aware of them and hampering your SEO efforts as a result. It’s hardly the greatest trust signal.

Website Structure

Is your website set up in a way that makes sense both for users and for Google’s crawling? Inefficient structures can hold back a seemingly good site because clearly defined topical structures can’t be seen by Google. It may look like you lack supporting content when in reality you lack the structure to make it clear to Google. 

Take the time to look at your categories and internal linking and compare them to what the winning sites are doing. That said, it’s always best to test this yourself and find what performs best. I know it’s easy to say that about everything, and it can be very time-consuming, but at the end of the day, following generic advice instead of making specific, data-driven choices will hold you back in my opinion. 

Are You Outpacing The Competitors? 

Sometimes the problem is that you’re just never going to outrank the competition at your current pace. If you’re building 5 links per month while your competitors are building 20, you’re effectively moving backwards rather than forwards. 

SEO is always a moving goalpost; just because your competitor had 200 referring domains when you did your analysis six months ago doesn’t mean that’s still the case now. You always need to be aware of what competitors are doing. With some of our campaigns we increase link velocity by up to 300% in the short term to stimulate growth, and if you’re not paying attention, a big gap can appear without you realising. 
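To put some rough numbers on how quickly that gap opens up (all figures hypothetical):

```python
# Starting referring domains and monthly link velocity for you and a competitor.
your_links, their_links = 100, 200
your_velocity, their_velocity = 5, 20

for month in range(0, 13, 3):
    gap = (their_links + their_velocity * month) - (your_links + your_velocity * month)
    print(f"Month {month:2d}: gap of {gap} referring domains")
# The gap grows from 100 to 280 referring domains in a year, even though
# you're "building links" the whole time.
```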

This isn’t to say that one link is always equal to one link; you might be able to rank with fewer links, especially if you’re outperforming in a lot of other areas, but it’s unlikely you’ll outrank a site with 2,000 high-quality referring domains with a site that only has 100. Overall domain and page-level strength is too powerful a ranking factor, at least in the SERPs we’re currently active in. 

Are You Being Patient Enough?

This is one people never really like to hear, but when you’re working on a project day in, day out, it can feel like it’s taking forever to see significant improvement. Yet if you look at the top results in the SERPs you’re competing for, you’ll often see that the sites in the top positions have been working at this for a minimum of two years, often longer. 

SEO is the result of cumulative signals over time; you can’t force things to happen quickly. In some niches this is even more true. We have noticed that Your Money Your Life (YMYL) niches seem to have even longer “sandbox periods”, probably because it takes longer to build up the trust signals required in niches that can have a direct impact on the quality of the reader’s life. 

The Bottom Line

There are a lot of things that go into SEO, and while links are an important part of it, if your links aren’t moving the needle there are a lot of areas you need to assess to work out why your site isn’t progressing. 

Unfortunately, it’s very easy to get lost trying to find the cause of the lack of movement, and with so many factors involved it’s not always easy to pinpoint. If you go through this guide and spot a clear issue, be thankful: while it might be a pain to fix, at least you have a clear issue to work on rather than no clue at all about what’s causing the lack of progress. 

Hopefully this article has helped you assess what might be wrong with your site if links just aren’t moving the needle.
