Imagine a situation. You want to get your site into a SERP for a query that will likely bring you clicks and profit. But then you look at this SERP and see only industry moguls there:
Needless to say, these powerhouse websites are very hard to outrank, especially if your own site is new and relatively small.
So, here I am to tell you: don’t give up. There is a way out, and it is not paid promotion.
I’m talking about the underdog SEO strategy.
Underdog SEO is an optimization strategy that lets smaller websites outrank strong competitors and get their pages featured in highly competitive SERPs without spending a fortune on paid advertising.
In underdog SEO, the question is not how to outrank everyone but whom to outrank. You need to find the weakest page in the SERP you’re interested in, dig into that competitor’s SEO, and then make your own page better.
So, let’s get down to business.
First of all, you need to carefully study the SERP you want to get into. This will help you better understand your competitors.
Most likely, you already know that they are powerful, but that is not enough. Simply googling your keyword will not give you any insights either. So, to get more important details, I suggest using Rank Tracker and its SERP Analysis feature in particular. And that is what I am going to do in my example.
1) Launch Rank Tracker (or download the tool if you haven’t used it before) and create a project for your website.
2) Go to the SERP Analysis module, choose a preferred search engine, and enter the keyword in question. In my case, I am going to analyze the SERP for lotion, which seems to be a highly competitive keyword:
3) Hit the Analyze SERP button and look at the results:
4) Here, pay attention not to the top-ranking competitor but to the correlation between ranking factors and SERP positions. To investigate this, click the drop-down menu near the blue graph to see what factors have the highest correlation level:
5) In the perfect scenario, you would focus on a factor showing a moderate or high correlation level. But SEO is unpredictable, and you may well see what I saw: all the ranking factors showing a low level of correlation with the SERP position. In this case, pay attention to the Domain InLink Rank metric.
6) Now look at the list of the top 10 competitors. If you see any factors with moderate or high correlation, focus on the competitor who has the lowest value of that high-correlation factor. If there’s low correlation everywhere, as it is in my case, find the competitor with the lowest Domain InLink Rank:
As you can see, this SERP has two websites that are obviously weaker than the rest. As they are almost equal in their “weakness”, I will proceed with the highest-ranking one of these two, https://plumandashby.co.uk.
Why focus on the “worst” competitor, you may ask? The answer is simple: if a relatively weak website, an underdog, can get into a highly competitive SERP alongside Amazon, then you can do this, too. And outranking that weaker site will be much easier.
Now that the "victim" is found, let's move on.
To better understand the strengths and weaknesses of that underdog website you’re going to outrank, you need to see how other pages of this site perform in search. Do any of them rank high as well? Do they win any SERP features?
To figure this out, you will need to find all the keywords this site’s URLs rank for. Rank Tracker’s Ranking Keywords module will help us here.
1) In Rank Tracker, go to Keyword Research > Ranking Keywords and enter the domain under consideration. Specify search engine location if necessary (recommended in most cases). Hit Search.
2) Look at the results, paying attention to the pages ranking in the top 10:
3) Look carefully at the # of Searches column.
Keywords with 0 searches do not require your attention, so you can omit them in your research.
4) Now focus on the Rank column:
Look at the icons in it — they show what SERP features the page wins. Depending on the feature, the page is likely to have a relevant Schema Markup that helps Google understand content better.
5) It’s a good idea to visit the top-ranking pages and investigate them manually, paying attention to the type of page and its content.
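For reference, SERP features usually map to structured data on the page itself. A review-snippet feature, for example, typically comes from Product markup with an AggregateRating, along these lines (all values here are hypothetical, not taken from any real page):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hand & Body Lotion",
  "description": "A nourishing lotion for everyday use.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "127"
  }
}
```

If a competitor wins a SERP feature, viewing the page source and searching for `application/ld+json` is a quick way to see exactly which markup earned it.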
Having super-relevant backlinks may be the reason why a page from a relatively weak website has managed to get into a highly competitive SERP. That’s why backlink profile investigation is the next step of this underdog SEO strategy.
To quickly find all the necessary data on a website’s backlinks, we’ll need a powerful backlink analysis tool. SEO SpyGlass will be a good solution.
1) Run the tool and create a project for the website you’re investigating.
2) Go to Backlink Profile > Backlinks to see all the backlinks the site has.
3) Filter the backlinks to see only dofollow ones (nofollow links do not pass PageRank). Then, look at the domains linking to the website. Pay attention to those with the highest Domain InLink Rank:
4) In the Quick Filter bar, enter the URL of the page you’re investigating, i.e. that weak one that got into the highly competitive SERP. Look carefully at the domains linking to it.
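If you want to spot-check a linking page by hand, the dofollow/nofollow distinction is just a `rel` attribute on the link. Here is a minimal, stdlib-only sketch of the check the tool automates (the sample HTML is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects (href, is_nofollow) pairs from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel is a space-separated token list, e.g. "nofollow sponsored"
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel))

# Hypothetical page fragment with one nofollow and one dofollow link
html = """
<a href="https://example.com/a" rel="nofollow">sponsored</a>
<a href="https://example.com/b">editorial link</a>
"""
parser = LinkCollector()
parser.feed(html)
dofollow = [href for href, nofollow in parser.links if not nofollow]
```

This only mirrors the filtering step; a real backlink tool also crawls the linking pages and scores the linking domains for you.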
Now the tricky part. It may happen that the page under consideration does not have any backlinks at all, just as it is in my case:
If you see something like this too, then it is not backlinks that help the page rank. Nevertheless, you will not lose anything by building some links for the page you’re trying to get into the SERP. So reaching out to the most powerful backlink providers of that competitor is a good idea anyway.
E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) signals are the criteria Google uses to estimate the quality and relevance of a website in a specific niche. E-E-A-T is signaled by the quality of the content, the proven expertise of the author, and, of course, by links and linkless mentions from trusted sources.
As you can see, E-E-A-T signals are closely tied to backlinks. So it pays to carefully investigate each domain that gives a dofollow link to the underdog site you're exploring and planning to outrank.
Pay attention to linking sites themselves — if they are famous enough, big, niche-relevant, etc. In my example, I’ve found a backlink from The Telegraph, a top news website in the UK. As the site I am studying has a UK domain, I can assume that a backlink from The Telegraph matters a lot:
If the linking domain does not seem particularly famous, but the linking page is written by an identified author, pay attention to that person. Find out whether they are a professional and in what area. Additionally, check if the author has taken part in any conferences, interviews, or podcasts.
Sometimes linkless mentions also play a big role in the world of E-E-A-T, so try social listening tools like Awario to see who discusses the brand in social media, and what is the context of these discussions.
Nothing new here, actually. If your goal is to outrank a certain page in a SERP, then your content has to be better than that page’s content.
For starters, visit the page in question and look at its content carefully. Maybe you’ll see something your page lacks at a glance, like images, FAQ sections, specific information, etc.
In my example, there isn’t much to observe: the page I am exploring is a typical category page of an ecommerce website, and it looks just as it should. Nothing special:
So, here comes the next step of content analysis, which is more practical. It will require a powerful content audit tool. I’m going to use WebSite Auditor, and here’s what I (and you) will do.
1) Launch the tool and create a project for the website you’re going to analyze.
2) Enjoy a glass of wine while WebSite Auditor is collecting all the information about the website — it can take a while depending on how many URLs the site has.
3) Go to Page Audit > Content Audit, and enter the URL you’re going to outrank and the keyword in question (lotion in my example). Look at the content factors and how well they are optimized:
In my example, everything looks pretty good except for keyword stuffing in secondary headings. But, as you remember, this page is a category one with a bunch of goods listed on it.
So, in this case, the keyword stuffing “issue” is probably inevitable. To make sure that I am right, and not just lazy, I will investigate this factor and look at the other competitors in that SERP. I click that factor (it has a Warning status) and switch to the Competitors table:
After manually investigating the pages with the biggest Keyword count value (i.e. those with potential keyword stuffing issues), I saw that most of them were actually category pages. So, in this case, I just have to put up with it.
4) To get more insights into the page’s content, let’s move to the Content Editor section.
Here you can see a detailed list of keywords detected on the page, keyword suggestions, topics covered on the page, optimization rate score, etc.
5) To help your writers (in case you don’t write on your own) create better content based on what you’ve just found, you can download a PDF file with content recommendations for that page and pass it to your content team.
Once you’re ready with content for your own page, you can test it using Google NLP API. The tool will give you a better understanding of how Google sees your page and what it thinks it is about. Run the competitor’s page content through the NLP API, too, and compare the results.
Pay attention to the colored labels you see near the words:
The more colored entities Google identifies, the more meaningful your page is for search engines. If the gray OTHER label prevails, then consider adding more meaningful entities (words).
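The API itself requires credentials, so as an illustration here is a small hypothetical helper that works on a mocked entity list (plain dicts mirroring the shape of an entity-analysis response, with "name" and "type" fields) and computes what share of entities fell into the unclassified OTHER bucket:

```python
def other_entity_share(entities):
    """Return the fraction of entities Google could not classify (type OTHER)."""
    if not entities:
        return 0.0
    other = sum(1 for e in entities if e["type"] == "OTHER")
    return other / len(entities)

# Mocked response for an imaginary lotion category page
entities = [
    {"name": "lotion", "type": "CONSUMER_GOOD"},
    {"name": "Plum & Ashby", "type": "ORGANIZATION"},
    {"name": "skin", "type": "OTHER"},
    {"name": "UK", "type": "LOCATION"},
]
share = other_entity_share(entities)
```

If that share dominates for your page but not for the competitor’s, that is a hint to rework the copy around more concrete, recognizable entities.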
If your site is small and obviously lacks authority and backlinks, make sure it is flawless in terms of technical SEO. Here, you have to be not just better than the weakest competitor but the best of all.
To audit your website, use WebSite Auditor. Create a project and go to Site Structure > Site Audit:
Investigate each factor having an Error or Warning status. WebSite Auditor will show you the full list of affected pages and provide tips on how to solve any trouble with your SEO.
I’ve come up with this step while researching the page I’ve been working with. This page (https://plumandashby.co.uk/collections/hand-body-lotion) ranks second in Google but has a low Domain InLink Rank, low Page Strength, and no backlinks at all. The website is built on Shopify (nothing special code-wise), and some links don’t even work properly. So how did it manage to outrank a giant like Amazon?
The answer turned out to be simple — the origin of the domain helped a lot. My browser location is set to the UK, and the page is under the .co.uk domain.
So, I can see that Google favors local domains depending on the searcher’s location.
What’s the lesson? Keep in mind the region where you want to rank, and think about this at the stage of domain registration. If you’ve already missed that opportunity but still want to make a name for yourself in a specific location, consider doing some local SEO and building enough connections to local entities on your site.
Once you have discovered why that underdog page succeeded in the SERP full of giants, go and make your page a better one.
Update your content or create new content, build relevant backlinks, work on internal linking for better PageRank distribution within your site, enhance mobile optimization, take care of user experience (PageSpeed metrics), etc.
Remember that SEO is a long-term game, so do not expect your page to take a top position the day after you apply changes. What’s more, watch Google updates — they may (and do) cause SERP fluctuations and unexpected changes, so adapt your strategy to the ever-changing search environment.
Sometimes, you do not have to be the best to succeed. What you need is to be better than the worst among the best. This is much more realistic and much less time-consuming.
So, did you ever manage to get into a SERP full of Amazons and Wikipedias? Or maybe you even outranked them? Share your stories in our Facebook community.