2020 is not a fun year so far. On top of all the anxiety-inducing things going on in the world, it seems to be the year of Google updates.
We've seen two so far already — one major core update in January and a smaller one in February.
And Google claims again and again that the updates are par for the course, that they are doing them all the time.
So how do we make sense of all these updates? Are there consistent tactics to thrive in the ever-changing environment?
To help figure this out and not stress more than we need to, we asked a few bona fide SEO experts three questions:
- What are your steps immediately after the update's been announced?
- Have you noticed any correlations between Google core updates and top ranking factors?
- If you do see a noticeable drop in your client's rankings, what is your survival plan?
After each interview, I've put a little "key takeaways" section, highlighting the main points and providing links for further reading.
The head image for this round-up is also a little spoiler: the biggest advice we all can give in this trying time is: DON'T PANIC.
Nothing. Algorithm updates are like grizzly bears: stay still, play dead, and they will hopefully leave you alone. If you react and start making changes, they might end up biting your head off.
Google usually perfects the update over time, so reacting is not a good idea. Instead, focus on first principles. If you've been penalised, then there is a more fundamental strategy issue that you need to deal with — SEO tactics will not help you.
Aside from the usual links and content with the correct query intent… no. It's getting harder and harder to pinpoint what's happening with large algorithm changes. I did have a recent conversation with SEO savant Arnout Hellemans, and he has a theory that the training of machine learning in the SERPs is fundamentally changing SERP layouts as Google resets and retrains the ML over time.
On personal affiliate websites, the biggest swings I have seen are around fresh links. Entrenched websites of mine that have held rankings for a very long time on solid historic link indexes are starting to drop for head terms and be replaced by sites with smaller but fresher link indexes.
With backlink analysis, it's hard to tell what's happening, as we can really only analyse one step removed from our own websites. However, I think the way in which the link index is used is seriously changing. It's becoming an important initial signal (like a degree is when applying for a job), but once you are getting impressions, the value of links to a page starts to tail off.
Depending on the severity of the drop and the niche, we look to diagnose the issue, evaluate the data points that support our hypothesis, change strategy to be more long-term and less vulnerable to algorithmic movements, and then build out the SEO tactics to get us there.
Our key takeaways from Ross:
- Matching search intent is important regardless of the update.
- Building fresh links might be the decisive ranking factor right now.
- Links in general have become a very important initial signal.
Google adjusts its core search ranking algorithm to give searchers more relevant and useful results.
After the latest core updates, we focused on evaluating our existing content for quality and upgrading it with the latest statistics and strategies.
We upgraded many of our existing blog posts to ensure that they:
- Have a conversational tone
- Add value to readers and help solve their problems
- Include interrogative sentences to engage users better
- Provide data-packed insights and well-researched content
No, I didn't see any correlation with any top ranking factors other than content quality. To rank higher in search results, you just need to focus on enhancing content experiences.
Content quality and its relevance to search queries is the key factor of the latest Google core update. Google wants to provide valuable search results that address searchers' questions and give them satisfactory information and answers.
So, the best way to survive Google's core updates and boost search rankings is to upgrade all content to meet searchers' requirements.
For this, my team and I regularly conduct a SERP analysis to identify key factors that can help us satisfy user intent. Then, we implement those points during content upgrades. This helps us create content that Google understands well and finds valuable for their users.
Our key takeaways from Shane:
- Always prioritize optimizing your existing content.
- Make 100% sure search intent is matched by your pages.
After an update is announced, I typically do nothing at all! I try to only work with the whitest-hat of sites, which should have nothing to fear from an algorithm update. In the following days I might check Google Search Console to see if there's any meaningful change, but usually my clients will actually benefit from an algo update rather than get hurt.
I don't focus on rankings as much as I look at overall search impression visibility. I do see correlation between impressions and algorithm updates, and rankings are somewhat of a factor in that.
As I don't check rankings, my first signal to an issue would be a drop in impressions. If I see something like this shortly after an algorithm update, I start to dig in.
In my opinion, sites don't get impacted by algorithm updates — only pages. I therefore will deep dive into the data to understand where pages or queries might have dropped and where others picked up.
Our key takeaways from Eli:
- Stay as white hat as possible and you'll be okay.
- It's pages, not websites, that get impacted by an update. So focus on page optimization.
Every week, we check the hundreds of properties we monitor for any notable rises or falls in organic traffic. The goal is to pinpoint whether Google has made any significant algorithm changes that we should be analyzing further. The vast majority of sites we monitor are ones we have conducted thorough assessments for, which gives us plenty of data for determining potential correlations between sites that see notable traffic fluctuations on the same day.
When there is an announced Core Algorithm update, this is also where our analysis begins. Given the announcement, we have exact dates to work with, which helps us perform a more in-depth analysis. My personal strategy is to compare entrance data from Google Analytics for landing pages before and after the date (usually 1-2 weeks if possible) to assess the top 100 or so pages that have seen major changes in organic traffic.
I personally use entrance data of overall organic traffic. This is because internal traffic to a page after a user has arrived from organic search at, say, the homepage is going to show up as an organic visit in Analytics. Entrance data only shows landing-page data from organic search, helping me see which pages were most affected. I then export this data and collate it with GSC query data for the same time periods to see which queries saw the biggest changes.
This is the starting point for me. From there we are digging into the similarities between the sites/queries that dropped. We look at what kind of page dropped and whether something notable changed in the SERPs. The investigation takes approximately a day and is followed by a team brainstorm on our findings.
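The before/after landing-page comparison described above can be sketched with pandas. This is a minimal illustration rather than the team's actual tooling: the column names and inline sample figures are hypothetical stand-ins for a real Google Analytics entrance-data export, and collating with GSC query data would follow the same merge pattern.

```python
# Sketch: compare organic entrances per landing page before vs. after an
# update date. Sample data below is invented; a real workflow would load
# two exported CSVs covering 1-2 week windows around the update.
import pandas as pd

before = pd.DataFrame({
    "landing_page": ["/guide", "/blog/post-a", "/pricing"],
    "entrances": [1200, 800, 300],
})
after = pd.DataFrame({
    "landing_page": ["/guide", "/blog/post-a", "/pricing"],
    "entrances": [400, 820, 310],
})

# Join the two windows on the landing page URL.
merged = before.merge(after, on="landing_page", suffixes=("_before", "_after"))
merged["pct_change"] = (
    (merged["entrances_after"] - merged["entrances_before"])
    / merged["entrances_before"] * 100
).round(1)

# Sort by the biggest drops to surface the most affected pages first;
# a GSC query export could be merged in the same way for those pages.
top_changes = merged.sort_values("pct_change")
print(top_changes.head(100))
```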
Recently, we have been finding more and more that Google's Core Algorithm updates have been affecting sites with broad site quality issues, with the majority of our clients coming to us after these updates in need of recovery. It is harder to quantify these concerns into individual ranking factors, and there is rarely a 'smoking gun' to point to and say "Hey, fix this and you'll be back to normal!". Instead, it appears that with each new update, Google adds or changes something within the algorithm that improves the likelihood of surfacing high quality content. This is particularly true for YMYL queries.
Our main game plan for helping clients diagnose and plan recovery from core algorithm hits is to perform a deep assessment of the quality of their site. This includes a technical audit, a thin content audit, a page speed assessment, and a backlink overview to make sure everything is running smoothly — but the real meat of the review in these cases often comes from a qualitative assessment of the site's overall quality. We are looking for as many ways as possible to help create a site that Google would, in all honesty, be embarrassed not to be ranking.
We have had a lot of success in helping to recover sites after Core Algorithm Updates using this approach. With that said, there are cases where we discover that a site is of particularly low quality, and was likely ranking before the Core Update through clever, grey hat SEO tricks, instead of actually being a top notch site. In these cases, we will focus on helping the site to diversify its traffic sources, and optimising the site and content to rank for more realistic search queries.
That's the easy part. The first thing we do is start going through client data (past and present) to see if there's any movement. The last thing that you want is for a client to notice movement before you do. We're also looking to see which sites, if any, had movement (up or down).
But as a fastidious search geek who is always seeking to "future-proof" our SEO, I also want to see what might have happened, regardless of the effects. As such, I will dig through a large number of sites I have access to and/or track, to start to get a sense of what might be going on. This can be through Search Console, Analytics, or the various tools we use to track clients and competitors.
Well, we all know that correlation isn't always causation, so it should be taken with a grain of salt. The next problem tends to be that we have no idea what's been changed or how many elements have been affected. As such, I generally look at a broader strategic approach: have the gains/losses been affected by the overall SEO strategy? Remember, I track not only clients but also their competitors.
As such, I look for trends across a given market/query space. Are they content-heavy strategies? On-site focused strategies? Link building strategies? I look for characteristics of the SEO approaches, not a specific (so-called) 'ranking factor'.
Interestingly I don't often seem to have much in the way of losses with client projects. I guess I just don't play too close to the edge. That being said, I do a fair bit of 'forensic' (traffic loss) consulting work. In those situations I tend to use a process that's not far off from what I've written about previously.
It's the Sherlock Holmes approach; "Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth." One should never have a knee-jerk reaction. It's a process of elimination. Given that none of us have the keys to the black-box that is Google, we can never know what's going on definitively.
And the real trick is how one implements a recovery plan. Let's say you've identified 3-4 possible solutions. Well, you can't just bulk-fix things. The inherent problem is that if 2 of 4 have a positive effect and 2 of 4 have a negative effect, you end up with a net effect of 0. As such, we need to implement them one at a time… let each one percolate for a while (7 days tends to work) and then try the next one. Sure, clients hate that… they want it fixed NOW… but it's just the way it has to be.
First: breathe, don't panic. In most cases, it will take a while for the results to be visible. Sometimes — actually much more often in recent updates — there is no clear trend except for relevance. However, I would lie if I said I didn't check Google Analytics and Search Console for any massive changes.
Well, that's hard to say because I don't keep track of all ranking factors. I don't even know what all the ranking factors are :). However, I do notice that Google seems to turn some factors up and down with recent updates. I've noticed, for example, that links seemed to receive way less weight in one core update, and then it was returned to baseline in the next one.
Get super specific in the drop analysis. I look at desktop vs. mobile, query syntax, time, pages, log files, crawl errors, etc. It's important to understand what exactly is going on and not enough people do that. Once I know what's happening, I react accordingly. Sometimes, there's nothing you can do. We had a couple of updates in which we lost organic traffic. We did nothing. The traffic came back. Sometimes, Google just experiments.
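The kind of segmented drop analysis described above can be sketched in a few lines of pandas. Everything here is assumed for illustration: the update date, the device/session figures, and the idea that the export is already split by device. The point is simply to compare each segment before and after the update instead of looking at one aggregate number.

```python
# Sketch: segment organic sessions by device around an assumed update
# date to see which segment actually dropped. Sample data is invented.
import pandas as pd

visits = pd.DataFrame({
    "date": pd.to_datetime(
        ["2020-01-10", "2020-01-10", "2020-01-20", "2020-01-20"]),
    "device": ["desktop", "mobile", "desktop", "mobile"],
    "sessions": [1000, 1500, 980, 900],
})

UPDATE_DATE = pd.Timestamp("2020-01-13")  # hypothetical rollout date
visits["period"] = visits["date"].apply(
    lambda d: "before" if d < UPDATE_DATE else "after")

# Sum sessions per device for each period, then compute the change.
pivot = visits.pivot_table(
    index="device", columns="period", values="sessions", aggfunc="sum")
pivot["pct_change"] = (pivot["after"] - pivot["before"]) / pivot["before"] * 100
print(pivot)
```

With these sample numbers, mobile drops 40% while desktop is nearly flat, which would point the investigation firmly at mobile results. The same split can be repeated for pages, query types, or countries.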
Our key takeaways from Kevin:
- Keep calm. Remember that sometimes there really isn't anything you can do except wait.
- Perform in-depth website audit and look at as many factors as possible.
Every niche is affected differently after a Google update. Stay calm and follow the right resources on the internet; you will see that each of them has a different take. Analyse them a bit, and if you see any pattern, look into it in more detail. Sometimes there is unnecessary noise around the internet. In most cases, it is best not to do anything but aggregate some data and, most importantly, share the knowledge in private Facebook groups for SEO consultants.
Yes, every time there is a Google core update, I see that traffic driven by the top keywords is affected; however, it's temporary, and after two weeks almost everything returns to the default, otherwise known as a correction. After the BERT update, which was designed to handle tasks such as entity recognition and part-of-speech tagging, many websites saw significant decreases in rankings, specifically for content written purposely for product reviews; this type of content is almost always for affiliate marketing purposes.
Some adjustments in rankings might actually attract Google's bot to crawl and reconsider your website. Wait, don't make any changes, rule out the effects of all previous Google updates, and get help from more than one expert.
Our key takeaways from Milosz:
- In search, there's a lot of misinformation, so make sure to separate fact from myth (for an example of how to do that, check out our investigation into Google's top ranking factors).
- Don't rush to do anything. First, see if your website is abiding by all of the older major updates' rules.
- Try joining a private Facebook group for SEOs and share your knowledge there.
At LSEO we wait for the impact of the update on data that we monitor on behalf of our clients — we do not take immediate action. We also look for insights from experts on SEO industry websites like Search Engine Journal and Search Engine Land. We leverage private forums and we review what other experts are saying in terms of impact.
Absolutely. RankBrain is Google's machine learning algorithm; with each update, Google iterates on core ranking factors to improve the search experience for its users. Google continues to get more sophisticated in how it understands links and evaluates the quality and usefulness of content. It is also getting better at looking at on-site signals, such as page speed, time on site, and other types of interactions, to determine rankings.
Since we only use best practices at LSEO, when we see an update our clients tend not to be impacted much. Instead, we tend to win new clients who didn't follow best practices and are now paying the price and looking for a survival plan.
That said — we are always looking to diversify the strategies we use to maximize the positive impact on Google's top two ranking factors — links and content. Furthermore, we do our best to address technical SEO factors as a priority, such as indexation, site speed, and mobile friendliness so that our clients are able to not only survive Google updates, but benefit from them.
Our key takeaways from Kris:
- Get acquainted with current Google algorithms like RankBrain since Google will continue working in this direction.
- Read multiple expert opinions (through round-ups like ours, but also from sources like Search Engine Land) and look for trends.
- Concentrate on backlinks and content, and you'll be able to thrive, instead of getting hit.
Google algorithm updates happen on a daily basis, but once in a while, some updates are more significant than others. You should aim to be one of the first to know about an algorithm change, so you can act fast. For that reason, it's necessary to stick to these three rules:
- Track your keywords and look out for ranking fluctuations with tools like Semrush, Algoroo, AccuRanker, or RankRanger.
- Follow industry influencers who tend to share the latest information related to continuous updates.
- Set up notifications for your most valuable keywords so you can monitor them on a daily basis. You should be notified immediately after any unexpected changes.
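The notification rule above can be prototyped as a simple diff between two daily rank snapshots. The keywords, positions, and threshold below are made up; in practice the data would come from a rank tracker's export or API, and the alert would go to email or Slack instead of stdout.

```python
# Sketch: flag keywords whose position moved more than a threshold
# between yesterday's and today's snapshots. All data is hypothetical.
yesterday = {"buy widgets": 3, "widget reviews": 7, "cheap widgets": 12}
today = {"buy widgets": 3, "widget reviews": 18, "cheap widgets": 11}

THRESHOLD = 5  # positions; tune to how volatile the niche normally is

alerts = []
for keyword, old_pos in yesterday.items():
    new_pos = today.get(keyword)
    if new_pos is not None and abs(new_pos - old_pos) > THRESHOLD:
        alerts.append((keyword, old_pos, new_pos))

for keyword, old_pos, new_pos in alerts:
    print(f"ALERT: '{keyword}' moved from #{old_pos} to #{new_pos}")
```

Run daily (e.g. via cron), this catches sudden moves on your most valuable keywords without staring at dashboards.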
If you do notice any fluctuations, check your organic traffic and keyword positions to see if your website has been affected. If you notice any significant fluctuations in keyword wins or losses, you should analyze your subpages separately, allowing you to diagnose the extent to which you were impacted.
In most cases, your site won't be affected across the board. This more granular approach will help you distinguish the best- and worst-performing subpages.
Thanks to this, you may be able to notice a pattern relating to the last update; these patterns can involve internal linking, topic relevancy, etc.
Remember, this isn't a time for panic, you shouldn't make any rash decisions. The more time you spend gathering data through analysis, the more informed your decisions will be.
After each update, I usually spend a few hours immersing myself in Surfer's SERP Analyzer, charts, and datasets. It helps me to determine any new dependencies and correlations within a particular set of SERPs.
The comparison of SERPs from the time before and after Google's update is my secret in the process of making more informed decisions.
I've noticed a few common occurrences among websites that have seen a significant drop in rankings. Usually, the affected domains have at least one of the following issues.
The first occurs when the purpose of a website isn't relevant to its content and overall marketing strategy.
Another is when the company's blog or FAQ section contains pages with thin content that don't provide any additional value to the user.
Poor quality content should be avoided and is something that Google always emphasizes.
Short, irrelevant, and keyword-stuffed content can have a negative impact on the rankings across a whole domain.
Running ads on your website isn't a bad thing to do. In some cases, the correct placement of ads can even increase trustworthiness and time spent on the website, and positively impact the conversion rate.
However, the overuse of ads can be harmful to both user experience and ranking signals.
If the content of a page consists only of plain text, it will decrease readability and provide a poor user experience. Visual elements like images, charts, tables, videos, and infographics help the user understand even the most complex issues.
If an article is just a huge wall of text, it will not look as inviting to users either.
It's also important to implement a proper header structure, as well as to separate the textual content into relatively small paragraphs, to help generate better UX signals.
In my opinion, the two most recent Google updates were associated with on-page elements like content, website structure, quality, and relevancy.
Content marketing should also play a major role in every SEO strategy.
Well-planned, great-quality content can attract a lot of potential customers, but it's also important to remember that poorly-written and low-quality articles can be actively harmful.
If your website has been impacted by one of the latest updates from Google, I'd advise taking the following actions:
- Analyze your most recent SEO-related changes. There is a possibility that you weren't affected by the update but by the most recent changes you've made to your website. You might even want to consider reverting those changes.
- Make sure the update has completely rolled out. The rollout can take several days; use this time to analyze your data and see whether you've actually been impacted.
- Analyze the most affected industries and websites. Try to find correlations in other websites' structure, contents, or backlink profiles.
- Compare them to your website. Pay special attention to the articles' NLP coverage, internal linking, post relevancy and complementarity, and potential user experience problems.
- Prepare and implement a strategy to regain those positions and any traffic you've lost. Keep in mind that reversing a loss in rankings can be a long process.
- Consult with the SEO community. If your website was affected and there is no apparent reason, ask within your SEO network for advice. In some cases, a fresh pair of eyes and an impartial opinion will be helpful in diagnosing and solving the problem, enabling you to get your website back on track.
Focus on quick wins at the beginning and divide your long-term wins into smaller steps.
Our key takeaways from Slawomir:
- It's a great idea to track SERP history.
- Follow SEO influencers.
- If you find that a certain industry is more affected than others, analyze it and draw conclusions about best practices to implement.
So, nothing shockingly new for us SEO experts. It seems that the most valuable advice is to stick to white hat strategies, while the two ranking factors cited the most are content and backlinks.
You want to:
- Match search intent,
- Improve and optimize your content,
- Get fresh backlinks, and
- Keep calm!
Other than that, remember to read SEO-oriented websites like Search Engine Land and this very blog, follow key influencers and don't rush to rework your entire website because you lost some traffic — you might get it back sooner than you think!
By: Oleg Triers