Google Algorithm Updates Cheat Sheet
Your ultimate guide to major Google penalties & algorithm changes

Since 2002, Google has been making its major algorithm changes public. And when harsh penalties rolled out with the first Panda and Penguin updates, it became clear that the company was serious about creating a fair, high-quality search environment for users.

In fact, Google updates take place every day and mostly go unnoticed. The company confirms only the major algorithmic updates that are expected to affect search rankings dramatically. To help you make sense of Google's major algorithmic changes and major Google ranking factors, we've put together a cheat sheet with the most important updates and penalties rolled out in recent years, along with a list of hazards and prevention tips for each.

But before we start, let's quickly check whether any given update has impacted your own site's traffic. SEO PowerSuite's Rank Tracker is a massive help here: the tool automatically matches up the dates of all major Google updates with your traffic and ranking graphs.

1) Launch Rank Tracker (if you don't have it installed, get SEO PowerSuite's free version here) and create a project for your site by entering its URL and specifying your target keywords.
2) Click the Update visits button in Rank Tracker's top menu, and enter your Google Analytics credentials to sync your account with the tool.
3) In the lower part of your Rank Tracker dashboard, switch to the Organic Traffic tab.

Google algorithm updates are marked with the dotted lines on the Progress graph

The dotted lines over your graph mark the dates of major Google algorithm updates. Examine the graph to see if any drops (or spikes!) in visits correlate with the updates. Hover your mouse over any of the lines to see what the update was.

Note

Currently, the graph in Rank Tracker contains the dates of major updates only, namely the Panda update, the Pigeon update, the Fred update, etc. But the graph is editable and lets you add any events you deem relevant to your own tracking history. Simply right-click in the progress graph field and choose to add an event from the menu.

Did any of Google’s algorithm changes impact your organic traffic in any way? Read on to find out what each of the updates was about, what the main hazards are, and how you can keep your site safe.

 

15. December 2020 Core Update

Launched: December 3, 2020
Goal: Improve search results

On December 3, 2020, Google rolled out a broad core algorithm update, as it does several times per year. Many SEOs noticed SERP volatility as early as December 1, though it might have been unrelated. The main question discussed in the aftermath was why Google rolled out such a massive update right in the middle of the holiday shopping season.
The impact so far remains unclear, with various domains reporting gains and losses in roughly equal measure. Google's guidance about such updates remains the same as before (please check the E-A-T quality guidelines).

 

14. Core Algorithm Update

Rollouts: March 2019, June 2019, September 2019, January 13, 2020, May 4, 2020
Goal: Improve the way the search engine evaluates sites overall to reward those providing high-quality content

How to stay safe with Core Algorithm Updates

Actually, Google makes search algorithm changes almost daily and officially confirms only the major updates that may affect SERPs significantly (these happen several times a year). Google confirmed a major core algorithm update on January 13, 2020, and one of the latest rolled out on May 4, 2020.

As a quick tip, you can tell that an update has taken place by looking at the keyword fluctuation graph in Rank Tracker's SERP Analysis.

Go to Target Keywords > Rank Tracking, click on the SERP Analysis tab in the lower screen, and switch to the Graph view. You need to click on the Record SERP data button for every project in which you'd like to see the SERP fluctuation graph (the feature is available only in the paid version of Rank Tracker).

The average daily fluctuation is about 6.3%. Red spikes indicate that the SERP changed significantly on that day, most likely because of an algorithm update.

SERP fluctuations for ranking keywords
The red color in the graph means high SERP fluctuation for the keyword

Typically, there's no strict explanation of how exactly these Google algorithm changes may affect your site, so there is nothing specific to fix quickly. Google Webmasters advised focusing on quality and provided a checklist on how to keep your site's content in line with their guidelines. Briefly, sites should provide original, up-to-date, high-quality content that brings real value to users.

Our recent survey among SEO experts revealed that with every new core update, most people simply monitor their ranking stats. If a site gets penalized, they fix traditional SEO factors, with a focus on content and links. Again, WebSite Auditor comes to the rescue, showing how to improve content and keep your site free from SEO errors.

 

13. Featured Snippet Deduplication

Launched: January 22, 2020
Goal: Prevent URLs shown in featured snippets from appearing twice on the first page of organic search results

Hazards

  • A change in traffic caused by shifting search ranking positions.

How to stay safe with Featured Snippet updates

Google rolled out this change on January 22, 2020, which caused a bit of a fuss among SEOs that has finally been systematized here.

What used to be called position zero in the search engine results is now the first position. It's up to you whether you want to get into the featured snippet or appear somewhere among results 2 to 10. This ranking algorithm update occurred one time and affected 100% of results.

It still makes sense to monitor which pages are ranking with rich snippets and for which keywords. You can use the Rank Tracker tool to record a history of SERPs and to receive an alert when your site gets into a featured snippet for a certain keyword. In the Target Keywords module, go to the Ranking Keywords > Ranking Progress tab, and switch to the Rank Progress > Rank History tab.

 

12. BERT

Launched: October 25, 2019
Rollouts: December 9, 2019
Goal: Help the search engine understand context, especially in spoken queries

BERT is a deep-learning technique for natural language processing: it lets the Google algorithm interpret natural language and understand context, based on pre-training on a large body of text from the web. This technique became especially important inside the ranking algorithm with the spread of voice search assistants. Google called BERT "one of the biggest leaps in the last five years".

Google rolled out its first BERT update for the English language, and then for over 70 other languages. Early reports said it had affected about 10% of all search queries.
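To get a feel for what "understanding context" means in practice, here's a purely illustrative sketch — nothing you need for day-to-day SEO — that runs a publicly available pre-trained BERT model through the Hugging Face transformers library; the example sentences are made up:

```python
# Illustration only: how a BERT-style model predicts a masked word from context.
# Requires: pip install transformers torch
from transformers import pipeline

# bert-base-uncased is a publicly available pre-trained BERT checkpoint
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# The same masked slot gets very different predictions depending on the
# surrounding words -- that contextual awareness is what BERT brings to search.
for sentence in [
    "He sat on the bank of the [MASK] and watched the water flow by.",
    "She went to the bank to open a savings [MASK].",
]:
    predictions = unmasker(sentence)
    top_guesses = ", ".join(p["token_str"] for p in predictions[:3])
    print(f"{sentence}\n  top guesses: {top_guesses}\n")
```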

Hazards

  • Exact-match short-tail keywords
  • Unnatural and formalized language

How to stay safe with BERT

Actually, there isn't much to worry about with BERT; the safeguards are the same as those applied with Hummingbird and RankBrain. Just provide high-quality content that answers user queries in the most appropriate form.

The search algorithm was designed not to penalize sites, but rather to boost the ones that better fit user queries. Are there any tweaks here? First, use long-tail research to explore your users' intent. Second, examine the top 10 results on the SERPs to understand the user intent behind those keywords and to see what your competitors do to get their pages ranking at the top.

1. Use keyword research to identify user intent.

Go for long-tail keywords with the Related Questions research method, including People Also Ask and Question Autocomplete forms. The point is to research more conversational keywords and optimize your pages to satisfy the user intent behind those keywords as fully as possible. It's recommended to use natural language, for example, in the form of questions in a FAQ section. This raises your chances of competing for top search rankings and even a featured snippet.

2. Do TF-IDF analysis to compare the top 10.

To comply with BERT and other major algorithms related to natural language, you need to improve your page's topical scope and relevance. The best way to do it is with the Content Editor and TF-IDF analysis tools integrated in WebSite Auditor. Certainly, TF-IDF is a much older language model and works differently from BERT; however, it can give a good clue for your optimization. It lets you extract keywords by analyzing the top 10 of your competitors and collecting the terms they have in common.
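If you're curious about the math behind that kind of analysis, here's a minimal sketch of the TF-IDF idea using scikit-learn; the page texts below are hypothetical placeholders, whereas real tools work on the full body copy scraped from the top-ranking pages:

```python
# Minimal TF-IDF sketch: which terms do competitor pages weight heavily
# that your own page barely uses? Requires: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical page texts -- in practice, use the full body copy of your
# page and of the top-10 ranking competitor pages.
documents = {
    "my_page":      "best running shoes for beginners buy online",
    "competitor_1": "running shoes guide cushioning pronation beginner runners",
    "competitor_2": "how to choose running shoes cushioning arch support",
}

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(documents.values())
terms = vectorizer.get_feature_names_out()

my_weights = matrix[0].toarray()[0]                 # first document is "my_page"
competitor_weights = matrix[1:].toarray().mean(axis=0)  # average over competitors

# Terms competitors emphasize but your page barely uses are candidates to add.
gaps = sorted(zip(terms, competitor_weights, my_weights),
              key=lambda t: t[1] - t[2], reverse=True)
for term, comp_w, my_w in gaps[:5]:
    print(f"{term:12s} competitors={comp_w:.2f} you={my_w:.2f}")
```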

Move to Content Analysis > TF-IDF, enter the URL to analyze in the Pages search bar in the upper left corner, and see how your keyword usage compares to your competitors' on the TF-IDF chart. The tool will suggest whether you should use more or less of each keyword in your text.

TF-IDF Analysis to get tips for keywords to optimize your pages with
 

11. Medic Core Update

Launched: August 1, 2018
Goal: Improve search results by matching them more closely with user intent, especially for businesses dealing with people's wellbeing (health, fitness, and finance)

Hazards

  • Pages optimized for keywords with mixed user intent.

The Google Medic update got its name when SEOs saw which types of sites were affected by these Google algorithm changes – mainly health and fitness websites, though not exclusively. After it rolled out in the first week of August, heavy fluctuations were noted in search rankings for business sites as well. Google did not go into much detail, saying it was just a core update and no fix was needed. The most widely shared guess is that the change was aimed at matching pages with the types of intent behind queries.

How to stay safe with the Medic Update

When Google rolled out this update, the affected sites had to work really hard to recover. Since then, user intent has become more relevant than ever. Keep in mind the three types of queries that imply a certain intent: informational (what/where/how), investigational (compare/review/kinds of), and transactional (order/get/buy). You'll need to dive into long-tail keyword research to find out what users are looking for on your pages, and optimize them accordingly.

Use Rank Tracker's Keyword Research module to mine your best keyword opportunities. Once you have them all collected in the Keyword Sandbox, use filters to group them according to intent and other important metrics, then add them to your Keyword Map, where you can map each keyword, or a group of keywords, to a certain landing page. Use tags and notes to mark which pages are transactional and which are informational.

Download SEO PowerSuite

 

10. Speed Update

Launched: July 9, 2018
Goal: Boost faster mobile pages in mobile search results

Google began steering its search algorithm towards mobile experience long before this, so what makes this update special? Google officially confirmed that, starting with the Speed Update, page speed became a ranking factor for mobile pages.

Hazards

  • Slow-loading mobile pages

How to stay safe with Speed Update

First, check your mobile pages' load speed. For this purpose, use Google's PageSpeed Insights, Lighthouse, or the Mobile-Friendly Test. You can find the latter integrated in the WebSite Auditor software.
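If you need to check pages in bulk, the same Lighthouse data is available programmatically through the PageSpeed Insights API; here's a rough sketch with placeholder URLs (for regular use, Google asks you to pass an API key):

```python
# Rough sketch: query the PageSpeed Insights API (v5) for mobile performance.
# The page list is a placeholder; add a "key" parameter for regular use.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
pages = ["https://www.example.com/", "https://www.example.com/blog/"]  # placeholders

for page in pages:
    resp = requests.get(PSI_ENDPOINT,
                        params={"url": page, "strategy": "mobile"},
                        timeout=60)
    resp.raise_for_status()
    data = resp.json()
    # Overall Lighthouse performance score, reported on a 0.0-1.0 scale
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{page}: mobile performance score {score * 100:.0f}/100")
```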

When you spot slow page issues, you can find the cause by looking deeper into your files with a WebSite Auditor check. Go to Pages > All Resources > Internal Resources, and filter them by Speed and Server Response Time. Here you will see which resources might need resizing, compression, etc.

Audit site resources that might negatively affect your site speed
Check the resource size, server response and other stats

Besides, the latest of Google's announcements concerned the Core Web Vitals, a set of user experience factors – loading speed (Largest Contentful Paint), interactivity (First Input Delay), and visual stability (Cumulative Layout Shift) – scheduled to become part of the ranking algorithm in 2021. So it certainly makes sense to prepare for future Google algorithm changes by improving your site speed and user engagement.

 

9. Fred Update

Launched: March 8, 2017
Rollouts:
Goal: Filter out low quality search results whose sole purpose is generating ad and affiliate revenue

Fred algorithm got its name from Google's Gary Illyes, who jokingly suggested that all updates be named "Fred". Google confirmed the update took place, but refused to discuss the specifics of it, saying simply that the sites that Fred update targets are the ones that violate Google's webmaster guidelines. However, the studies of Fred-affected sites show that the vast majority of them are content sites (mostly blogs) with low-quality articles on a wide variety of topics that appear to be created mostly for the purpose of generating ad or affiliate revenue.

Hazards

  • Low-value, ad-centered content
  • Thin, affiliate-heavy content

How to stay safe with Fred

1. Review Google's guidelines. This may seem a tad obvious, but reviewing the Google Webmaster guidelines and Google Search Quality Guidelines (particularly the latter) is a good first step in keeping your site safe from Fred algorithm penalties.

2. Watch out for thin content. Look: New York Times, the Guardian, and Huffington Post all show ads — literally every publisher site does. So it's not the ads that Fred algorithm targets; it's the content. Audit your site for thin content, and update the low quality, low-word-count pages with relevant, useful information.

To start the check, navigate to the Pages module in SEO PowerSuite's WebSite Auditor and look for the Word count column. Now, sort the pages by their word count by clicking on the column's header to instantly spot pages with too little content.

But remember: short pages can do perfectly fine for certain queries. To see if your content length is within a reasonable range for your target keywords, go to Content Analysis and select the page you'd like to analyze. Enter the keyword, and hang on a sec while the tool examines your page and your top-ranking competitors' pages. When the analysis is complete, look at Word count in body. Click on this factor and see how long the competitors' pages are.

Text length matters for rankings
Page Audit provides the word count in text body

To see each individual competitor's content length, click on Keywords in body and switch to Competitors. Here, you'll get a list of your top 10 competitors for the keywords you specified, along with the total word count on each of these pages. This should give you a solid idea of approximately how much content searchers are looking for when they search for your target keywords.

 

8. Possum Update

Launched: September 1, 2016
Rollouts:
Goal: Deliver better, more diverse results based on the searcher's location and the business' address

The Possum update is the name for a number of algorithm changes in Google's local search ranking filter. After the Possum update, Google returns more varied results depending on the physical location of the searcher (the closer you are to a certain business physically, the more likely you are to see it among local results) and the phrasing of the query (even close variations now produce different results). Somewhat paradoxically, Possum also gave a boost to businesses located outside the physical city area. (Previously, if your business wasn't physically located in the city you targeted, it was hardly ever included in the local pack; this isn't the case anymore.) Additionally, businesses that share an address with another business of a similar kind may now be deranked in the search results.

Hazards

  • Sharing a physical address with a similar business
  • Competitors whose business address is closer to the searcher's location

How to stay safe with Possum

1. Do geo-specific rank tracking. After Possum, the location from which you're checking your rankings plays an even bigger part in the results you get. If you haven't done this yet, now is the time to set up a search filter with a custom location to check positions from in SEO PowerSuite's Rank Tracker. To get started, open the tool, create a project for your site, and press Add search engines at Step 4. Next to Google (or Google Maps, if that's what you're about to track), click Add Custom. Next, specify the Preferred location (since Possum made the searcher's location so important, it's best to specify something as precise as a street address or zip code):

Rank tracking across any search engine by country
Search filter with custom location

You can always modify the list of the local search engines you're using for rank checking in Preferences > Preferred Search Engines.

2. Expand your list of local keywords. Since Possum resulted in greater variety among the results for similar-looking queries, it's important that you track your positions for every variation separately.

To discover those variations, open SEO PowerSuite's Rank Tracker and create or open a project. Go to the Keyword Research module and click Suggest keywords. Enter the localized terms you are already tracking and hit Next. Select Google Autocomplete as your research method.

This should give you an ample list of terms that are related to the original queries you specified. You may also want to repeat the process for other methods, particularly Google Related Searches and Google Trends for even more variations.

Download SEO PowerSuite

Overall, with the Possum update, it's becoming even more important to optimize your listings specifically for local search. For a full list of local ranking factors and how-to tips, jump here.

 

7. RankBrain Algorithm

Launched: October 26, 2015 (possibly earlier)
Rollouts:
Goal: Deliver better search results based on relevance & machine learning

RankBrain is a machine learning system that helps Google better decipher the meaning behind queries, and serve best-matching search results in response to those queries.

While there is a query processing component in RankBrain, there also is a ranking component to it (when RankBrain was first announced, Google called it the third most important ranking factor). Presumably, RankBrain can somehow summarize what a page is about, evaluate the relevancy of search results, and teach itself to get even better at it with time.

The common understanding is that RankBrain, in part, relies on the traditional SEO factors (links, on-page optimization, etc.), but also looks at other factors that are query-specific. Then, it identifies the relevance features on the pages in the index, and arranges the results accordingly in the SERPs.

Hazards

  • Lack of query-specific relevance features
  • Poor user experience

How to stay safe with RankBrain

1. Maximize user experience. Of course, RankBrain shouldn't be your only reason to serve your visitors better. But it is a reason why not optimizing for user experience can get you down-ranked in SERPs.

Keep an eye on your pages' user experience factors in Google Analytics, particularly Bounce Rate and Session Duration. While there are no universally right values to stick by, here are the averages across various industries reported by KissMetrics (you can find the complete infographic here).

Bounce rates for different types of websites

If your bounces for some of the pages are significantly above these averages, those are the low-hanging fruit to work on. Consider A/B testing different versions of these pages to see which changes drive better results.

As for session duration, keep in mind that the average reading speed for readers who skim is about 650 words per minute, so a 1,300-word article takes a skimming visitor roughly two minutes. Use this as guidance in assessing the amount of time visitors spend on your pages, and see if you can improve it by diversifying your content, for example by including more images and videos. Additionally, examine the pages that have the best engagement metrics, and use the takeaways when crafting your next piece of content.

2. Do competition research. One of the things RankBrain is believed to do is identify query-specific relevance features of web pages, and use those features as signals for ranking pages in SERPs. Such features can be literally anything on the page that can have a positive effect on user experience. To give you an example, pages with more content and more interactive elements may be more successful.

While there is no universal list of such features, you can get some clues of what they may be by analyzing the common traits of your top ranking competitors. Start SEO PowerSuite's Rank Tracker and go to Preferences > Competitors. Click Suggest, and enter your target keywords (you can — and should — make the list long, but make sure you only enter the terms that belong to one topic at a time). Rank Tracker will now look up all the terms you entered and come up with 30 sites that rank in Google's top 30 most often. When the search is complete, choose up to 10 of those to add to your project, examine their pages in-depth, and look for relevance features you may want to incorporate on your site.

Competitor research

Download SEO PowerSuite

6. Mobile Friendly Update

Launched: April 21, 2015
Rollouts:
Goal: Give mobile friendly pages a ranking boost in mobile SERPs, and de-rank pages that aren't optimized for mobile

Google's Mobile Friendly Update (aka Mobilegeddon) was meant to ensure that pages optimized for mobile devices rank at the top of mobile search, and subsequently, down-rank pages that were not mobile friendly. Desktop searches have not been affected by the update.

The Mobile-Friendly Update applies at the page level, meaning that one page of your site can be deemed mobile-friendly and up-ranked while the rest fail the test. The Mobile-Friendly Update was also just one of many ranking signals working together, so if a certain page ranked high thanks to great content and user engagement, the update did not affect it much.

Hazards

  • Lack of a mobile version of the page
  • Improper viewport configuration
  • Illegible content
  • Plugin use

How to stay safe with Mobile-Friendly Update

1. Go mobile. There are a few mobile website configurations to choose from, but Google's recommendation is responsive design. Google also has specific mobile how-tos for various website platforms to make going mobile easier for webmasters.

The Mobile-Friendly Update became the first in a series of Google algorithm changes leading towards mobile-first indexing. Since July 1, 2019, mobile-first indexing has been enabled by default for all new sites: the search engine sends its mobile crawler first, so if you are launching a new site, be ready to meet it with mobile-friendly pages.

2. Take the mobile friendly test. Going mobile isn't all it takes — you must also pass Google's mobile friendliness criteria to get up-ranked in mobile SERPs. Google's mobile test is integrated into SEO PowerSuite's WebSite Auditor, so you can check your pages' mobile friendliness quickly.

Launch WebSite Auditor and open your project. Go to Content Analysis and click Add page to pick a page to be analyzed. Enter your target keywords and give the tool a moment to run a quick page audit. When the audit is complete, switch to Technical factors on the list of SEO factors on the left, and scroll down to the Page usability (Mobile) section.

The Mobile friendly factor will show you whether or not your page is considered mobile friendly overall; here, you also get a mobile preview of your page. The factors below will indicate whether your page meets all of Google's mobile friendliness criteria. Click on any factor with an Error or Warning status for specific how-to fix recommendations.

Mobile-friendly test
Check your website for Mobile Friendliness
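If you'd rather add a quick scripted sanity check of your own, here's a minimal sketch that covers only one of the criteria mentioned above — a responsive viewport meta tag; the URLs are placeholders, and it's no substitute for the full mobile-friendly test:

```python
# Crude self-check for one mobile-friendliness criterion: a responsive
# viewport meta tag. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

pages = ["https://www.example.com/", "https://www.example.com/services/"]  # placeholders

for page in pages:
    html = requests.get(page, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    viewport = soup.find("meta", attrs={"name": "viewport"})
    content = viewport.get("content", "") if viewport else ""
    ok = "width=device-width" in content
    status = "OK" if ok else "MISSING or not responsive"
    print(f"{page}: viewport tag {status} ({content or 'none'})")
```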

 

Download SEO PowerSuite

5. Pigeon Update

Launched: July 24, 2014 (US)
Rollouts: December 22, 2014 (UK, Canada, Australia)
Goal: Provide high quality, relevant local search results

Google Pigeon (initially rolled out for English-language queries only) dramatically altered the results Google returns for queries in which the searcher's location plays a part. According to Google, Pigeon created closer ties between the local algorithm and the core search algorithm, meaning that the same SEO factors are now used to rank local and non-local Google results. The update also uses location and distance as a key factor in ranking the results.

Pigeon led to a significant (at least 50%) decline in the number of queries local packs are returned for, gave a ranking boost to local directory sites, and connected Google Web search and Google Map search in a more cohesive way.

Hazards

  • Poorly optimized pages
  • Improper setup of a Google My Business page
  • NAP inconsistency
  • Lack of citations in local directories (if relevant)

How to stay safe with Pigeon

1. Optimize your pages properly. Pigeon brought in the same SEO criteria for local listings as for all other Google search results. That means local businesses now need to invest a lot of effort into on-page optimization. A good starting point is running an on-page analysis with SEO PowerSuite's WebSite Auditor. The tool's Content Analysis dashboard will give you a good idea of which aspects of on-page optimization you need to focus on (look for the factors with Warning or Error statuses). Whenever you feel like you could use some inspiration, switch to the Competitors tab to see how your top-ranking competitors are handling any given part of on-page SEO.

Spy on your competitors' optimization efforts
Research your competitors' content to discover their optimization techniques

For a comprehensive guide to on-page optimization, check out the on-page section of SEO Workflow.

2. Set up a Google My Business page. Creating a Google My Business page for your local biz is the first step to being included in Google's local index. Your second step will be to verify your ownership of the listing; typically, this involves receiving a letter from Google with a pin number which you must enter to complete verification.

As you set up the page, make sure you categorize your business correctly — otherwise, your listing will not be displayed for relevant queries. Remember to use your local area code in the phone number; the area code should match the code traditionally associated with your location. The number of positive reviews can also have an influence on local search rankings, so you should encourage happy customers to review your place.

3. Make sure your NAP is consistent across your local listings. The Google search engine will look at the website you've linked to from your Google My Business page and cross-reference the name, address, and phone number (NAP) of your business. If all elements match, you're good to go.

If your business is also featured in local directories of any kind, make sure the business name, address, and phone number are consistent across these listings as well. Different addresses listed for your business on Yelp and TripAdvisor, for instance, can seriously hurt your local rankings.
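If you keep your citations in a spreadsheet, even a tiny script can flag NAP mismatches. Below is a rough sketch with made-up listing data; it normalizes only the phone number before comparing, so treat it as a starting point rather than a complete check:

```python
# Rough NAP-consistency check across listings (made-up data).
# Normalizes phone numbers to digits only before comparing.
import re

listings = {
    "Google My Business": {"name": "Acme Dental", "phone": "(303) 555-0147",
                           "address": "12 Main St, Denver, CO"},
    "Yelp":               {"name": "Acme Dental", "phone": "303-555-0147",
                           "address": "12 Main Street, Denver, CO"},
    "TripAdvisor":        {"name": "Acme Dental LLC", "phone": "3035550147",
                           "address": "12 Main St, Denver, CO"},
}

def norm_phone(phone):
    return re.sub(r"\D", "", phone)  # keep digits only

reference = listings["Google My Business"]
for source, nap in listings.items():
    issues = []
    if nap["name"] != reference["name"]:
        issues.append("name differs")
    if norm_phone(nap["phone"]) != norm_phone(reference["phone"]):
        issues.append("phone differs")
    if nap["address"] != reference["address"]:
        issues.append("address differs")
    print(f"{source}: {', '.join(issues) or 'consistent'}")
```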

4. Get featured in relevant local directories. Local directories like Yelp, TripAdvisor, and the like saw a major ranking boost after the Pigeon update. So while it may be harder for your site to rank within the top results now, it's a good idea to make sure you are featured in the business directories that will likely rank high. You can easily find quality directories and reach out to webmasters to request a feature with SEO PowerSuite's link building tool, LinkAssistant.

Launch LinkAssistant and open or create a project for your site. Click Look for prospects in the top left corner and pick Directories as your research method.

Do link-building for local SEO

Enter your keywords —  just as a hint, specify category keywords plus your location (e.g. "dentist Denver") — and give the tool a sec to find the relevant directories in your niche.

In a minute, you'll see a list of directories along with the webmasters' contact email addresses. Now, pick one of the directories you'd like to be included in, right-click it, and hit Send email to selected partner. Set up your email prefs, compose the message (or pick a ready-made email template), and send it off!

Download SEO PowerSuite

4. Hummingbird Update

Launched: August 22, 2013
Rollouts:
Goal: Produce more relevant search results by better understanding the meaning behind queries

Google Hummingbird is a major algorithm change that deals with interpreting search queries (particularly longer, conversational searches) and providing search results that match searcher intent, rather than the individual keywords within the query. The Google algorithm's name was derived from the precision and speed of the hummingbird.

While keywords within the query continue to be important, Hummingbird adds more strength to the meaning behind the query as a whole. The use of synonyms has also been optimized with Hummingbird; instead of listing results with the exact keyword match, the Google algorithm shows more theme-related results in the SERPs that do not necessarily have the keywords from the query in their content.

Hazards

  • Exact-match keyword targeting
  • Keyword stuffing

How to stay safe with Hummingbird

1. Expand your keyword research. With Hummingbird, you need to focus on related searches, synonyms and co-occurring terms to diversify your content, instead of relying solely on short-tail terms you'd get from Google Ads. Great sources of Hummingbird-friendly keyword ideas are Google Related searches, Google Autocomplete, and Google Trends. You'll find all of them incorporated into SEO PowerSuite's Rank Tracker.
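If you're curious to peek at raw Autocomplete suggestions outside of any tool, there's an unofficial, unsupported endpoint that browsers use; here's a rough sketch (the seed term is made up, and the endpoint may change or stop working at any time):

```python
# Quick look at Google Autocomplete suggestions via the unofficial,
# unsupported "suggestqueries" endpoint. Requires: pip install requests
import requests

seed = "running shoes for"  # hypothetical seed term
resp = requests.get(
    "https://suggestqueries.google.com/complete/search",
    params={"client": "firefox", "q": seed},
    timeout=15,
)
resp.raise_for_status()
# Response format: [query, [suggestion, suggestion, ...], ...]
suggestions = resp.json()[1]
for suggestion in suggestions:
    print(suggestion)
```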

To start expanding your list of target keywords, open Rank Tracker and create or open a project. Enter the seed terms to base your research upon, and hit Search. In the next step, go to the Keyword Research module and check your ranking keywords first. Then, one by one, explore your potential ranking keywords with all available keyword research methods. For example, select the Related Searches method, add your topic keywords, and hit Search.

Research more keywords
Use Keyword Research to find more content ideas and ranking opportunities

Hang on while Rank Tracker pulls suggestions for you; when it's done, all the keyword ideas will be in the Keyword Sandbox, from which you can pick those that deserve to be added to your rank tracking list. Then go through the process again, this time selecting Google Autocomplete Tools as your research method. Do the same for Related Questions. Next, proceed with analyzing the keywords' efficiency and difficulty, and pick the top terms to map to landing pages.

2. Discover the language your audience uses. It's only logical that your website's copy should be speaking the same language as your audience, and Hummingbird is yet another reason to step up the semantic game. A great way to do this is by utilizing a social media listening tool (like Awario) to explore the mentions of your keywords (your brand name, competitors, industry terms, etc.) and see how your audience is talking about those things across social media and the Web at large.

3. Ditch exact-match, think concepts. Unnatural phrasing, especially in titles and meta descriptions, is still popular among websites, but with search engines' growing ability to process natural language, it can become a problem. If you are still using robot-like language on your pages for whatever reason, the Hummingbird update is the signal to stop (to be honest, it was time to stop four years before it).

Including keywords in your title and description still matters; but it's just as important that you sound like a human. As a nice side effect, improving your title and meta description is sure to increase the clicks your Google listing gets.

To play around with your titles and meta descriptions, use SEO PowerSuite's WebSite Auditor. Run the tool, create or open a project, and navigate to the Pages module. Go through your pages' titles and meta descriptions and spot the ones that look like they were created purely for search engine bots. When you spot a title you'd like to correct, right-click the page and hit Analyze page content. The tool will ask you to enter the keywords that you want your page to be optimized for. When the analysis is complete, go to Content Editor, switch to the Title & Meta tags tab, and rewrite your title and/or meta description. Right below, you'll see a preview of your Google snippet.

Test how your meta descriptions will look on the SERPs
Edit the title and meta description and get the snippet preview

 

Download SEO PowerSuite

 

3. Pirate Update

Launched: Aug 2012
Rollouts: Oct 2014
Goal: De-rank sites with copyright infringement reports

Google's Pirate Update was designed to prevent sites that have received numerous copyright infringement reports from ranking well in Google search. The majority of the affected sites are relatively big and well-known websites that made pirated content (such as movies, music, or books) available to visitors for free, particularly torrent sites. That said, it still isn't in Google's power to keep up with the numerous new sites with pirated content that emerge literally every day.

Hazards

  • Pirated content
  • High volume of copyright infringement reports

How to stay safe with Pirate

Don't distribute anyone's content without the copyright owner's permission. Really, that's it.

2. Penguin Update

Launched: April 24, 2012
Rollouts: May 25, 2012; Oct 5, 2012; May 22, 2013; Oct 4, 2013; Oct 17, 2014; September 27, 2016; October 6, 2016; real-time since
Goal: De-rank sites with spammy, manipulative link profiles

Google Penguin update aimed to identify and down-rank sites with unnatural link profiles, deemed to be spamming the search results by using manipulative link tactics. Google rolled out Penguin updates once or twice a year until 2016, when Penguin became part of Google's core ranking algorithm. Now it operates in real time, which means that the algorithm penalties are applied faster, and recovery also takes less time.

Hazards

  • Low-quality links coming from "spammy" sites
  • Links coming from sites created purely for SEO link building (PBNs)
  • Links coming from topically irrelevant sites
  • Paid links
  • Links with overly optimized anchor text

How to stay safe with Penguin

1. Monitor link profile growth. Google isn't likely to penalize a site for one or two spammy links, but a sudden influx of toxic backlinks could be a problem. To avoid getting penalized by Google Penguin updates, look out for any unusual spikes in your link profile, and always look into the new links you acquire. By creating a project for your site in SEO PowerSuite's SEO SpyGlass, you'll instantly see progress graphs for both the number of links in your profile and the number of referring domains. An unusual spike in either of those graphs is reason enough to look into the links your site has suddenly gained in order to secure it from a Google penalty.

Look out for spikes in your site backlink profile
Find sharp drops or spikes in your backlink profile

2. Check for penalty risks. The stats that Penguin update likely looks at are incorporated into SEO SpyGlass and its Penalty Risk formula, so instead of looking at each individual factor separately, you can weigh them as a whole, pretty much like the Google algorithm does.

In your SEO SpyGlass project, switch to the Linking Domains dashboard and navigate to the Link Penalty Risks tab. Select all domains on the list, and click Update Link Penalty Risk. Give SEO SpyGlass a minute to evaluate all kinds of quality stats for each one of the domains. When the check is complete, examine the Penalty Risk column, and make sure to manually look into every domain with a Penalty Risk value over 50%.

Review the quality of your linking domains
Review backlinks with high Penalty Risk score

If you use SEO SpyGlass' free version, you'll get to analyze up to 1,000 links; if you're looking to audit more links, you'll need a Professional or Enterprise license.

3. Get rid of harmful links. Ideally, you should try to request removal of the spammy links in your profile by contacting the webmasters of the linking sites. But if you have a lot of harmful links to get rid of, or if you don't hear back from the webmasters, the only remedy you have is to disavow the low-quality links using Google's Disavow tool. This way, you'll be telling Google's crawler to ignore those links when evaluating your link profile. Disavow files can be tricky in terms of syntax and encoding, but SEO SpyGlass can automatically generate them for you in the right format.
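For reference, the disavow file itself is just a plain-text, UTF-8 file with one URL or "domain:" entry per line and "#" for comments. Here's a tiny sketch (with made-up domains) of generating one by hand:

```python
# Tiny sketch: write a disavow file in the plain-text format Google expects
# (UTF-8, one entry per line, "#" for comments). The domains below are made up.
bad_domains = ["spammy-links.example", "paid-directory.example"]
bad_urls = ["http://old-blog.example/comment-page-3/"]

lines = ["# Disavow file generated on 2020-12-01"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow entire domains (recommended)
lines += bad_urls                               # or disavow individual URLs

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```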

In your SEO SpyGlass project, select the links you're about to disavow, right-click the selection, and hit Disavow backlinks. Select the disavow mode for your links (as a rule of thumb, you'd want to disavow entire domains rather than individual URLs). Once you've done that for all harmful links in your project, go to Preferences > Blacklist/Disavow backlinks, review your list, and hit Export to save the file to your hard drive. Finally, upload the disavow file you just created to Google's Disavow tool.

Disavow bad backlinks to prevent Penguin algorithm penalties
Create the disavow file and submit it to Google via the disavow tool.

Since Penguin became part of the core ranking algorithm, one cannot say for sure that a Penguin update happens on this or that day. If you notice that your site's authority or visibility has dropped, it can be a signal that the site has been penalized by Penguin. Likewise, one cannot say exactly when full recovery takes place after all harmful links are removed. Positive effects should appear with the next Google crawl of your site. However, you cannot expect the site's authority to return straight to its previous level, since it had been inflated by artificial links. You'll need to apply more careful link-building tactics going forward.

Note

Sometimes sites get hit by a manual Google penalty rather than a Penguin algorithm sanction. This happens when human reviewers at Google notice manipulative tactics on a site that have gone unnoticed by the Penguin update. You will receive a notice about a manual Google penalty under the Manual Actions tab in Search Console. The recovery path here is similar: remove or disavow the bad backlinks and ask Google to reconsider the manual action.

Download SEO PowerSuite

For a more detailed guide on conducting a Penguin-proof link audit, jump here.

 

1. Panda Update

Launched: Feb 24, 2011
Rollouts: ~monthly
Goal: De-rank sites with low-quality content

Google Panda is an algorithm used to assign a content quality score to webpages and down-rank sites with low-quality, spammy, or thin content. Initially, Panda update was a search filter rather than a part of Google's core algorithm, but in January 2016, it was officially incorporated into the ranking algorithm. While this doesn't mean that Panda is now applied to search results in real time, it does indicate that both getting filtered by and recovering from Google Panda now happens faster than before.

Hazards

  • Duplicate content
  • Plagiarism
  • Thin content
  • User-generated spam
  • Keyword stuffing
  • Poor user experience

How to stay safe with Panda

1. Check for duplicate content across your site. Internal duplicate content is one of the most common triggers for Google Panda update, so it's recommended that you run regular site audits to make sure no duplicate content issues are found. You can do it with SEO PowerSuite's Website Auditor (if you have a small site with under 500 resources, the free version should be enough; for bigger websites, you'll need a WebSite Auditor license).

To start the check for duplicate content, launch WebSite Auditor and create a project for your site. Hang on for a moment until the app completes the crawl. When it's done, pay attention to the on-page section of SEO factors on the left, especially Duplicate titles and Duplicate meta descriptions. If either of those has an Error status, click on the problematic factor to see a full list of pages with duplicate titles/descriptions.

Remove duplicate content
Check duplicates in the On-page section

If for some reason you can't take down the duplicate pages flagged by the Google Panda update, use a 301 redirect or a canonical tag; alternatively, you can keep the pages out of Google's index by blocking crawling with robots.txt or by adding a noindex meta tag.
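If you'd like to script a quick spot check of your own, here's a minimal sketch that fetches a handful of pages (placeholder URLs below) and groups them by title and meta description to surface internal duplicates:

```python
# Minimal spot check for duplicate titles and meta descriptions.
# Requires: pip install requests beautifulsoup4
from collections import defaultdict
import requests
from bs4 import BeautifulSoup

pages = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/post-1/",
]  # placeholder URLs

by_title = defaultdict(list)
by_description = defaultdict(list)

for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    by_title[title].append(page)
    by_description[description].append(page)

for label, groups in (("title", by_title), ("meta description", by_description)):
    for value, urls in groups.items():
        if value and len(urls) > 1:
            print(f"Duplicate {label} '{value[:60]}' on: {', '.join(urls)}")
```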

2. Check for plagiarism. External duplicate content is another Panda trigger. If you suspect that some of your pages may be duplicated externally, it's a good idea to check them with Copyscape. Copyscape gives some of its data for free (for instance, it lets you compare two specific URLs), but for a comprehensive check, you may need a paid account.

Many industries (like online stores with thousands of product pages) cannot always have 100% unique content. If you run an e-commerce site, try to use original images where you can, and encourage user reviews to make your product descriptions stand out from the crowd.

3. Identify thin content. Thin content is a bit of a vague term, but it's generally used to describe an inadequate amount of unique content on a page. Thin pages are often made up of duplicate, scraped, or automatically generated content. Typically, thin content appears on pages with a low word count that are filled with ads, affiliate links, etc., and provide little original value. If you suspect that your site could be penalized by the Google Panda update for thin content, try to measure it in terms of word count and the number of outgoing links on the page.

To check for thin content, navigate to the Pages module in your WebSite Auditor project. Locate the Word count column (if it's not there, right-click the header of any column to enter the workspace editing mode, and add the Word count column to your active columns). Next, sort the pages by their word count by clicking on the column's header to instantly spot the ones with very little content that risk being hit by Panda.

Thin content is a signal for Google Panda algorithm
Identify thin content in Pages module

Next, switch to the Links tab and find the External links column, showing the number of outgoing external links on the page. You can sort your pages by this column as well by clicking on its header. You may also want to add the Word count column to this workspace to see the correlation between outgoing links and word count on each of your pages. Watch out for pages with little content and a substantial number of outgoing links.

Spammy outbound links are a strong signal to Panda update

Mind that a "desirable" word count on any page is tied to the purpose of the page and the keywords that page is targeting. E.g. for queries that imply the searcher is looking for quick information ("what's the capital of Nigeria", "gas stations in Las Vegas"), pages with a hundred words of content can do exceptionally well on Google. The same goes for searchers looking for videos or pictures. But if those are not the queries you're targeting, too many thin content pages (<250 words) may get you in trouble.

As for outgoing links, Google recommends keeping the total number of links on every page under 100 as a rule of thumb. So if you spot a page with under 250 words of content and over 100 links, that's a pretty solid indicator of a thin content page.
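To illustrate how those two thresholds can be checked programmatically, here's a rough sketch that counts words and outgoing external links on a page; the URLs are placeholders, and real word counts depend on how aggressively you strip boilerplate:

```python
# Rough thin-content check: word count vs. number of outgoing external links.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

pages = ["https://www.example.com/deals/", "https://www.example.com/guide/"]  # placeholders

for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):  # strip obvious boilerplate
        tag.decompose()
    words = len(soup.get_text(" ", strip=True).split())
    site = urlparse(page).netloc
    external = [a["href"] for a in soup.find_all("a", href=True)
                if urlparse(a["href"]).netloc not in ("", site)]
    flag = "THIN?" if words < 250 and len(external) > 100 else "ok"
    print(f"{page}: {words} words, {len(external)} external links -> {flag}")
```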

4. Audit your site for keyword stuffing. Keyword stuffing is the over-optimization of a given page element for a keyword. When all optimization efforts fail and your pages don't reach the desired top 10, it can indicate keyword stuffing issues and Google Panda sanctions. To figure out whether this is the case, look at your top-ranking competitors' pages (that's exactly what SEO PowerSuite's WebSite Auditor uses in its Keyword Stuffing formula, in addition to general SEO best practices).

In your WebSite Auditor project, go to Content Analysis, and add the page you'd like to analyze for Panda-related issues. Enter the keywords you're targeting with this page, and let the tool run a quick audit. When the audit is complete, pay attention to Keywords in title, Keywords in meta description, Keywords in body, and Keywords in H1. Click through these factors one by one, and have a look at the Keyword stuffing column. You'll see a Yes value here if you're overusing your keywords in any of these page elements. To see how your top competitors are using keywords, switch to the Competitors tab.

Check keyword stuffing in titles
Do on-page SEO optimization in the Page Audit module
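For a rough numeric sense of keyword usage on a single page, here's a sketch that counts keyword occurrences in the title and body; the URL, keyword, and thresholds are illustrative assumptions, not WebSite Auditor's actual Keyword Stuffing formula (which also compares against competitors):

```python
# Rough keyword-density check for a single page (placeholder URL and keyword).
# Thresholds here are illustrative, not Google's or the tool's actual rules.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/services/"   # placeholder
keyword = "emergency plumber"                # placeholder

soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")
title = (soup.title.get_text(strip=True) if soup.title else "").lower()
body = soup.get_text(" ", strip=True).lower()

title_hits = title.count(keyword.lower())
body_hits = body.count(keyword.lower())
body_words = max(len(body.split()), 1)
density = body_hits * len(keyword.split()) / body_words * 100

print(f"Title mentions: {title_hits} (more than 1 often reads as stuffed)")
print(f"Body mentions:  {body_hits}, ~{density:.1f}% keyword density")
```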

5. Fix the problems you find. Once you've identified the Panda-prone vulnerabilities, try to fix them as soon as you can to prevent being hit by the next Panda algorithm iteration (or to recover quickly if you've been penalized). For a small number of pages with duplicate content, use WebSite Auditor's Content Editor to edit those pages right inside the module and see how the quality factors improve as you go. Go to Content Analysis > Content Editor and edit your content in either the WYSIWYG editor or HTML. Here you can play around with your titles and meta descriptions in a user-friendly editor with a Google snippet preview. On the left, the on-page factors will recalculate as you type. Once you've made the necessary changes, hit the Download PDF button or the down arrow next to it to export the document as an HTML file to your hard drive.

Optimize your pages in Content Editor
Optimize on the go in the Content Editor module

 

Download SEO PowerSuite

If you're looking for more detailed instructions, jump to this 6-step guide to conducting a content audit for Panda update issues.

And that is not the end...

Those are the major Google algorithm updates to date, along with some quick auditing and prevention tips to help your site stay afloat (and, with any luck, keep growing) in Google search.

I've skipped some well-known Google algorithm updates that affected rankings one time and for which hardly any recovery tactics could be devised. For example, in 2012, Google rolled out a filter to downgrade exact-match domains without high-quality content. Such domains had proliferated for a short time, when an exact keyword match was enough to rank higher – one of the manipulative ranking tactics of the day. The same goes for the Intrusive Interstitials penalty back in 2017, as the Mobile-Friendly Update has made all those mobile tricks even riskier.

So our list of Google algorithm updates is far from exhaustive; rather, it focuses on those workings of the search engine that can be uncovered with the help of SEO tools.

As always, we're looking forward to your comments and questions below. Have any of these updates had an impact on your ranks? If so, what was the tactic that helped you recover? Please share your experience in the comments!