15 Major Google Penalties and How to Recover from Them

Google penalties are extremely unpleasant for website owners, especially those who rely on organic search for income. This guide summarizes the major reasons for algorithmic and manual Google penalties, along with recovery tips and prevention measures.

What are Google’s penalties?

Google penalties are sanctions that Google imposes on websites for bad SEO practices. They lead to lower organic rankings and online visibility, or even total removal of a site’s pages from the search results (or SERPs). Google penalties can be applied at the level of a single query (or a group of queries), a single URL (or several URLs, an entire directory, a subdomain, etc.), or sitewide.

Google penalties can be sorted into the following types:

  • Manual penalties – imposed by Google employees and require a fix and reconsideration request.
  • Algorithmic penalties – require a fix that will lead to an improvement automatically when Googlebot revisits your pages; the most notorious algorithmic penalties resulted from the Penguin, Panda, Intrusive Interstitials, and Page Experience updates.

By the scope of impact, Google penalties are split into:

  • Downranking – such penalties result in ranking drops and website traffic loss for all queries on the affected pages.
  • Delisting – all URLs are removed from the Google index, which you can check by running a site: search; there will be none of your URLs in the search results.

Google monitors search quality with the help of algorithms as well as human quality raters. Besides, there is a tool anyone can use to report malicious content, phishing, paid links, or spammy content.

If you happen to do any black-hat or gray-hat SEO and get a penalty, it is urgent to fix the issue. Even if only a part of your pages has been affected, the overall authority of the domain will decrease, and it will struggle to rank for the rest of your queries.

However, even after you remove the content or links that led to a sanction from Google, there is no guarantee that all your rankings will be restored to their pre-penalty level. So, it is better to be safe than sorry – check out why Google penalties may occur and how to prevent them.

How do you know if your site is penalized by Google?

Detecting manual penalties is dead simple: check the Manual actions box in Google Search Console. On a healthy website, you will see a statement saying everything is fine. To find out whether there have been any penalty notices in the past, review the messages under the bell icon in the top right corner.

If you’re out of luck, you’ll see a red penalty notice stating which issues have been detected.

As is often the case with manual actions, SEOs know what to blame. The problem can occur because of paid links, keyword stuffing, auto-generated content – anything that has been added to pages on purpose to manipulate rankings.

To detect partial or algorithmic penalties (for which no notifications are handed down), you need to observe rankings regularly. Let’s see how to set up regular tracking to observe anomalies in Rank Tracker (note that you will need a Professional or Enterprise license).

This step requires Rank Tracker. You can download it now for free.
Download Rank Tracker

Step 1. Launch the tool and add your site’s URL to create a project.

Step 2. Add your target keywords to Rank Tracking and map the ranking URLs as landing pages.

Step 3. Turn on Recording the SERP history to record the top 30 results with each ranking check.

Step 4. Additionally, you can track the rankings of your major competitors for the target keywords. This way, you can quickly see if the ranking impact is industry-wide or occurs only on your website.

Step 5. Add an automatic task to check rankings regularly. The tool will stay in standby mode and will check ranks on autopilot on the set date.

Rank tracking will give you a sufficient set of data to diagnose Google penalties. 

In the tracking tool, examine if there is a sharp drop in rankings for your target keywords.

If yes, compare those with your closest competitors to check if they are experiencing the same.

Also, check out the Fluctuation Graph under SERP details. When you notice high volatility in search results, Google is most likely updating something in its algorithms.

If the volatility is not down to a massive Google update, audit your site to spot probable technical issues that might be affecting its performance.
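Conceptually, the anomaly check described above boils down to diffing two ranking snapshots. Here is a minimal sketch in Python – the sample data and the 10-position threshold are illustrative assumptions, not actual Rank Tracker output:

```python
DROP_THRESHOLD = 10  # positions; tune to your own tolerance

def flag_rank_drops(before, after, threshold=DROP_THRESHOLD):
    """Return keywords whose position worsened by more than `threshold`."""
    flagged = {}
    for keyword, old_pos in before.items():
        new_pos = after.get(keyword)
        if new_pos is not None and new_pos - old_pos > threshold:
            flagged[keyword] = (old_pos, new_pos)
    return flagged

# Hypothetical snapshots from two consecutive ranking checks
before = {"seo audit": 3, "rank tracker": 5, "backlink checker": 8}
after = {"seo audit": 4, "rank tracker": 31, "backlink checker": 9}

print(flag_rank_drops(before, after))  # {'rank tracker': (5, 31)}
```

A flagged keyword is only a starting point for investigation: compare it against competitor positions and known update dates before concluding it is a penalty.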

How long will it take to recover from a Google penalty?

Recovery from a manual penalty depends on the particular case – its complexity and the types of fixes required. It may take from a couple of weeks to several months for a site to recover.

The penalty notice in the Console usually hints at what the issues are and how they violate Google’s Search Essentials (formerly Webmaster Guidelines). However, SEO specialists first need time to investigate them in detail.

Next, they submit a reconsideration request in Search Console. It will take time for Google to consider the request, so it is better to keep it simple. A good request:

  • Explains the issues precisely
  • Describes the steps you’ve taken to fix them
  • Shows the outcomes of your efforts

Then, the Console will show a review status message to let you know whether your request has been considered and the penalty revoked.

Most requests submitted to Google are either approved or rejected. Sometimes, the message states that the request is being processed, and the penalty remains in place because of other violations that have not been fixed yet.

With algorithmic penalties, things are less clear because it is sometimes hard to identify when a specific penalty is in action. You will first need to audit your site and content to find out the causes.

Google updates often cause ranking issues. When your rankings don’t bounce back shortly after an update, it is probably an algorithmic penalty applied for a certain issue on the website.

As John Mueller said in Google SEO office hours:

“It’s generally a good idea to clean up low-quality content or spammy content that you may have created in the past. For algorithmic actions, it can take us several months to reevaluate your site again to determine that it’s no longer spammy.”

John Mueller

According to Google, if a site has lost rankings after a core update, this might be linked to E-E-A-T issues. That is, the website generally lacks experience, expertise, authoritativeness, or trustworthiness. And it may take months to establish these signals.

Google penalties and recovery tips

Let’s take a closer look at what bad practices may end up with a Google penalty and why you should avoid them.

1. Keyword stuffing

Keyword stuffing is an outdated SEO tactic of cramming a page with keywords to rank it artificially. Stuffed key phrases often appear on a page unnaturally or out of context, sometimes in the form of lists, and this is a confirmed negative SEO ranking factor.

Example of keyword stuffing as detected by Content Editor

Keyword stuffing often goes together with deceptive SEO techniques, such as cloaking, content spinning, and so on, which altogether are a surefire way to sanctions.

The fix

The Panda algorithm penalizes over-optimized content, and instead of ranking a stuffed page up, it will do just the opposite. When you know there are no black-hat techniques implemented on your pages, then carefully review your content. You can get help from Content Editor to create properly optimized pages.

Download WebSite Auditor
  • Use the target keyword only once in the title, H1 tags, and meta description tags.
  • Use Related questions, People Also Ask, and Autocomplete tools to find more ideas and expand your article.
  • Distribute your keywords evenly across the text and fit them into the context.
  • Use synonyms, long-tail keywords, and rephrased terms to replace the exact-match keywords.
  • Format your text properly with headings and paragraphs; remove excessive use of bold, italics, etc.
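To sanity-check keyword distribution before publishing, you can estimate keyword density with a short script. This is a rough sketch with made-up sample text; note there is no official “safe” density threshold, so treat the number only as a red-flag indicator:

```python
import re

def keyword_density(text, keyword):
    """Share of the page's words taken up by a (possibly multi-word) keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    kw_words = keyword.lower().split()
    hits = sum(
        1 for i in range(len(words) - len(kw_words) + 1)
        if words[i:i + len(kw_words)] == kw_words
    )
    return hits * len(kw_words) / len(words) if words else 0.0

# Obviously stuffed sample text
text = ("Buy cheap shoes online. Our cheap shoes are the best cheap shoes "
        "for anyone looking for cheap shoes.")
print(round(keyword_density(text, "cheap shoes"), 2))  # 0.44
```

Nearly half the words here belong to one phrase – a pattern a human reader would notice immediately, and so will Panda.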

2. Thin content

Thin content is content that gives little or no value to the user. Having a couple of poor-quality pages is not a problem. But if a site has a lot of pages that are not useful to visitors, algorithms will affect their rankings.

The reasons for thin content vary. For example, a site may have been hastily generated with automatic tools. Another example is an e-commerce website with thin product pages that lack item descriptions. Affiliate websites or blogs may also count as thin content when they lack E-E-A-T signals.

The fix

First, identify pages with thin content. The easiest way is to find pages with a low word count which may mean that the pages give little value to users.

In WebSite Auditor’s Site Audit module, filter all Pages by Word count and examine those with the fewest words. Actually, there is no strict range for an ideal word count: it depends on the type of page and the goal for which it was created.

Download WebSite Auditor

Second, find poorly written blog posts that don’t bring any value. For instance, in Google Analytics, look for pages that bring no organic traffic or get the highest bounce rates.

Rewrite the affected pages to create an in-depth post meeting the search intent of your visitors. Follow the optimization advice from the Content Audit section.

On-page content audit gives optimization tips based on the comparison of top 10 ranking pages

You can also use Content Editor here to find popular related topics and frequently asked questions that add more information to your topic. Among other things, this tool shows the minimum and maximum word count of the best-ranking pages for your target query.
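If you want a quick offline check, a visible-text word count can be approximated with Python’s standard library. This is a rough sketch (the sample page is made up), not a substitute for a full crawl:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def visible_word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

page = ("<html><head><script>var x=1;</script></head>"
        "<body><h1>Red shoes</h1><p>Only two sentences here.</p></body></html>")
print(visible_word_count(page))  # 6
```

A crawler-based audit remains more reliable, since it also catches boilerplate repeated across pages.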

3. Thin affiliate websites

The thing about affiliate websites is that they are sometimes just copies of the merchant’s website, adding no value for users. When the search engine finds similar templated websites, it clusters them together and ranks only the main one out of the bunch.

The fix

If you are doing affiliate marketing, stick to the best practices:

  • Provide fully-fledged descriptions of the products and describe your unique experience to cover the subject in depth.
  • Add pros and cons (and, by the way, they can be implemented now with the structured data).
  • Add high-quality screenshots and videos supplementing your information.
  • Develop topical authority, i.e., cover more relevant topics to develop your expertise.
  • Use Rank Tracker to find more high-volume related keywords and popular questions that people ask together with your main query.
  • Build quality links from trusted sources.

4. Duplicate content

It may come as a surprise, but there is no penalty for duplicate content as such, "at least, not in the way most people mean when they say that", says Google.

The real problem with duplication is that it wastes your crawl budget. Search bots may get confused about which URL to present to the user. As a result, the wrong pages appear in searches, visitors get upset, and, consequently, your website’s rankings and traffic suffer.

Things are rather simple with external duplication: Google penalizes websites that consist entirely of scraped or syndicated content.

"The only time we would have something like a penalty or an algorithmic action or manual action is when the whole website is purely duplicated content …if it’s one website that is scraping other websites for example.”

John Mueller

Thus, to prevent duplication penalties, watch out for cases when someone creates a copy of your website elsewhere on the web. For the rest, keep your site free of duplicates by following best practices.

The fix

For sitewide duplication prevention – which will not cause a manual penalty but may still impact your rankings – make sure that:

  • The canonical domain version is set up (www or non-www); you can check the setup in the Site Audit module in WebSite Auditor.
  • Your CMS does not generate duplicate URLs automatically; consult Site Audit to spot duplicate URLs and get rid of them if there are any.
  • Language-region variants of your pages are implemented correctly, for example with the hreflang generator tool in WebSite Auditor.
  • Pagination and internal search pages are handled properly (here is a post about pagination best practices).
  • For e-commerce sites, a proper site structure is in place and product pages are optimized.
  • In case of external near-duplicate content, you work on improving your website’s E-E-A-T signals and increasing brand awareness.
  • In case of piracy, you can file a copyright infringement report with Google.
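Near-duplicate detection of the kind search engines perform can be approximated with word shingling and Jaccard similarity. A toy sketch (the 0.5 threshold is an arbitrary assumption):

```python
def shingles(text, k=3):
    """Set of k-word shingles of the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "our red running shoes are light and durable for daily training"
page_b = "our red running shoes are light and durable for trail running"
page_c = "contact us for wholesale pricing and delivery options"

print(jaccard(page_a, page_b) > 0.5)  # True: near-duplicates
print(jaccard(page_a, page_c) > 0.5)  # False: unrelated pages
```

Running a check like this over your own URLs can surface templated pages that differ by only a few words – prime candidates for merging or canonicalization.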

5. Spammy auto-generated content

Spammy auto-generated content is content that has been programmatically generated and lacks coherence or adds little value for users. Such content is created with the sole purpose of manipulating search rankings, so it is no wonder Google has put much effort into eliminating spam from SERPs.

Examples of spammy auto-generated content that may get penalized include:

  • Text generated from scraping feeds or search results
  • Gibberish with search keywords sprinkled throughout
  • Automatically translated text without human review
  • Automatically generated text disregarding the quality or user experience
  • Text generated by automated synonymizing, paraphrasing, or obfuscation
  • Stitching or merging content from different web pages without adding value

The story of automatically generated content took an interesting twist with the arrival of GPT-3 and OpenAI. These tools can generate impressive texts that look remarkably close to human-written content. But does that guarantee such auto-generated content will not be treated as spam?

There are two problems here. First, the accuracy of AI-generated content cannot yet be fully relied on. The question is whether website editors who resort to AI content generation will pay due attention to fact-checking (this especially concerns recent facts, emerging trends, news, or anything the AI does not yet “know” for sure).

And second, with AI tools, the amount of content might grow exponentially. Processing it will require much more capacity, and Google will have to face that challenge.

The fix

If you have a lot of auto-generated content that has resulted in a Google penalty, there are several steps to handle it:

  • Restrict such pages from indexing by adding the noindex tag, and remove them from your sitemap.
  • Help Google focus on high-quality pages: prune your content to merge and remove low-quality pages.
  • Edit and proofread your copy to ensure good quality. Use content writing assistants, such as Grammarly for spellchecking and Content Editor for on-page optimization.
  • Keep real visitors in mind and write for people. Also, stick to a natural writing style and your distinct tone to help users get used to your brand voice.
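The first step above – noindexing auto-generated pages and dropping them from the sitemap – is easy to get out of sync. A hedged sketch of a consistency check, using hardcoded sample markup in place of real page fetches:

```python
import re
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/auto-page-1</loc></url>
</urlset>"""

# HTML of each URL (fetch these however you like); hardcoded samples here
PAGES = {
    "https://example.com/":
        "<html><head><title>Home</title></head></html>",
    "https://example.com/auto-page-1":
        '<html><head><meta name="robots" content="noindex"></head></html>',
}

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
NOINDEX = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', re.I)

urls = [loc.text for loc in ET.fromstring(SITEMAP).findall(".//sm:loc", NS)]
conflicts = [u for u in urls if NOINDEX.search(PAGES.get(u, ""))]
print(conflicts)  # noindexed pages that should be removed from the sitemap
```

A sitemap should only list pages you want indexed, so any URL flagged here is sending Google contradictory signals.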

6. User-generated spam

Usually, user spam appears in comments under blog posts, on forums, in popular social media accounts, etc. Such comments are often generated by automatic tools only to acquire SEO backlinks.

User-generated spam harms a site’s quality because it dilutes PageRank and is often irrelevant to the main content. Too many such comments sitewide may lead the site straight to a Google penalty.

The fix

If you detect unnatural comments on your pages, simply delete them. But it’s easier to prevent spam than to clean it up later. Some platforms even prefer to close their comment sections if they lack the resources to moderate them.

Yet, if you need comments on your website, there are a few best practices to prevent user-generated spam:

  • Use comment moderation platforms like Disqus or plugins like Akismet for WordPress to filter spam; besides, most CMSs have built-in comment approval features that work well for smaller websites.
  • Restrict negative words on social media in your account preferences.
  • Add rel="nofollow" or rel="ugc" (user-generated content) attributes to links in comments, since these make link building for the sake of SEO pointless.
  • Use authentication and CAPTCHA for users to add content on your platforms.
  • Stipulate user content policies: for large platforms, clear policies are necessary for consistent moderation. Besides, when users are aware of your policies, they are less likely to abuse the platform.
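Moderation platforms combine many signals; as a toy illustration, a first-pass comment filter might look like this (the spam word list and link limit are arbitrary assumptions, not Akismet’s actual logic):

```python
import re

SPAM_WORDS = {"casino", "viagra", "loan", "crypto"}  # illustrative list
MAX_LINKS = 2

def looks_spammy(comment):
    """Very rough heuristic: too many links or known spam words."""
    links = len(re.findall(r"https?://", comment, re.I))
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return links > MAX_LINKS or bool(words & SPAM_WORDS)

print(looks_spammy("Great post, thanks for the tips!"))            # False
print(looks_spammy("Best casino bonuses at http://spam.example"))  # True
```

Real spam filters also weigh sender reputation, posting frequency, and language models, which is why a managed service usually beats a homemade list.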

7. Doorways

Doorways (sometimes called jump pages or bridge pages) are a spamdexing technique in which intermediate pages are optimized to rank for specific similar queries but are not as useful as the final destination. In other words, these are doors that take up a lot of SERP real estate and funnel users to one website.

From this description, truly harmful doorways meet the following criteria:

  • Multiple domains or pages are ranking with the help of copied content and keyword stuffing.
  • The pages give little value and serve as an intermediary redirecting visitors to another page.
  • The pages were created for search engines, so the user experience is ignored; such URLs are isolated and cannot be accessed from the website navigation.

Doorways disrupt the search experience and mislead users, so search engines try to spot and penalize them.

The fix

For intentional doorways, the only advice is: don’t do it. So, if you created doorway pages and got hit by a penalty, block them from indexing with noindex tags and remove them from the sitemap. Instead, create valuable content that meets searchers’ intent and apply all the best SEO practices to rank high.

Also, audit your redirects to make sure that they lead to the right destination. You can quickly get a list of all your 301/302 redirects in the Site Audit section in WebSite Auditor.

8. Cloaking

Cloaking is another black-hat SEO technique in which users and search engine bots see different content on the same page. Simply put, a page ranks for one set of keywords (an easier one to rank for) but shows users something else.

An old-school cloaking method is to hide keywords or links on a page with the help of color, size, CSS styles, etc. Some more sophisticated forms of cloaking are implemented by identifying the user agent, the IP, the referrer, etc., to serve different webpage versions to a human and Googlebot.

Since cloaking violates Webmaster Guidelines, it might lead to a manual penalty. Occasionally, cloaked pages still appear in searches. How come? 

Back in 2018, and again in 2023, Google reconfirmed that it is able to recognize invisible text and ignore it. According to John Mueller, if a page with hidden text ranks, it might be for other reasons.

And experiments show that hidden text would not boost rankings for those hidden words, so it is simply useless.

There are cases when serving a slightly different version of content is appropriate and is not considered cloaking. For example, paywalls, in essence, show different content to users and search engines. Google provides flexible sampling guidelines for paywalled content and supports structured data to differentiate it from cloaking.
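For reference, Google’s paywalled-content markup declares the gated section via isAccessibleForFree plus a CSS selector, so serving full text to Googlebot is not treated as cloaking. A sketch of the pattern (the .paywalled-section class name is an assumption about your own templates; check Google’s documentation for the current requirements):

```python
import json

# JSON-LD following Google's paywalled-content structured data pattern
markup = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example paywalled article",
    "isAccessibleForFree": False,
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": False,
        # Selector matching the paywalled part of the page (assumed class)
        "cssSelector": ".paywalled-section",
    },
}
print(json.dumps(markup, indent=2))
```

This JSON-LD goes into a script tag on the article page, alongside markup for the visible sample content.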

The fix

First of all, in Search Console, you can examine your page with the URL Inspection tool to check how Googlebot sees it.

Alternatively, use WebSite Auditor to analyze your pages and detect invisible elements. In the advanced Project Settings, you can pick the Googlebot or Googlebot-Mobile crawler to examine the contents of your pages. It may be best to ask your developer to find and delete any inappropriate scripts.

Download WebSite Auditor

9. Unnatural links

For over a decade, Google has been finding ways to penalize manipulative link practices – that is how Penguin and SpamBrain algorithms appeared. Yet, here and there, we hear someone saying that links do work, including paid links.

Google tells us that it can identify and ignore bad links. Yet, link spam updates, as well as the potential of earning a manual action, serve as a good reminder: webmasters need to watch the quality of their websites' link profiles.

Here is a list of signals implying that a site might be involved in manipulative link-building practices:

  • Too many backlinks appear rapidly (which might mean the links are artificial).
  • Many unnatural links from irrelevant domains, paid blog networks (PBNs), or link farms.
  • Too many outbound links from one page.
  • Keyword stuffed anchor texts.
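Several of these signals can be spotted in a backlink export. For instance, a skewed anchor-text distribution is easy to compute – a toy sketch with made-up anchors:

```python
from collections import Counter

def anchor_profile(anchors):
    """Share of each anchor text across a backlink sample."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = sum(counts.values())
    return {a: round(n / total, 2) for a, n in counts.most_common()}

# Hypothetical anchors pulled from a backlink report
anchors = ["buy cheap shoes", "buy cheap shoes", "buy cheap shoes",
           "example.com", "great article", "buy cheap shoes"]
profile = anchor_profile(anchors)
print(profile)  # one exact-match anchor dominating the profile is a red flag
```

Natural profiles are dominated by branded and URL anchors; a commercial exact-match phrase taking the majority share suggests manipulative link building.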

The fix

You can use SEO SpyGlass to assess backlinks pointing to a website and evaluate the potential risk of getting a penalty. Besides, you can integrate other link sources, including Search Console.

  Download SEO SpyGlass

The backlink checker evaluates the quality of a link and calculates the Penalty Risk score that considers factors like domain age, anchor text, the number and quality of incoming backlinks, sitewide links, Page/Domain Authority, IP diversity, etc.

Penalty Risk dashboard in SEO SpyGlass
Penalty Risk estimation for backlinks; Details provides more information on each risk factor

So, to fix a penalty for suspicious links, do the following:

  • Identify harmful backlinks; contact webmasters and ask them to take down the bad links; if they don’t, disavow the spammy backlinks with the Google Disavow tool.
  • Do quality link building using proven white-hat tactics (for example, broken link building, link gap, skyscraper, and more).
  • Diversify your link anchors.
  • Set up Alerts in SEO SpyGlass to get a warning if the site gets a sharp influx of backlinks.
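Google’s disavow file is a plain-text list of `domain:` lines and full URLs, with `#` marking comment lines. A minimal sketch that generates one from assumed example domains:

```python
# Hypothetical lists exported from a backlink audit
bad_domains = ["spam-farm.example", "pbn-network.example"]
bad_urls = ["https://blog.example/paid-post"]

lines = ["# Disavow file generated from a backlink audit"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow whole domains
lines += bad_urls                               # disavow individual URLs

disavow = "\n".join(lines)
print(disavow)
```

Save the result as a .txt file and upload it via the Disavow links tool in Search Console. Use `domain:` entries when a whole site is spammy; individual URLs only when a single page is the problem.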

10. Selling links

The search engine watches both incoming and outgoing links. Citing reliable sources can be an additional trust signal for people. Meanwhile, excessive outbound links (especially from the same directory to irrelevant websites) may indicate that the website is selling links.

The fix

Audit your links in the All Pages > InLink Rank tab in WebSite Auditor. InLink Rank is a metric that estimates the importance of a page based on the number and quality of its links, both incoming and outgoing. A low InLink Rank may indicate an issue with the links on the page.

You can select each URL (especially those with the largest number of links) and examine Links from page in the lower half of the workspace. There is a quick filter to sort all External links.

Download WebSite Auditor

If you’ve found suspicious outgoing links which are irrelevant to content, too numerous, etc., tag them as nofollow or remove them.

Mind that there is nothing wrong with having paid links, but in this case, you should mark them with the rel="sponsored" attribute.

11. Incorrect structured data

Structured data is a powerful tool for enhancing results and claiming more real estate on SERPs. Yet, mistakes in structured data, whether purposeful or accidental, may lead to manual penalties.

Google documentation states that a structured data manual action means that a page loses eligibility for appearance as a rich result. However, Google has recently clarified that a manual action involving structured data might also impact rankings.

It is not always easy to pinpoint schema mistakes: even if a structured data testing tool validates your markup, it does not mean that everything is fine.

So, the typical markup errors that may end up with penalties include the following:

  • The type of structured data does not match the content on the page, e.g., a company is marked as a product, non-recipe marked as a recipe, etc.
  • Structured data refers to hidden elements.
  • There are errors in implementation, e.g. some mandatory fields are missing, or there are typos that distort information.
  • A page-specific markup has been used sitewide, e.g. Organization schema is used with reviews on pages other than the homepage.
  • A page violates review guidelines, e.g. ratings are not sourced from real customers; there are errors in bestRating, worstRating, and ratingValue; a category rating is shown on an individual product, and the like.
  • Marked-up job postings represent misleading information, etc.
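A basic pre-flight check for missing mandatory fields can be scripted. The required-property sets below are assumptions drawn from Google’s rich result documentation – verify them against the current docs before relying on this sketch:

```python
import json

# Assumed required properties per schema type (check Google's docs)
REQUIRED = {
    "Recipe": {"name", "image"},
    "JobPosting": {"title", "description", "datePosted",
                   "hiringOrganization", "jobLocation"},
}

def missing_fields(jsonld):
    """List assumed-required properties absent from a JSON-LD snippet."""
    data = json.loads(jsonld)
    required = REQUIRED.get(data.get("@type"), set())
    return sorted(required - data.keys())

snippet = '{"@context": "https://schema.org", "@type": "Recipe", "name": "Pancakes"}'
print(missing_fields(snippet))  # ['image']
```

A check like this catches only structural gaps; it cannot tell whether the markup actually matches the visible page content, which is what manual reviewers look at.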

The fix

Search Console shows markup errors in the Enhancements section. These errors can impact your SERP appearance and lead to losing rich features and tons of traffic, so they need a fix.

Each type of structured data has its own technical and search guidelines, so make sure to read them carefully before implementing the markup. And you can consult our Schema markup guide for more details.

In case you’ve faced a penalty, review your markup and fix the issues. WebSite Auditor will help you collect the list of all pages with a markup on them. Go to Site Structure > Pages and see the Open graph & structured data tab.

Download WebSite Auditor

12. Intrusive pop-ups

The first penalties for intrusive ads appeared with the Page Layout algorithm update in 2012. The algorithm penalized websites for excessive static ads above the fold. Later on, the Intrusive Interstitials algorithm added popups and overlay ads to the list of no-go design practices.

The penalties are applied algorithmically once intrusive interstitials appear on a page. Generally, you will notice a dip in both impressions and clicks – and a rollback after you remove the interfering element.

Note

At a broader scale, Core Web Vitals metrics were introduced in 2021 to assess the whole page experience: not only particular page elements but also load speed, page size, and overall user experience matter for better rankings. You can read our case study about how we improved our Core Web Vitals.

The fix

First and foremost, avoid intrusive interstitials on your site. Stick to best practices in your page design for both desktop and mobile devices. For example, delay pop-ups and other interaction prompts until the visitor is about to leave.

Also, check your Page Experience report in Google Search Console (ideally, all your pages should be Good URLs). And use WebSite Auditor to audit all pages in bulk and ensure that each page gets the highest Core Web Vitals scores possible.

13. Misleading or improper content

A large chunk of News and Discover content may get penalized for misinformation and policy violations. This mostly concerns sensitive YMYL (Your Money or Your Life) topics that require extra evidence and accuracy: news, healthcare, finance, etc.

The fix

The only way to fix misleading content issues is to remove what caused the issue and request reconsideration.

To prevent penalties on your YMYL sites, make sure you do not publish:

  • Adult content involving sexually explicit material, nudity, and anything aimed at causing arousal; this rule covers not only News and Discover but organic search in general.
  • Hate speech, terrorism, violence, and other content that should obviously be restricted.
  • Harassment, bullying, threats, and exposure of personal information.
  • Medical content that might contradict scientific knowledge.
  • Misleading or manipulated content that lacks bylines, dates, or sources.
  • Artificial freshening of publication dates on YMYL pages, as it can also result in penalties.

14. Spammy free hosting

A penalty notice may also state that a site uses spammy free hosting. Even if the website itself does nothing wrong, it may get a notice saying that the free hosting service is being abused by third-party spammers.

As a result, the website will lose visibility, rankings, and search features. However, it will still stay indexed.

The fix

Opt for secure hosting. Here, we’ve got a brief roundup of all aspects of choosing reliable hosting.

15. Hacked content

Hacked content means that some content or code has been added to a site without the knowledge of the owner because of breaches in the website’s security. As a result, the site may contain hidden links, sneaky redirects, deceptive pages, etc.

Hacked websites are used in different forms of cybercrime. The problem is that such breaches harm not only the hacked website but also its visitors.

Websites hit with the hacked content penalty are not delisted from Google, but they get a warning about a potential threat next to their URL in the SERP. After clicking the URL, the user sees a full-screen warning saying that the site is potentially harmful.

Warning message that the site might be hacked

Google sends the site owner a message about the hacked content along with the affected URLs. Unlike all other manual penalties, this warning appears in the Security issues tab.

Besides, anyone can check a site's status regarding unsafe content in Google’s Transparency Report. Remarkably, the overall report shows that the number of search security warnings has dropped drastically compared to five years ago.

 
Google Transparency Report about total search warnings

The fix

If your site has received a security issue alert, you will first need to scan the website to detect the vulnerability and eliminate it.

Next, to level up your website’s security, consider the following tips:

  • Scan your website with malware scanners to spot the issue.
  • Set up website security apps or plugins and run regular security checks.
  • Add an SSL certificate to secure the connection.
  • Find and fix mixed content issues (files with http protocol in https URLs).
  • Regularly back up your website data.
  • Protect your site from user spam.
  • Keep your CMS and plugins updated in a timely manner.
  • Hide login URLs, require strong passwords, and add two-step verification.
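Mixed content, mentioned above, can be roughly detected by scanning for http:// resources referenced from an HTTPS page. A naive sketch that only inspects src/href attributes in the raw HTML (real pages may also load resources from CSS or scripts):

```python
import re

# Matches insecure http:// URLs in src/href attributes
MIXED = re.compile(r'(?:src|href)=["\'](http://[^"\']+)["\']', re.I)

html = """<html><body>
<img src="http://cdn.example.com/logo.png">
<a href="https://example.com/about">About</a>
<script src="https://example.com/app.js"></script>
</body></html>"""

print(MIXED.findall(html))  # ['http://cdn.example.com/logo.png']
```

Browser devtools and dedicated crawlers catch more cases, but a scan like this is a quick way to triage a large set of pages after migrating to HTTPS.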

What was the toughest Google penalty you’ve ever fixed?

This list of Google penalties is not meant to be exhaustive. Rather, it shows the different kinds of issues that may cause a sanction and how to fix them. So, if you happen to get a penalty, you’ll know what to do (though I hope you won’t get any, because white-hat SEO is the best, right?)

What was the hardest Google penalty you’ve ever recovered from? We’d love to hear from you in our user groups on Facebook and Twitter.
