How to Create Listicles That Won't Tank Your Rankings: 10-Step Process

The SEO tactic everybody talked about in 2025 has become a major reason for organic traffic drops in early 2026. According to SEO consultant Lily Ray, multiple major brands lost between 29% and 49% of their Google visibility in late January and early February, and the common thread was self-promotional listicles.

Does this mean you should immediately remove all listicles from your website to avoid getting hit? Not necessarily. While self-promotional listicles at scale clearly triggered Google's quality filters, listicles themselves aren't the problem — it's how they're created and used.

In this guide, I've broken down what actually happened to the sites that lost visibility and outlined a 10-step process to help you audit your existing listicles and create better content going forward. Let's dive in.

What happened to listicles in 2026?

In early February 2026, Lily Ray noticed something unusual: multiple major B2B and SaaS brands experienced sudden, severe visibility drops in Google — all around the same time and all following a similar pattern.

The numbers were brutal:

  • Sites lost between 29% and 49% of organic visibility in just two weeks
  • One $8B B2B brand dropped 49% between January 21 and February 2
  • Damage was concentrated in blog/resource sections (typically 70-90% of total site visibility)
  • All affected sites were B2B or SaaS companies

When Lily investigated what these sites had in common, she found a clear pattern: they had all published self-promotional listicles at scale. The problem wasn't just about creating listicles — it was the approach. These articles presented themselves as objective, independent evaluations by using "best" in their titles, but then ranked the publisher's own product at #1 without transparent disclosure or evidence of actually testing competitors. 

Other problematic tactics across the analyzed websites included:

  • AI-generated content (100% detection scores when tested)
  • Artificial freshening (adding "2026" to titles without substantial updates)
  • Schema markup violations (misusing AggregateRating)
  • Programmatic templates scaled across hundreds of pages
  • Excessive informational/definition-based content
  • Rapid content scaling with minimal editorial oversight

The impact extended beyond traditional search results. SEO consultant Glenn Gabe tracked several affected sites and found that drops in organic visibility directly correlated with drops in Google's AI Overviews. This makes sense: AI Overviews pull from Google's core search results, so if a site loses visibility in one, it loses it in the other.

The key takeaway isn't that listicles are dead or that you should never include your own product in comparison content. The issue was treating this as a volume play: creating hundreds of self-promotional listicles, using AI to scale them with minimal oversight, and making them the biggest part of the content strategy. 

I’ve put together a simple checklist with tips and workflows to help you improve your blog and listicle strategy.


Step 1: Audit your existing listicle footprint

Before you add any new listicles to your site, it's worth taking a look at how many you already have.

The quickest way to check is with a simple site search:

site:yourdomain.com intitle:best "1. YourBrand"

Or if your listicles live in a subfolder:

site:yourdomain.com/blog/ "best" "1. YourBrand"

This surfaces pages where you're both ranking for a "best X" keyword and featuring yourself in the list, which is the pattern Google has been scrutinizing most.

Once you have a number, do a quick gut check. Under 10 is low risk. Somewhere between 10 and 25 is worth keeping an eye on. Above 25 and you're in territory where the balance starts to tip. Sites with 100+ self-promotional listicles were among the hardest hit — some had anywhere from 61 to 340 of these pages indexed.

For a more precise picture, divide your self-promo listicle count by your total indexed pages and multiply by 100. If that number is above 5–10%, it's worth pausing on new listicles until you've audited what's already there.
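If you want that gut check in reusable form, here's a minimal sketch. The thresholds mirror the rules of thumb above; they're heuristics from this guide, not official Google limits:

```python
def listicle_risk(self_promo_count: int, total_indexed: int) -> str:
    """Classify a site's self-promotional listicle footprint.

    Thresholds follow the rules of thumb above; they are heuristics,
    not official Google limits.
    """
    share = self_promo_count / total_indexed * 100
    if self_promo_count < 10 and share < 5:
        return f"{share:.1f}% of indexed pages: low risk"
    if self_promo_count <= 25 and share <= 10:
        return f"{share:.1f}% of indexed pages: worth monitoring"
    return f"{share:.1f}% of indexed pages: pause new listicles and audit"


# A 500-page site with 40 self-promo listicles is over both thresholds
print(listicle_risk(40, 500))
```

Swap in your own counts from the site: search and Search Console's indexing report.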

Step 2: Identify topically relevant listicle opportunities

Not every "best X" keyword is worth your time, even when the search volume looks good. Before you start building a list of tools or products, it's worth asking a simple question: does this topic actually belong on your site?

A project management SaaS company writing about "best project management software" is a natural fit. The same site writing about "best productivity apps" might be a bit of a stretch. The further you drift from what your site is actually about, the harder it is to rank — and the less useful that traffic tends to be when you do.

Here's how to do a quick relevance check in RankDots:

  1. Open RankDots and input your main seed keywords
  2. Navigate to the keyword clustering view
  3. Look for clusters that include "best [your topic]" variations
  4. Check whether your listicle idea appears in (or is close to) the found ideas

Example: if you run an email marketing website, you might surface listicle ideas like "best newsletter platforms" or "best email automation solutions."

You can also use Rank Tracker's keyword research module to identify opportunities:

  1. Open Rank Tracker and create a project for your domain.
  2. Go to Keyword Research > Keyword Gap.
  3. Enter your top competitors’ URLs.
  4. Select ‘Competitor keywords (any competitor but not your site)’ and click Search.
  5. Filter for keywords containing "best" or "top".

Pick the topics and keywords with the highest traffic potential and let’s move on to the next step.
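If your keyword tool exports to CSV, a quick filter can pull the listicle-style candidates out of a large keyword-gap export. This is a hypothetical sketch: the "keyword" and "volume" column names are assumptions, so rename them to match your actual export.

```python
import csv


def listicle_opportunities(path: str, min_volume: int = 100) -> list[dict]:
    """Filter an exported keyword-gap CSV for listicle-style keywords.

    Assumes columns named 'keyword' and 'volume' (adjust to match your
    export), keeps keywords containing the word 'best' or 'top', and
    sorts the survivors by search volume, highest first.
    """
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    hits = [
        r for r in rows
        if any(w in r["keyword"].lower().split() for w in ("best", "top"))
        and int(r["volume"]) >= min_volume
    ]
    return sorted(hits, key=lambda r: int(r["volume"]), reverse=True)
```

Matching whole words (via `split()`) avoids false hits like "laptop" for "top"; raise `min_volume` if the list comes back noisy.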

Step 3: Choose your listicle type

Here's something most SEO guides skip: not all listicles are created equal.

The lowest-risk option is a roundup or curation piece, where you're simply compiling the best tools, examples, or resources in your space, no self-promotion involved. These also tend to attract links naturally, since you're doing the legwork of curating for other people.

Topic-adjacent listicles follow the same logic. Instead of "best project management software" (where you'd be tempted to include yourself), you write "best free project management templates" or "best Gantt chart examples." Still relevant to your audience, still good for topical coverage, but without the conflict of interest.

Then there's the transparent comparison — where you're on the list. This is where most brands get into trouble, not because the format is wrong, but because they skip the "transparent" part. If you've genuinely tested every tool you mention and you're upfront about being one of the options, this can work really well. If you haven't? Readers clock it immediately, and so does Google.

When in doubt, default to the roundup. It's easier to write well and harder to get wrong.

Step 4: Research and test (actually do it)

This is the step most listicle writers skip, and it's exactly why most listicles feel hollow.

Real research means using the product. Sign up and spend some time actually running it through the use cases your audience cares about. Document the actual pricing, including the stuff buried in the fine print. Note what breaks, what's slow, what support is like when something goes wrong.

As you go, keep a simple spreadsheet with a row for each tool: name, pricing, pros, cons, who it's best for, and a column for screenshots.

One shortcut worth knowing: if you genuinely don't have time to test five or ten products properly, don't write that listicle. Write a topic-adjacent one instead — "best books on project management" or "best YouTube channels for PMs" — where the research bar is more manageable.

Step 5: Structure your content with quality signals

Google's Helpful Content and review systems are specifically looking for evidence that you've done real evaluation, not just assembled a list. The good news is that showing this doesn't require anything fancy. It mostly comes down to being explicit about how you worked and consistent in how you present each option.

Start with a methodology section at the top of the article. This is the part most listicles skip, and it's probably the single biggest signal of genuine effort. Keep it short — a paragraph or two — but cover the basics:

  • How you tested each option and for how long
  • What criteria you used to evaluate them
  • Any relevant background or expertise you're drawing from

It doesn't need to be formal. It just needs to exist, and it needs to be specific. "We tested each tool for two weeks across three use cases" is more convincing than "we did extensive research."

From there, make sure you're evaluating every option against the same criteria. Readers notice when one tool gets a pricing breakdown and another doesn't, or when limitations only appear for some entries.

Something like this works well:

  • Feature set
  • Pricing tiers
  • Ease of use
  • Ideal use case
  • Limitations

Then, for each individual entry, follow a repeatable structure. It doesn't have to be rigid, but having a template keeps the piece scannable and signals thoroughness:

Product Name

Best for: [Specific use case]

Pricing: [Actual pricing with link]

[Screenshot of actual product interface]

What we liked:

What could be better:

Bottom line: [2-3 sentences]

The "what could be better" section matters more than people think. A listicle where every tool gets glowing coverage with no real drawbacks reads like sponsored content — even if it isn't. One honest limitation per entry does more for your credibility than five things you liked.

Step 6: Handle self-inclusion transparently (if applicable)

If your own product is going on the list, you need to be upfront about it — and not just in a vague, easily missed way. The disclosure should be at the top of the article, before the list starts, and it should say something direct:

"Full disclosure: [Your Product] is our product. We've included it because we think it's a strong option for [use case] — but we've also tested every other tool on this list and tried to give an honest picture of each."

That's really all it needs to say. The point is that readers know your position before they start reading, not after they've already absorbed your recommendations.

Beyond the disclosure, there are a few things that will make or break whether the piece actually feels credible:

  • Include real competitors. This sounds obvious, but it's tempting to load the list with obscure or weaker alternatives that make your product look better by comparison. Readers who know the space will notice immediately. List the three to five tools people are actually choosing between, even if some of them are better than yours in certain areas.
  • Give your product real cons. Every entry needs genuine drawbacks, including yours. If your product's review reads like a feature highlight reel while everyone else gets honest criticism, the whole piece falls apart.
  • Use "we" and "our" throughout. When you're talking about your own product, don't suddenly switch to third-person or refer to it the same way you'd refer to a competitor. Keeping "our product" language consistent throughout makes the bias visible — which, counterintuitively, is exactly what you want.

Done well, a self-inclusive listicle can actually be one of your strongest pieces of content. Done poorly, it reads like an ad that forgot to say it was an ad.

Step 7: Add evidence of genuine testing

This is the step that separates a real review from something that could have been written without ever opening the product.

The bar here isn't high — you don't need a lab report, you just need specifics. "We imported a 500-row CSV to test the bulk upload" is more convincing than "the import feature works well." "Load time averaged 2.3 seconds compared to 4.1 seconds on the competitor" is more convincing than "it felt faster." Concrete details are hard to fake, which is exactly why they work.

A few types of evidence that carry real weight:

  • Screenshots from your actual account — showing navigation, settings, real usage. Not the polished visuals from the product's marketing page. If your cursor is visible or your account name is in the corner, even better.
  • Specific metrics you measured — load times, export speeds, row limits, API call counts. Anything you actually clocked or hit a wall on.
  • Real scenario testing — describe the actual test. "We ran this with a team of five for two weeks" or "we tested the free plan to see what breaks first" gives readers something to anchor to.
  • Before/after results where relevant — if you were solving a specific problem, show what changed.
  • A video walkthrough — even a short screen recording embedded or linked goes a long way. Almost no one bothers, which makes it stand out.

What doesn't work: stock screenshots from the company's own site, descriptions that could apply to any tool in the category, or phrases like "in our opinion" with nothing backing them up. Vague language reads as a red flag now, not a neutral hedge.

The more specific you are, the harder your content is to replicate and the more useful it is to someone actually trying to make a decision.

Step 8: Optimize for visibility

Good research and honest writing get you most of the way there. But there are a few technical details worth getting right before you hit publish — the kind of stuff that's easy to overlook and quietly works against you if you don't.

Author byline. Put a real person on the piece, with a short bio that explains why they're qualified to write it. "Written by our content team" doesn't cut it anymore. If the author has genuinely tested the products, say so in the bio. Implementing author schema markup alongside this is worth the extra few minutes — it reinforces the authorship signal in a way Google can actually parse.

Top10VPN, for example, introduces their author as a recognized VPN expert who's tested hundreds of services and whose research has been featured in the BBC and The New York Times. The article also links to a detailed author bio and includes a fact-checked note.

Publishing and update dates. Use the real publish date, and only update the "last modified" date when you've actually made meaningful changes — new testing, updated pricing, swapped out a tool. Bumping the date to include "2026" in the title without touching the content is the kind of thing that used to work and now tends to backfire.

Internal linking. Your listicle shouldn't sit in isolation — it should connect to the broader topic cluster it belongs to. Three to five contextual links to related articles signal to Google that this page is part of a coherent content hub rather than a standalone piece chasing a keyword. Link naturally, in places where the related content actually adds something for the reader.

Schema markup. If you have real ratings from genuine testing, Review schema is worth adding. If you don't, skip it — misusing AggregateRating schema is one of those small things that can create bigger problems than it solves.
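For reference, a minimal Review snippet, generated here with Python's json module, looks like the following. Every value is a placeholder; only publish something like this if the rating comes from your own documented testing, and validate the output with Google's Rich Results Test before shipping.

```python
import json

# Minimal Review JSON-LD for one product entry on a listicle.
# All values below are placeholders - only emit this markup if the
# rating comes from your own documented testing.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {
        "@type": "SoftwareApplication",
        "name": "Example Tool",
        "applicationCategory": "BusinessApplication",
    },
    "author": {"@type": "Person", "name": "Jane Doe"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
    "reviewBody": "Tested for two weeks across three use cases.",
}

# Wrap in the script tag that goes in the page's <head> or <body>
print(f'<script type="application/ld+json">{json.dumps(review, indent=2)}</script>')
```

Note that this is a single Review, not AggregateRating: AggregateRating implies many independent ratings, which is exactly the misuse the affected sites were flagged for.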

None of this is complicated, but skipping it is a common way for otherwise solid listicles to underperform. Get the content right first, then make sure the technical layer matches the effort you put in.

Step 9: Set up monitoring and maintenance

Publishing a listicle isn't the finish line. If anything, it's when the real work starts, because the sites that get hit hardest by algorithm updates are usually the ones that published and moved on without ever checking what happened next.

The basics are all in Google Search Console. Check overall site impressions weekly, and pay attention to any drops that follow a new listicle going live. It's not always a direct cause-and-effect relationship, but it's worth knowing whether your broader traffic held steady after publishing.

From there, track your listicle URLs specifically — how they're performing week over week, and how that compares to your non-listicle content. If your roundups are consistently underperforming compared to the rest of the site, that's a signal worth acting on.

Also, check which search queries are bringing people to your listicles. Sometimes a piece ranks for something loosely related to your topic but not really what your site is about — and that's worth keeping an eye on.

On a monthly basis, run through this quick checklist:

  • site:yourdomain.com intitle:best — track your listicle count
  • Check for any listicles showing a steady traffic decline (refresh or consolidate)
  • Look for site-wide drops that might signal a broader quality issue
  • Scan whether any competitors in your space got hit by a recent update — useful early warning
  • Ask honestly: should I pause on new listicles and strengthen what's already there?
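The "steady traffic decline" check lends itself to a small script. This sketch compares two Google Search Console "Pages" exports (previous period vs. current) and flags listicle URLs whose clicks dropped sharply. The "Top pages" and "Clicks" column names match the standard GSC CSV export, but double-check them against your own files before relying on this.

```python
import csv


def flag_declining_listicles(prev_csv: str, curr_csv: str,
                             marker: str = "best",
                             drop_threshold: float = 0.30) -> list[tuple]:
    """Compare two GSC 'Pages' exports and flag listicle URLs whose
    clicks fell by more than drop_threshold between periods.

    'marker' is a URL substring identifying your listicles; the column
    names assume the standard GSC export (verify against your files).
    """
    def load(path: str) -> dict:
        with open(path, newline="", encoding="utf-8") as f:
            return {r["Top pages"]: int(r["Clicks"]) for r in csv.DictReader(f)}

    prev, curr = load(prev_csv), load(curr_csv)
    flagged = []
    for url, before in prev.items():
        if marker not in url or before == 0:
            continue
        after = curr.get(url, 0)  # URL missing from current export = 0 clicks
        if (before - after) / before > drop_threshold:
            flagged.append((url, before, after))
    return flagged
```

Run it monthly alongside the checklist above; two consecutive flags on the same URL is a good trigger for a refresh-or-consolidate decision.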

Step 10: Find your next listicle idea

Once you have the process down, the question becomes where to look for ideas that are worth the effort. A few places that tend to surface genuinely good opportunities:

Your own search data. GSC's query report often contains "best [topic]" queries you're already getting impressions for but haven't directly addressed. These are warm opportunities — Google already associates your site with that territory.

Competitor gap analysis. Tools like Rank Tracker let you see what listicle-style content your competitors are ranking for that you haven't covered. Look for gaps in your topic cluster, not just high-volume keywords.

Reddit and community forums. The questions people ask in subreddits, Slack groups, and forums in your niche are a reliable proxy for what your audience actually wants to know. "What's the best tool for X?" threads show you real demand before it shows up cleanly in keyword data.

Your existing content. Look at your highest-traffic educational articles and ask what "best X" question a reader might naturally have after reading them. A listicle that answers the next logical question your best content raises is already topically grounded.

The best listicle ideas aren't the ones with the highest search volume. They're the ones that fit naturally into what your site is already about, that you can genuinely research well, and that answer something your audience is actually asking.

Conclusion

The sites that got hit weren't penalized for publishing lists. They were penalized for publishing lists that existed to rank rather than to help.

Follow the steps in this guide and you're already doing something different. The bar isn't as high as it seems after a big update. It just requires doing the work most people skip.
