There is no need to convince you that an SEO audit is important. You already know that. But it still gives you shivers sometimes, right?
Today, I invite you to break the daunting SEO audit down so that it becomes a no-brainer.
Here is a list of all the things you should check on a site to avoid sudden ranking drops. Follow it, and search engines will have far fewer reasons to pick on your site.
You can also download a PDF cheat sheet for quick reference whenever you need it.
– Affected aspects: Indexing, user experience, brand reputation, security, rankings –
Let’s start with a domain name audit, as your domain determines your entire online presence. You need to make sure your audience can easily remember your website’s address and find it again later.
When choosing a domain name, you may have to buy an existing one. Before the purchase, make sure to trace the history of your future domain. It may turn out that this domain was involved in spammy activity, was penalized before, or had a bad reputation among users. This may influence your future site rankings.
To learn how the site looked before and what content it delivered, use the Internet Archive’s Wayback Machine. With it, you can check whether the site was full of spammy or, vice versa, quality content, or whether it was redirected to inappropriate sites.
Besides, you can use a WhoIs service to check who the previous owner(s) of the domain were.
If several versions of your site coexist at the same time, make sure users get access to only one of them. Typically, duplicate versions appear in two ways.
First, there may be www and non-www versions:
Second, there may be HTTP and HTTPS versions:
Each of these versions is considered a separate site. Hence, only one of them should be made the master version. Otherwise, search engines will index several sites, which in turn will cause duplicate content issues. This may eventually affect your site rankings.
To see if that’s the case on your website, check the Redirects report in WebSite Auditor. For that, go to Site Structure > Site Audit > Redirects and check out the first two factors:
If there are any issues, these two factors will be marked with a red Error icon.
Note: The multiple versions issue can be prevented in the following way:
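Under the hood, the fix is usually a single server-side rule that 301-redirects every alternative version to the master one. Here’s a minimal Python sketch of that normalization logic (treating the non-www HTTPS version as master is an assumption – pick whichever version you prefer):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_redirect(url):
    """Return the URL to 301-redirect to, or None if the URL is already canonical.

    Assumes the non-www HTTPS version is the chosen master version.
    """
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):          # strip the www. prefix
        host = host[4:]
    canonical = urlunsplit(("https", host, parts.path or "/", parts.query, parts.fragment))
    return None if canonical == url else canonical
```

Any request whose URL comes back non-None should be answered with a 301 pointing at the returned address.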
When you have an established name in a niche (big or small), that name may be exploited by fraudsters. How?
When users make a typo (a misspelling, another TLD, a hyphenated spelling, etc.) while typing in your domain name, they may accidentally end up on an alternative website set up by cybercriminals. As a result, your business may suffer losses due to traffic redirection. Besides, your reputation may be hurt if your name becomes associated with malicious practices.
Note: You can also register all the possible domain name variations and redirect them to the correct version. For example, we do exactly that with our site: if you type in https://linkassistant.com, you will end up on https://link-assistant.com anyway.
– Affected aspects: Crawlability and indexing, user experience, rankings, revenue –
If your site is larger than a couple of pages and is growing bigger, revising its structure once in a while becomes a must.
Make sure you organize the content on your site in a way that satisfies both users and search engines. Your task here is to check if the relationships between pages on a site (its taxonomy) are logical.
Thus, make sure there are:
You can only audit and revise your site taxonomy manually. However, base your approach on data from Google Analytics (the Behavior Flow report, Pages per Session, Bounce Rate) and keyword research.
Watch out for your important pages’ click depth. From search engines’ perspective, the placement of pages in a site structure is a signal of importance. Thus, a page that’s buried too deep in a site structure tree gets less weight (unless it’s got a lot of backlinks).
To check your important pages’ click depth, move to WebSite Auditor’s Pages report (in the same Site Structure module). Find your top pages (sort them by Organic Traffic or any other important metric) and check their click depth:
If you find a strategically important page with a click depth of 4 or more, consider re-organizing your site structure so that this page gets closer to the homepage.
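If you export your internal link graph, click depth is just a breadth-first search from the homepage. A quick illustrative sketch (the page paths are made up):

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over the internal link graph; depth = minimum clicks from the homepage."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site: each key links out to the listed pages
site = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/tool"],
    "/blog/post-1": ["/products/tool"],
}
```

Pages whose depth comes out at 4 or more are the ones worth moving closer to the homepage.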
– Affected aspects: Crawling and indexing, PageRank distribution, rankings –
Internal linking is deeply connected to site structure and greatly depends on it. However, there are things that you should consider besides it.
Some case studies clearly show that the number of internal links a page gets correlates with its traffic. Thus, the more internal links, the better. However, the rule holds only up to a point – when linking starts to look unnatural and spammy, it’s a no.
Moreover, links stand out. If there are too many blue links and colored buttons, the content becomes a bit stuffy and indigestible. We’ve got enough visual noise already.
On top of that, too many internal links coming from a page dilute its own link juice. So, you need to check the following:
To do that, in WebSite Auditor, go to Site Structure > Pages > Links and technical factors. Here, check the Links to Page and Links from Page columns:
A pro tip: distribute internal links so that PageRank flows from the most weighty pages to the ones that don’t rank that well. This way, you will balance the PageRank across your pages and improve overall site rankings.
Now check the context your links are surrounded by. Google’s John Mueller has said that anchor text is important for ranking. So, here is what you should check to make sure your anchor texts work for your site and do not spoil everything:
To find this information, go to WebSite Auditor > Site Structure > Pages > Links & technical factors.
Click on any page and below you will find all its anchor texts.
Broken links are hard to spot without a special audit. Here are the reasons why broken links may appear:
To find broken links, check the corresponding column in the same Pages report in WebSite Auditor.
Finally, you need to find out if there are orphan pages on the site – the ones that no other page links to. Such pages may never make it into Google’s index.
The easiest way is to quickly check for them via the Visualization tool in WebSite Auditor, where orphan pages are marked in gray.
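Conceptually, an orphan check is simple: collect every page that receives at least one internal link, then subtract that set from the full page list. A small sketch with hypothetical paths:

```python
def find_orphans(all_pages, links):
    """Pages nothing links to (the homepage is exempt, as it's the crawl entry point)."""
    linked = {target for targets in links.values() for target in targets}
    return sorted(page for page in all_pages if page != "/" and page not in linked)

# Hypothetical crawl data: full page list plus the internal link graph
all_pages = {"/", "/blog", "/blog/post-1", "/landing-2019"}
links = {"/": ["/blog"], "/blog": ["/blog/post-1"]}
```

Anything the function returns should either get an internal link from a relevant page or be deliberately retired.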
Navigation is also about internal linking. If set right, it allows users to easily find the needed content on your site. When you check the navigation on your site, pay attention to the following things:
– Affected aspects: User experience, rankings –
Your URL structure is mostly determined by your site structure. However, there are still things you should watch out for.
If your URLs are super long, they are not user-friendly, which may result in a poor user experience. Compare:
You can check your lengthy URLs in WebSite Auditor’s Site Audit report. Find the URLs section > Too Long URLs:
If you see an Error or Warning, think about whether you could shorten the URLs.
Note: Always choose the simplest URLs possible so that even if your page is located far from the homepage, its URL doesn’t look like example 1.
URLs that contain special symbols (?, _, &) and various parameters are considered neither user- nor SEO-friendly. They may be hard to perceive and can cause duplicate content issues. However, such URLs are inevitable if you have faceted navigation and/or pagination, or when you need to track session IDs and website traffic, for example.
To check your site for dynamic URLs, go to WebSite Auditor’s Site Audit report and, in the same URLs section, find Dynamic URLs.
Again, if the issue urgently needs a fix, it will be marked as Error. If it's an orange Warning, consider tackling the issue in the near future too.
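If you prefer to double-check a URL list outside the tool, both checks above are easy to script. A sketch – the 75-character limit here is an assumed ballpark, not an official threshold:

```python
from urllib.parse import urlsplit

def audit_url(url, max_length=75):
    """Flag URLs that are too long or look dynamic (query strings, underscores)."""
    issues = []
    if len(url) > max_length:
        issues.append("too long")
    parts = urlsplit(url)
    if parts.query or "_" in parts.path:   # parameters or special symbols
        issues.append("dynamic or special symbols")
    return issues
```

Run it over an exported URL list and review anything that comes back non-empty.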
– Affected aspects: Rankings, brand reputation –
Now let’s check your content for issues that may be stopping you from reaching the top of the SERPs.
The amount of content should be checked to spot thin content issues. The issue becomes real when a page has too little content and thus provides no value for users.
To prevent that from happening, you first need to check the number of words on each page. In the same Pages report of WebSite Auditor, check the Word Count column.
To streamline the process, filter your pages so that only those with fewer than, say, 300 words remain (300 is a ballpark figure that depends on your site’s niche).
Then look through the pages you just filtered and check the content on each of them. Are these words enough to convey the message and help users? If yes, leave the page as it is. If not, consider updating it with more quality content.
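For a list of pages exported with their word counts, the filtering step looks like this (300 is the same ballpark figure as above):

```python
def word_count(text):
    """Naive word count over visible page copy."""
    return len(text.split())

def thin_pages(word_counts, threshold=300):
    """Pages below the (ballpark) word threshold, sorted for manual review."""
    return sorted(page for page, count in word_counts.items() if count < threshold)

# Hypothetical export: page path -> word count
counts = {"/about": 120, "/guide": 1450, "/contact": 45}
```

The returned pages are candidates for a manual quality review, not an automatic rewrite.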
Sometimes, content may underperform, or its rankings may decline over time. You need to regularly check your site for such low-hanging fruit and optimize these pages so that resources aren’t wasted.
You can find such pages in Google Search Console. Go to the Performance report > Pages, click the Filter icon, and set Position > Greater than > 11. This way, you will find pages that don't perform well enough to appear on the first SERP.
Once detected, make sure to optimize this content further – elaborate on the topic more, add keywords, revise technical SEO aspects like meta titles and descriptions, H1 – H6 tags, etc. To dive deeper into the topic, read our 8-Step Guide for Full Website Content Audit.
Note: Use WebSite Auditor’s Content Editor to make sure you optimize your content enough to beat your SERP competitors.
I bet there are a couple of topics that your competitors have covered and you have missed. You need to close the gap.
To spot not-yet-covered topics, use Rank Tracker’s content gap analysis. Go to Keyword Research > Keyword Gap, and add your main competitors:
You will get a list of keywords you should create and optimize your future content for.
Some information tends to become irrelevant or inaccurate over time. Not taking care of it may harm your reputation and rankings, as users are not likely to return to websites with outdated content.
You can’t really use automation tools to spot such content. You may, however, find all the articles that mention past years in their titles in WebSite Auditor using filters.
Note: If you make changes to your application or service, it’s also worth updating your manuals and visuals accordingly so that you do not confuse newcomers.
To send E-A-T (Expertise, Authoritativeness, Trustworthiness) signals to search engines, your site needs specific content – author bio pages with links to social media profiles, for example. Besides, don’t forget to update your About Us and Contact Us pages with all the necessary information about your company.
Moreover, make sure you don’t mix YMYL and non-YMYL content on one site. Google says mixing them may confuse its systems when ranking a page.
Unfortunately, there is no way to track E-A-T issues automatically. However, you can track your Domain Strength in Rank Tracker and compare it to your competitors’. It will give you a rough estimate of how authoritative your site looks to search engines.
Pop-ups are important for marketing. Plus, there are pop-ups you must put on your site (like cookie consent). However, if they overlap your content and make it less accessible, pop-ups may result in a poor user experience. If they become intrusive, search engines may notice, and your rankings may be lowered.
Check the sizes of your pop-ups so that they do not overlap much content, especially on mobile devices, where screens are small.
You should additionally check the Page Experience report in Google Search Console to make sure that you optimized your pop-ups for Web Vitals correctly.
It sometimes happens that disreputable sites steal your content and publish it under their own names. And it is unlikely that Google will penalize them for scraped content. As a result, you will compete with that other page for the same keywords, and that’s where you may feel the consequences.
To check your site’s content for external duplicates, you can use services like Copyscape. They were created to detect plagiarism. Or, you can use social listening tools like Awario – you can specify a certain phrase from your content as a target and track all the exact matches on the web.
If you find out that your content has been stolen, contact the intruders with your cease and desist request.
– Affected aspects: User experience, site speed, rankings –
Images are just as important as text in terms of content (for ecommerce, sometimes even more important). They are more visible and, technically, they are the largest part of a page.
Below, I’ll walk you through the most important aspects of image optimization. But if you want to learn all the details of image SEO, read our Image SEO Optimization.
You need to make sure your images are in the correct format: PNG, JPEG, WebP, or AVIF.
As for image size, I’d like to say the smaller the better, but it’s not quite true. You should opt for the smallest size that doesn’t damage the quality of the image – the objects in it should remain distinguishable. Consider compressing your images before uploading them to your site. Most image compressors will let you significantly reduce an image’s size without compromising its quality.
You can quickly check the format and size of your images in WebSite Auditor. Go to Site Structure > All Resources > Images and quickly scan the list:
Use filters to speed up the search for under-optimized images – for example, filter for images larger than 200 KB.
Just like with the URL structure, the image file structure is no less important. First, the image name and its URL path help search engines better understand what your image is about. Second, images’ names are a part of the user experience – when they save any of your images, it's great if they are correctly named and not confusing. Compare:
However, don’t rush to change your image file names if you spot some not-very-optimized URLs. Google says it may take months to recrawl new image URLs, since images are crawled less often than pages. Therefore, it may be better not to rename hundreds of images already uploaded to your site, but to take note of all the above for future images.
Alt text also hints to search engines what an image is about. In fact, it's an even stronger signal than the image name and file structure. So, you should check your pages for empty or badly written alt tags.
You can check all your site's alt texts in WebSite Auditor > Site Structure > Site Audit > Images > Empty Alt Text.
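The same check is straightforward to script against raw HTML – flag every img tag whose alt attribute is missing or empty. A sketch using Python’s standard-library parser (the image paths are made up):

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):   # None (absent) or "" (empty) both fail
                self.missing_alt.append(attrs.get("src", "?"))

auditor = AltAudit()
auditor.feed('<img src="/cat.png" alt="ginger cat">'
             '<img src="/logo.png" alt="">'
             '<img src="/hero.jpg">')
```

Every path collected in `missing_alt` needs a descriptive alt text written for it.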
An image is considered broken if:
In these cases, users see such images as follows:
To detect broken images on a site, move to the Broken images factor.
– Affected aspects: Rankings, brand awareness –
Backlinks are one of the most important ranking factors. They make up your site’s authority and directly affect your rankings.
First, you need to check the overall number of your backlinks and track the progress you’ve made over time.
Quickly check that out in SEO SpyGlass > Backlink Profile > Summary.
It’s also worth comparing your figures to your competition's. It will give you an understanding of your place in the competitive landscape. To get that done, in SEO SpyGlass, go to Domain Comparison > Summary.
Note: You can also check your competitors’ linking domains to spot some backlink prospects in the Link Intersection module.
The quality of backlinks plays a significant role in SEO. If you get primarily low-quality backlinks, it may do more harm than good.
That’s why you need to check how authoritative the sites linking to you are. First, open SEO SpyGlass and go to Backlink Profile > Backlinks. There, check both the Domain InLink Rank and the InLink Rank of the specific page linking to your website.
If you see too many links marked in red, consider disavowing them in order to avoid Google penalties.
Additionally, check the Penalty Risk each linking domain and page brings up. For that, go to Backlink Profile > Penalty Risk. There might be a number of reasons why a site/page presents a high penalty risk. You can see the exact reason by clicking the ⓘ icon near each penalty score.
Anchor texts are also a signal of relevance – they provide more context for search engines and users as to what a linked page is about. That's why it’s important to keep track of your anchor texts.
The matter is that too many irrelevant or overly generic anchor texts may look like spam activity. Such links may be considered low-quality and won’t do your site any good.
Check your domain’s anchor texts in SEO SpyGlass > Backlink Profile > Anchor Texts:
From time to time, check your backlink growth for unusual spikes. Rapid backlink growth may indicate that your site has undergone a negative SEO attack – for example, competitors may deliberately point a bunch of spammy links at your site to push it down in the search results.
Of course, it won’t necessarily work for them (Google may simply ignore such spam links). Still, it’s worth checking periodically.
For that, go to SEO SpyGlass > Historical Data. First, find Backlinks, set the needed date range, and see how your backlinks grew. You may notice unusual spikes with a day’s precision.
– Affected aspects: Rankings, user experience, revenue –
If your business operates in several markets and has an international website, you might have already implemented site localization to target different audiences. If so, you need to audit your localization implementation.
A rel="alternate" hreflang setup is required for localization to show the relations between pages. If it’s implemented incorrectly, that may cause a number of issues, like duplicate content, de-ranking, and others.
Here are the things to audit:
To check your site localization issues, proceed to the Localization report in WebSite Auditor > Site Audit.
Go through each point to make sure your international SEO works like a charm.
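One of the most frequent hreflang faults is a missing return link: page A points to page B as its alternate, but B doesn’t point back. Given a map of each URL’s hreflang annotations, the check can be sketched like this (URLs are hypothetical):

```python
def hreflang_issues(pages):
    """pages maps URL -> {lang_code: alternate_url}; every alternate must link back."""
    issues = []
    for url, alternates in pages.items():
        for lang, alt_url in alternates.items():
            back_links = pages.get(alt_url, {})
            if url not in back_links.values():   # no reciprocal annotation
                issues.append(f"{alt_url} has no return hreflang link to {url}")
    return issues

# Hypothetical annotations: /en/ and /de/ are reciprocal, /fr/ is not linked back
pages = {
    "https://example.com/en/": {"de": "https://example.com/de/"},
    "https://example.com/de/": {"en": "https://example.com/en/"},
    "https://example.com/fr/": {"en": "https://example.com/en/"},
}
```

Search engines ignore hreflang pairs without return links, so each reported issue needs a matching annotation added on the alternate page.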
Correct localization isn’t limited to technical aspects. Cultural peculiarities should also be considered: content created for individualistic cultures will most likely not be understood or correctly perceived in some collectivist cultures.
Even design and layout should be localized. For example, Arabic audiences read right to left, so a left-to-right layout won’t work for them.
Besides, search habits differ between countries. People may use different search phrases and different search engines – yes, in some countries Google isn’t the default. In China, for example, it’s mostly Baidu, and in South Korea, Naver.
So, make sure you revise your content optimization strategy as well. Do that with Rank Tracker as it shows your rankings as if searched from a specific location. You can specify the preferred search engines and then add a preferred location to track rankings more precisely.
– Affected aspects: Crawlability and indexing, user experience –
Redirects may affect both the indexation of your pages and user experience, so let’s pay due attention to them.
The most common mistake even professionals make is mixing up the 301 and 302 redirect types. A 301 redirect is permanent, while a 302 is temporary. The confusion mostly happens because many servers set a 302 by default unless you explicitly specify a 301.
Here is why it matters: if you use a 301 redirect, search engines stop indexing the old URL and pass some of its link juice to the new destination.
Conversely, if you use a 302, search engines may continue to index the old URL, and consider the new one as a duplicate, dividing the link juice between the two versions. That may hurt your pages' rankings.
To detect any redirect issues, go to the corresponding section in WebSite Auditor > Site Structure > Site Audit and check out the following:
Everything that is marked in red should be fixed as soon as possible.
Any redirect is a certain burden for your site. An excessive number of redirects may hurt your site speed, and if redirected pages are heavily interlinked, that may significantly complicate crawling and indexing.
To check how many pages on your site are redirected, go to WebSite Auditor, find Site Structure > Pages, and apply a filter for HTTP Status Code = 301 or 302.
To quickly check if the number of redirects can affect your site speed, proceed to Site Structure > Site Audit > Page Speed and find Avoid multiple page redirects. Alternatively, you can find the same information in Google Search Console (see Experience > Core Web Vitals).
If page 1 redirects to page 2, which in turn redirects to page 3, and so on, you have a redirect chain. And if a redirect chain ends up back at the initial URL, it’s a redirect loop.
As a rule, redirect chains and loops are created accidentally and are just a waste of resources (link juice, crawl budget, page speed).
Fortunately, they can be easily detected by WebSite Auditor in the same Redirects report:
Note: If a chain occurs, just redirect from the initial page straight to the destination page, skipping all the intermediate “hops”. And if there is a loop, just remove the redirects.
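Given a map of source-to-destination redirects (for example, exported from your server configuration), chains and loops can be traced programmatically. A sketch:

```python
def trace_redirect(redirects, start, max_hops=10):
    """Follow a redirect map from start; return (final_url, hops, is_loop)."""
    seen = [start]
    url = start
    while url in redirects:
        url = redirects[url]
        if url in seen:                  # we've been here before: a loop
            return url, len(seen) - 1, True
        seen.append(url)
        if len(seen) > max_hops:         # give up on absurdly long chains
            break
    return url, len(seen) - 1, False

# Hypothetical redirect maps
chain = {"/old": "/interim", "/interim": "/new"}
loop = {"/a": "/b", "/b": "/a"}
```

Any trace with more than one hop is a chain to flatten; any trace with `is_loop` True needs its redirects removed.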
– Affected aspects: Security, user experience –
Security is a ranking factor and should be maintained no matter what. Here are the basic things that should be audited first-hand.
You may purchase an SSL certificate once, but make sure it’s renewed on time. Otherwise, your users will get a notification like this one:
You can use any SSL checker to see if your certificate is fine. Also, don’t forget to set up notifications about the upcoming renewals.
To check if there are any HTTPS issues on your website, go to Google Search Console > Experience report > HTTPS.
Besides your site not being served over HTTPS at all, there can be another issue – mixed content, when HTTP and HTTPS resources meet on one page. It weakens the security of the whole page.
To check your website for this type of issue, use WebSite Auditor. Go to Site Structure > Site Audit > Encoding and technical factors > HTTPS pages with mixed content issues:
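The check itself boils down to scanning an HTTPS page’s HTML for sub-resources (scripts, images, stylesheets) requested over plain http://. A sketch with Python’s standard-library parser:

```python
from html.parser import HTMLParser

class MixedContentAudit(HTMLParser):
    """Flag sub-resources loaded over plain HTTP on an HTTPS page."""
    RESOURCE_ATTRS = {"src", "href", "data"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":                   # plain hyperlinks are not mixed content
            return
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                self.insecure.append(value)

audit = MixedContentAudit()
audit.feed('<img src="http://example.com/pic.jpg">'
           '<script src="https://example.com/app.js"></script>')
```

Each flagged resource should be switched to an https:// URL (or a protocol-relative one served securely).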
– Affected aspects: Site speed, user experience, rankings –
Core Web Vitals are not only about speed, as many think – they’re about overall user experience: how fast pages load and how responsive and stable they are.
LCP reflects the render time of the largest image or text block visible within the viewport, relative to when the page first started loading. In plain words, this metric shows how long it takes for the main content to load.
Ideally, your LCP should be less than 2.5 seconds. The time greatly depends on your:
You can check the LCP metric for each page in WebSite Auditor: from Site Structure, move to the Pages report > Page Speed:
You will see the list of all your pages and their LCP scores. Those requiring improvement will be marked in red.
The responsiveness of your pages is measured with FID. Basically, this metric reflects how long it takes the browser to respond to a user’s first interaction with your site (a click on a button or link) while the page is loading.
Ideally, it should be 100 ms or less. Here is what may worsen FID:
Again, check FID for each page in WebSite Auditor. In the same workspace (Site Structure > Pages > Page Speed), find the First Input Delay column:
CLS measures every unexpected layout shift that occurs during the entire lifespan of a page. Such a layout shift happens when a visible element changes its start position.
What may worsen CLS:
To check CLS for your pages, check the column of the same name:
Note: There is a way to check your CWV for your whole site in Google Search Console. For that, go to Experience report > Core Web Vitals. You will immediately see how many pages need better optimization:
Alternatively, you can use WebSite Auditor to check your Core Web Vitals in bulk. From the Site Structure module, go to Site Audit > Page Speed. You will get not only the list of pages that do not pass the CWV assessment but also a list of recommendations that will help you improve these metrics.
I also recommend reading our case study How We Improved Core Web Vitals & What Correlations We Found to learn from SEO PowerSuite’s own experience.
– Affected aspects: User experience, rankings –
Mobile friendliness is the gold standard for a quality website. However, I suggest you first check your mobile traffic and other metrics (like Bounce Rate and Conversions) to understand how much mobile traffic you get and whether you meet the needs of mobile users.
Note that even if you don’t have that many mobile users, working on the mobile-friendliness of your website is crucial anyway. Since Google sticks to mobile-first indexing, it’s the mobile version of your site that Google sees and bases its ranking decisions on.
You can do that in Google Analytics > Audience > Mobile > Overview:
Then check the technical aspects of mobile-friendliness.
Whatever mobile configuration you’ve chosen (responsive design, dynamic serving, or separate URLs), make sure you audit the corresponding aspects:
Note: From an SEO perspective, responsive design is a preferred option. So, choose it over others.
This is something that is easy to check visually using Device Mode in your browser. Pay attention to how the following things look on different devices:
You can check out your mobile usability for any issues in Google Search Console > Experience > the Mobile Usability report. If something is wrong, there will be details:
Alternatively, you can check that out separately for each page with Mobile-Friendly Test.
– Affected aspects: Crawlability and indexing, site speed, rankings –
Now, let’s audit your code and scripts, as these are technical factors that may severely impact your SEO.
You need to keep your code clean and neat. If there are unnecessary elements (for example, multiple head elements, etc.), it may slow down page speed.
Once detected, remove this unused code to speed up your page load.
You can’t do any SEO or marketing without analytics tools. If something is wrong with them, tracking becomes impossible – or, even worse, your data can get into the hands of third parties.
So you need to study your source code for the analytics snippets and make sure they are set up correctly.
If you have similar content on several pages of your site, search engines won’t know which page to rank until you tell them. That’s why you need the rel="canonical" element in your page markup – it tells search engines which version should be given preference.
There may be broken canonical links or multiple canonical URLs.
To detect whether there are any, go to Site Structure > Pages in WebSite Auditor and add the needed columns to your workspace – Rel Canonical and Multiple Rel Canonical URLs:
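Both faults are easy to detect in raw HTML as well: a page should declare exactly one rel="canonical" link. A sketch using the standard-library parser:

```python
from html.parser import HTMLParser

class CanonicalAudit(HTMLParser):
    """Collect every rel="canonical" href declared on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

def canonical_issue(html):
    """Return a description of the fault, or None if exactly one canonical is set."""
    audit = CanonicalAudit()
    audit.feed(html)
    if not audit.canonicals:
        return "no canonical"
    if len(audit.canonicals) > 1:
        return "multiple canonicals"
    return None
```

On a real site you would also verify that the single canonical URL resolves with a 200 status.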
Titles and descriptions should not only be present on each page, be descriptive, and contain a keyword – there should also be no duplicates among them. Besides, both titles and descriptions should be of optimal length.
You can spot issues with your meta titles and descriptions in WebSite Auditor > Site Audit > On-Page:
The aim of H1–H6 tags is to inform search engines of what your page is about and what its structure is. Plus, they help users navigate the page. Examine your H tags and make sure:
You can conveniently audit your H1-H6 tags in WebSite Auditor > Site Structure > Pages > the On-page tab. You just need to add the appropriate columns in your workspace.
Robots meta tags (both the meta robots tag and the X-Robots-Tag HTTP header) help control crawling and indexing. With their help, we tell search engines whether we want them to follow the links found on a page and index the page and the images on it. Sometimes, robots meta tags are also used to control snippets and cached results on SERPs.
Here are the most common values added to a robots tag:
Very often, these tags are implemented incorrectly. For example, some important pages might be tagged as noindex, or a page may be blocked in the robots.txt file and tagged as noindex simultaneously (which makes the noindex tag ineffective, as crawlers never see it).
To avoid possible issues, check if any of your pages with a noindex tag got into the robots.txt file. For that, go to WebSite Auditor > Site Structure > Pages and add the Robots Instructions column to your workspace to see those pages.
Make sure you rebuild the project, enable expert options, and untick the Follow robots.txt instructions option so that the tool can see the instructions but not follow them.
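The conflict described above can also be caught with Python’s built-in robots.txt parser: any noindex-tagged page that robots.txt disallows will never have its tag seen. A sketch with made-up paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content and a list of pages carrying a noindex tag
robots_txt = """\
User-agent: *
Disallow: /private/
"""
noindex_pages = ["/private/page.html", "/public/old-page.html"]

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A noindex tag on a robots.txt-blocked page can never be seen by the crawler
conflicts = [page for page in noindex_pages if not parser.can_fetch("*", page)]
```

For each conflict, either drop the Disallow rule (if you rely on noindex) or drop the noindex tag (if blocking crawling is enough).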
Structured data is needed to embrace all your search appearance opportunities. It helps search engines understand your page faster. Besides, structured data may be your chance to get rich snippets instead of plain blue links (which may result in higher click-through rates).
Different types of pages use their own markup, so we’ll focus on auditing for faults and finding opportunities.
Go to Google Search Console > Enhancements to see if all markups on your website work as intended. If you stumble across any invalid items, make sure to check the reasons behind that.
If you need to audit a specific page, use Rich Results Test.
It may be that you are missing out on something you could implement but haven’t yet. You need to detect the features you can optimize for.
First, check which pages have structured data markup and which do not in WebSite Auditor: Site Structure > Pages > the Open Graph & Structured Data Markup tab:
For those pages that aren’t marked up, think of any possibilities. For example:
Once opportunities are detected and you are ready to implement the markup, use Schema Markup Validator to make sure you did everything correctly.
– Affected aspects: Crawlability and indexing –
Once the rest of the possible issues have been worked out, you need to look at the things that may prevent your page from appearing in search results. And one of the basic things is an XML sitemap, of course.
There can be several issues here:
Having no sitemap is not an issue as such. However, without one, crawlers don't know which pages to prioritize. A sitemap lets you point Google to the pages worth crawling and indicate how often they are updated so that they are crawled accordingly.
And if you have a huge website with a complicated structure and high click depth or an international one, you can’t do without an XML sitemap.
If you run a large site, segmenting your sitemaps by sections is a good SEO practice.
By creating sub-sitemaps, you manage your crawl budget more effectively. So make sure you have several sitemaps based on, for example, how static your pages are.
There are a dozen possible faults that may cause issues with your sitemap – a wrong format, wrong XML tags, or an incorrect sitemap URL, for example.
You can observe your sitemaps in Google Search Console > Index > Sitemaps.
If something is wrong with your sitemap, you will see the corresponding status there.
A sitemap may become outdated (when some pages have already been removed from your site or redirected but are still listed in the sitemap), or you simply might have put the wrong URLs into your sitemap.
So first of all, you need to check your sitemaps for pages that shouldn’t be included:
Note: Generate your XML sitemaps right in WebSite Auditor’s Sitemap Generator to avoid any mistakes.
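If you generate sitemaps yourself in a build script, the exclusion rules boil down to this: only live (status 200), indexable URLs go in. A sketch using the standard library (the page data is hypothetical):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical crawl export: URL, HTTP status, and whether the page is noindexed
pages = [
    {"url": "https://example.com/", "status": 200, "noindex": False},
    {"url": "https://example.com/old", "status": 301, "noindex": False},   # redirected
    {"url": "https://example.com/draft", "status": 200, "noindex": True},  # noindexed
]

urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    if page["status"] == 200 and not page["noindex"]:   # only live, indexable URLs
        SubElement(SubElement(urlset, "url"), "loc").text = page["url"]

sitemap_xml = tostring(urlset, encoding="unicode")
```

Redirected, removed, and noindexed pages are filtered out, so the resulting sitemap never contradicts your robots instructions.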
– Affected aspects: Crawlability and indexing –
Finally, the last thing to check is your robots.txt file. It enables you to block the crawling of certain pages. Here are the common issues that might appear:
You may accidentally block the wrong page. Or you may have deleted a page or set a permanent redirect, but the page remained in the sitemap file.
To see what pages are blocked from crawling, go to Site Structure > Site Audit > Indexing and Crawlability. Find the Resources Restricted from Indexing factor and check out the list of pages and their robots instructions.
If you see that some important pages were blocked, fix the issue by removing the corresponding robots.txt rule. You can manage your robots.txt file right in Website Auditor.
There also may be another issue: you block a page but it’s still crawled and indexed because the page is well-interlinked.
You can check what pages are blocked in the robots.txt file but still are linked to with Website Auditor.
For that, go to Site Structure > Pages and look over the pages, their robots instructions, and whether they have any internal links.
Note: You can use Google’s robots.txt Tester to spot any issues that occurred.
Proactive diagnosis, though terribly routine, is better than the no-doubt-exciting rehabilitation of lost rankings.
As you can see, there are plenty of things that can go wrong in terms of SEO. So don’t wait until something bad happens to your site – run an SEO audit regularly. Download the PDF to keep the cheat sheet at hand whenever you need it.