Technical SEO Checklist: 12 Steps to a Technically Perfect Site

By: Masha Maksimava | Updated September 14, 2020
 

We discuss off-page SEO a lot. Building, managing, and auditing your backlinks is a critical aspect of SEO, and it keeps getting trickier. On-page SEO is a hot topic too, particularly now in the era of Google's semantic search, when old-school SEO tactics don't work as well as they used to.

No doubt those are very important aspects of SEO strategy, but there's one thing we tend to forget: SEO isn't just off-page or on-page. The technical part of the process is just as important; in fact, if you don't get the technical foundation right, your other SEO efforts might bring no results at all.

Off-page, on-page and technical SEO

In today's SEO guide, I'll focus on the main aspects of technical SEO and the SEO audit tools that will help you maximize usability, improve search engine crawling and indexing, and ultimately lift your site's rankings.

I suggest going step by step through our technical SEO audit checklist:

 

1. Check indexing.

First, let's see how your site is indexed by search engines. The quickest way is to enter site:yourdomain.com into Google's search box.

Site search with operators

However, this approach is only reliable for smaller sites of up to around 500 URLs. To check indexing of larger sites, you can use the SEO audit tool from WebSite Auditor. Fire up the tool, enter the URL of your site to create a project, and jump to Domain Strength to see the number of indexed pages.

Site indexing check with Website Auditor

Ideally, this number should be close to the total number of your site's pages (which you can see under Site Structure > Pages in your WebSite Auditor project) minus the ones you deliberately restricted from indexing. If there's a bigger gap than you expected, you'll need to review your disallowed pages. Which brings us to a more in-depth SEO audit.
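If you'd like a quick, tool-independent baseline for that "total number of pages" figure, you can count the URLs listed in your XML sitemap. Here's a minimal Python sketch (the domain is a placeholder, and it assumes a single sitemap file rather than a sitemap index):

```python
# Count the URLs listed in an XML sitemap to get a rough "total pages"
# baseline to compare against the site: operator result.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.fromstring(resp.read())

urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs listed in the sitemap")
```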

 

2. Find resources restricted from indexing.

You might be tempted to simply look through your robots.txt file to make sure your important pages are crawlable. But in reality, your robots.txt file is only one of several ways to restrict resources from indexing. What about the noindex meta tag, the X-Robots-Tag header, or orphan pages that aren't linked to internally? What about the JavaScript and CSS files that could be critical to rendering your content? For a comprehensive crawlability check, you will definitely need a technical SEO audit tool.
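Before running a full crawl, you can also spot-check a single URL yourself. The sketch below (assuming the requests and beautifulsoup4 packages are installed; the URL is a placeholder) checks the three most common blockers: robots.txt rules, the X-Robots-Tag header, and the robots meta tag.

```python
# Spot-check common indexing blockers for one URL: robots.txt rules,
# the X-Robots-Tag header, and the robots meta tag.
import urllib.robotparser
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def indexing_blockers(url, user_agent="Googlebot"):
    issues = []

    # 1. robots.txt disallow rules
    parts = urlparse(url)
    rp = urllib.robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    if not rp.can_fetch(user_agent, url):
        issues.append("blocked by robots.txt")

    # 2. X-Robots-Tag HTTP header
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        issues.append("noindex via X-Robots-Tag header")

    # 3. robots meta tag in the HTML
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and "noindex" in meta.get("content", "").lower():
        issues.append("noindex via robots meta tag")

    return issues or ["no indexing blockers found"]

print(indexing_blockers("https://www.example.com/some-page/"))  # placeholder URL
```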

To get the full list of all blocked resources with WebSite Auditor SEO crawler, jump to Site Structure > Site Audit, and click on Resources restricted from indexing.

Check in Site Audit resources blocked from indexing

If any of the resources on the list aren't supposed to be blocked, check the Robots instructions column to see where the disallow instruction was found, so you can fix it quickly.

Configure Crawler settings in Project Preferences or at rebuild
 

3. Inspect your URLs in Google Search Console.

The general rule is that URLs should be SEO-friendly: that is, URLs have to be descriptive (containing keywords), rather short (not exceeding 75 characters), and static (without "?", "_", or parameters).
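If you want to sanity-check URLs against these rules in bulk, a few lines of Python will do; the example URL below is a placeholder, and "descriptive" obviously can't be judged automatically.

```python
# Flag URLs that break the simple friendliness rules above.
from urllib.parse import urlparse

def url_warnings(url, max_length=75):
    warnings = []
    parsed = urlparse(url)
    if len(url) > max_length:
        warnings.append(f"longer than {max_length} characters")
    if parsed.query:
        warnings.append("contains query parameters")
    if "_" in parsed.path:
        warnings.append("contains underscores (hyphens are preferred)")
    if any(c.isupper() for c in parsed.path):
        warnings.append("contains uppercase characters")
    return warnings

print(url_warnings("https://www.example.com/Blog_Posts/article.php?id=17&sort=asc"))
```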

Google Search Console's URL Inspection tool lets you check whether a specific page suffers from any SEO problems. It reports on indexing issues, AMP pages related to the URL, structured data errors, and problems with canonicals and duplicate URLs.

Website Auditor discovers URL issues

To find crawl errors across the site, head for the Coverage report in Google Search Console: it shows which URLs have been indexed correctly, validated with warnings, or blocked by robots.txt instructions, meta tags, canonical tags, etc. Click on an error to find the affected URLs, fix them, and validate the fix.

Coverage Report in Google Search Console provides crawl stats
URL parameters in old version of Google Search Console
 

4. Audit your site structure.

A shallow, logical site structure is important for users and search engine bots; additionally, internal linking helps spread ranking power (so-called link juice) among your pages more efficiently.

Visualize your site structure to see internal linking

• Check the click depth of internal links.

As you audit your internal links, check the click depth. Make sure your site's important pages are no more than three clicks away from the homepage. In WebSite Auditor, jump to Site Structure > Pages, then sort the URLs by Click depth in descending order by clicking on the header of the column twice.

Pagination of blog pages is necessary for discoverability by search engines, though it increases click depth. Use a simple structure along with a powerful site search to make it easier for users to find any resource.
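If you'd like to measure click depth outside of a dedicated crawler, here is a minimal breadth-first crawl sketch (the start URL is a placeholder; a real crawl should also respect robots.txt and throttle its requests):

```python
# Breadth-first crawl from the homepage to measure click depth.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"  # placeholder homepage
DOMAIN = urlparse(START).netloc
MAX_PAGES = 500  # safety cap for the sketch

depths = {START: 0}
queue = deque([START])

while queue and len(depths) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == DOMAIN and link not in depths:
            depths[link] = depths[url] + 1
            queue.append(link)

deep_pages = [page for page, depth in depths.items() if depth > 3]
print(f"crawled {len(depths)} pages; {len(deep_pages)} are more than 3 clicks from the homepage")
```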

• Find orphan pages.

Orphan pages aren't linked internally — and thus are hard to find for visitors and search engines. This means that if search engines discover them at all, they are likely to crawl them pretty seldom. To detect them, rebuild your WebSite Auditor project by going to Site Structure > Pages and hitting the Rebuild Project button. At Step 2 of the rebuild, check the Search for orphan pages box and proceed with the rebuild.

Adjust crawler settings to specify search instructions
Once the rebuild is complete, filter your entries by the Orphan page tag.
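Outside of the tool, you can approximate the same check by diffing your sitemap against the URLs an internal-link crawl actually reached (for example, the click-depth sketch above). A minimal sketch with a placeholder domain:

```python
# Sitemap URLs that an internal-link crawl never reached are likely orphans.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

with urllib.request.urlopen(SITEMAP_URL) as resp:
    sitemap_urls = {loc.text.strip()
                    for loc in ET.fromstring(resp.read()).findall(".//sm:loc", NS)}

# Fill with the URLs your crawl reached, e.g. set(depths) from the sketch above.
crawled_urls = set()

for url in sorted(sitemap_urls - crawled_urls):
    print("possible orphan:", url)
```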

For a full SEO guide to making the most of internal linking, see this post.

 

5. Check on your HTTPS content.

Google started using HTTPS as a ranking signal in 2014; since then, HTTPS migrations have become increasingly common. Today, according to the Google Transparency Report, around 95% of traffic across Google is served over HTTPS.

If your site hasn't yet gone HTTPS, you may want to consider an HTTPS migration. If you do decide to go secure, feel free to use the framework from the case study of our own migration to HTTPS at link-assistant.com.

If your site is already using HTTPS (either partially or entirely), it is important to check on the common HTTPS issues as part of your SEO site audit. In particular, remember to check for:

• Mixed content.

Mixed content issues arise when an otherwise secure page loads some of its content (images, videos, scripts, CSS files) over a non-secure HTTP connection. This weakens the security of the page and might prevent browsers from loading the non-secure content or even the entire page. To check your site for mixed content issues, in your WebSite Auditor project go to Site Audit > Encoding and technical factors.

Redirect HTTP content on HTTPS pages to secure protocols
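For a quick spot check of a single page, you can also scan its HTML for resources referenced over plain HTTP. A minimal Python sketch (placeholder URL, assuming requests and beautifulsoup4 are installed):

```python
# Scan a page's HTML for resources referenced over plain HTTP.
import requests
from bs4 import BeautifulSoup

def mixed_content(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    insecure = []
    for tag, attr in (("img", "src"), ("script", "src"), ("link", "href"),
                      ("iframe", "src"), ("video", "src"), ("audio", "src")):
        for node in soup.find_all(tag):
            value = node.get(attr, "")
            if value.startswith("http://"):
                insecure.append(value)
    return insecure

for resource in mixed_content("https://www.example.com/"):  # placeholder URL
    print("loaded over HTTP:", resource)
```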

• Canonicals and redirects.

First, check for duplicate HTTP vs. HTTPS versions of your pages in the Site Audit > Redirects section. If the HTTP and HTTPS versions of your website are not set up correctly, both can get indexed by search engines simultaneously. This will cause duplicate content issues that may harm your website's rankings. Ideally, all links on your HTTPS site, as well as redirects and canonicals, should point to HTTPS pages straight away.

Make sure to set up a proper redirect from HTTP to HTTPS, and watch out for issues with the www and non-www versions of your site.

Second, even if you have the HTTP to HTTPS redirects implemented properly on the entire site, you still don't want to take users through unnecessary redirects — this will make your site appear much slower than it is. Such redirects may be a problem for crawling, too, as you will waste a little of your crawl budget every time a search engine bot hits a redirect.

For a comprehensive list of all non-HTTPS resources on your site, jump to WebSite Auditor's All Resources dashboard. Click on HTML under Internal resources and sort the list by URL (by clicking on the column's header). This way, you should be able to see the redirected URLs of your HTTP pages first (301 status code). For every HTTP page you find, check the Found on pages list at the bottom of the screen for a full list of pages that link to the HTTP page you're examining. Fix the unnecessary redirect by linking to the HTTPS version of the page.

Fix existing HTTP redirects.
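On the canonicals side, a quick supplementary check is to confirm that a page's rel="canonical" points to an HTTPS URL rather than back to the HTTP version. A minimal sketch (placeholder URL, assuming requests and beautifulsoup4):

```python
# Check that a page's rel="canonical" points to an HTTPS URL.
import requests
from bs4 import BeautifulSoup

def canonical_check(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for link in soup.find_all("link"):
        if "canonical" in (link.get("rel") or []):
            canonical = link.get("href", "")
            if canonical.startswith("http://"):
                return f"canonical points to an HTTP URL: {canonical}"
            return f"canonical looks fine: {canonical}"
    return "no canonical tag found"

print(canonical_check("https://www.example.com/some-page/"))  # placeholder URL
```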
 

6. Amplify your crawl budget.

Crawl budget is the number of a site's pages that search engines crawl during a given period of time. Crawl budget isn't a ranking factor by itself, but it determines how frequently the important pages of your site are crawled (and whether some of them are crawled at all). Google's legacy Crawl Stats report shows this data; you can still access it in the old version of Search Console via Legacy tools and reports > Crawl Stats.

Old Search Console crawl stats section.

From the report above, I can see that on average, Google crawls around 7,900 pages of my site per day. From that, I can calculate that my monthly crawl budget is around 240,000 units.

Technically, you can influence the crawl rate, but only to lower it: you can set a limit on the crawl rate in case Googlebot makes too many requests per second and slows down your server. Conversely, when you notice a low crawl rate, try submitting a sitemap (if there are many new URLs) or requesting indexing with the URL inspection tool (for just a few URLs). Typically, the more frequently you publish your blog posts, the more often Googlebot will visit your site.

• Take care of broken links.

When a search bot hits a 4XX/5XX page, a unit of your crawl budget goes to waste. That's why it's important to find and fix all broken links on your site. You can get a full list of those in the WebSite Auditor's Site Audit > Links section, by clicking on Broken links.

Fix broken links.

Broken links not only waste your crawl budget, but also confuse visitors and eat up link juice. It's important to remember that apart from <a> tags, broken links may hide in <link> tags, HTTP headers, and sitemaps. For a comprehensive list of all resources with a 4xx/5xx response code, it's best to check your site's resources in WebSite Auditor's All Resources dashboard. Click on Internal resources and sort the list by HTTP Status Code (by clicking on the column's header). Now, click on any of the broken resources to see where the links to it hide.

Find missing resources under HTTP Status Code, link tags.
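To re-verify a single page outside of the tool, a short script can check the status code of every outgoing link. A minimal sketch (placeholder URL; for a whole-site check you would feed it the URL list from a crawl):

```python
# Report links on a page that answer with a 4xx/5xx status (or no answer at all).
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def broken_links(page_url):
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    broken = []
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, tel:, javascript: links
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

for link, status in broken_links("https://www.example.com/"):  # placeholder URL
    print(status, link)
```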

• Fix redirect chains.

Every redirect search engine bots follow is a waste of a unit of your crawl budget. Moreover, if there is an unreasonable number of 301 and 302 redirects in a row on your site, at some point the search spiders will stop following the redirected URLs, and the destination page may fail to get crawled.

You can get a full list of redirects in WebSite Auditor, along with the list of redirect chains found on your site. Just jump to the Site Audit dashboard and look for 302 redirects, 301 redirects, and long redirect chains.

Fix unnecessary redirects.
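You can also trace a chain by hand with a few lines of Python that follow redirects one hop at a time; the starting URL below is a placeholder:

```python
# Follow redirects one hop at a time and print the full chain.
from urllib.parse import urljoin

import requests

def redirect_chain(url, max_hops=10):
    chain = [url]
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])
        if url in chain:
            chain.append(url + " (redirect loop!)")
            break
        chain.append(url)
    return chain

print(" -> ".join(redirect_chain("http://www.example.com/old-page")))  # placeholder URL
```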

• Restrict indexation of pages with no SEO value.

Think about which pages of your site make no sense to rank in search results: privacy policy, terms and conditions, old promotions, etc. Such pages exist as a formality rather than as marketing assets; they usually sit in the footer or are buried deep in the site structure with a high click depth. These are all good candidates for a disallow rule.

For more detailed tips on crawl budget optimization, jump to this comprehensive guide.

Once you know what your crawl budget is, you're probably wondering how to make the most of it. Which brings us to the next step of our technical SEO site audit: fix the on-page SEO factors that eat into your crawl budget.

 

7. Clean up duplicate content with on-page SEO checks.

A significant part of SEO success depends on on-page factors, mainly title tags and meta descriptions. Page titles tell search engines how relevant a page is to the searched keywords. And if your meta description is empty, search engines will pick what they consider the most descriptive content on the page to show in SERPs.

Duplicate content can do a lot of harm to your rankings. For a hint at duplicate content on your site, check the On-page section in WebSite Auditor's Site Audit dashboard. Pages with duplicate title and meta description tags are likely to have duplicate content inside the body as well.

Rewrite duplicate content.

For every duplicate page you can get rid of — do it. If you have to keep the page, at least make sure to block it from search engine bots. In terms of crawl budget, canonical URLs aren't of much help: search engines will still hit the duplicates and waste a unit of your crawl budget every time. So, check for duplicate or empty title tags and rewrite them. And make sure to create unique, enticing meta descriptions of around 160 characters.
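If you'd like to double-check titles and descriptions outside of the tool, here is a minimal Python sketch that groups a list of URLs by their title and meta description to surface duplicates (the URLs are placeholders; in practice they would come from your crawl or sitemap):

```python
# Group pages by <title> and meta description to surface duplicates.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_duplicates(urls):
    by_title, by_description = defaultdict(list), defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta.get("content", "").strip() if meta else ""
        by_title[title].append(url)
        by_description[description].append(url)
    return ({t: u for t, u in by_title.items() if len(u) > 1},
            {d: u for d, u in by_description.items() if len(u) > 1})

dup_titles, dup_descriptions = find_duplicates([
    "https://www.example.com/page-a/",  # placeholder URLs
    "https://www.example.com/page-b/",
])
print("duplicate titles:", dup_titles)
print("duplicate descriptions:", dup_descriptions)
```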

 

8. Test and improve page speed.

Google expects pages to load in two seconds or less, and they've officially confirmed that speed is a ranking signal. Site speed has a massive impact on UX: slower pages have higher bounce rates and lower conversion rates.

Page speed isn't merely one of Google's top priorities; it's also a ranking signal for both desktop and mobile results. To check if your pages pass Google's speed test, open your WebSite Auditor project and go to Content Analysis. Click Add page, specify the URL you'd like to test, and enter your target keywords. In a moment, your page will be analyzed in terms of on-page optimization and technical SEO. Switch to Technical factors and scroll to the Mobile Friendliness and Page Speed section to see if any technical problems have been found. The audit tool calculates the SEO score for page speed based on Google's Core Web Vitals metrics.

Page speed test for mobile and desktop; keep in mind the mobile-first indexing.

If your page doesn't pass some of the aspects of the test, you'll see the details and how-to-fix recommendations in the right-hand view.

Another source of data on your site speed is Google Analytics. You can get the report with all necessary analytics and page speed suggestions in the Behavior > Site speed module.

Analyze site speed stats in Google Analytics.
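You can also pull speed data programmatically from the PageSpeed Insights API, which returns the Lighthouse performance score and Core Web Vitals field data behind these reports. A minimal sketch (no API key is needed for occasional requests; the exact response fields may change, so check the API docs; the URL is a placeholder):

```python
# Query the PageSpeed Insights API (v5) for a page's performance score
# and Core Web Vitals field data. Treat the response keys below as an
# approximation of the documented format.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def page_speed(url, strategy="mobile"):
    data = requests.get(API, params={"url": url, "strategy": strategy}, timeout=60).json()
    score = data["lighthouseResult"]["categories"]["performance"]["score"]
    field_metrics = data.get("loadingExperience", {}).get("metrics", {})
    return score, field_metrics

score, metrics = page_speed("https://www.example.com/")  # placeholder URL
print("Lighthouse performance score:", score)
for name, values in metrics.items():
    print(name, values.get("percentile"))
```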

9. Make your pages mobile-optimized.

In 2018, after a year and a half of careful experimentation and testing, Google started migrating sites to mobile-first indexing: its algorithms now crawl the mobile versions of websites instead of their desktop versions. This literally means that the mobile version of your pages determines how they rank in both mobile and desktop search results. Currently, mobile-first indexing is enabled by default for all new websites (new to the web or not previously indexed on Google Search). For older or existing websites, Google continues to monitor and evaluate pages against mobile-first indexing best practices.

Here are the most important things to take care of when auditing your mobile site.

 

• Test your pages for mobile friendliness.

Google's mobile-friendly test includes a selection of usability criteria, such as viewport configuration, use of plugins, and the size of text and clickable elements. It's also important to remember that mobile friendliness is assessed on a page basis, so you'd need to check each of your landing pages for mobile friendliness separately, one at a time. You can quickly run the check in WebSite Auditor — Google's mobile-friendly test is incorporated right into the tool. In your project, go to the Content Analysis module, select a page you'd like to analyze, and enter your target keywords. When the analysis is complete, look at the Page Usability (Mobile) section to see if any errors or warnings have been found.

• Run a comprehensive SEO audit of your mobile site.

Having all your important pages pass Google's mobile test is a good start — but there's a lot more analysis to do. A full SEO audit of your mobile site is a great way to make sure that all your important resources are accessible to Googlebot and free from errors.

To do an in-depth mobile website audit, you'll need to run a site crawl with custom user agent and robots.txt settings. In your WebSite Auditor project, jump to the Pages dashboard and click the Rebuild Project button. At Step 2, enable expert options and make sure the Follow robots.txt instructions box is checked; in the drop-down menu next to it, choose Googlebot-Mobile. Right below, check the Crawl as a specific user agent box. In the drop-down menu to the right, pick the second user agent on the list:

Crawl as some specific bot and user agent.

That's the user agent Google uses when crawling mobile versions of pages. In a moment, the tool will conduct a full SEO audit of your mobile website. Remember that any SEO issues you find can equally affect your desktop and mobile rankings, so do look through the traditional SEO factors like redirect chains, broken links, heavy resources, duplicate or empty title tags and meta descriptions, etc.
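A quick complementary check is to fetch the same URL with a desktop browser user agent and with a smartphone Googlebot user agent and compare the responses; big differences in status code or content size are worth a closer look under mobile-first indexing. The user-agent strings and URL below are illustrative placeholders:

```python
# Fetch the same URL as a desktop browser and as a smartphone Googlebot,
# then compare status codes and response sizes.
import requests

DESKTOP_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0 Safari/537.36")
MOBILE_BOT_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                 "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0 Mobile Safari/537.36 "
                 "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

def compare(url):
    for label, ua in (("desktop", DESKTOP_UA), ("mobile googlebot", MOBILE_BOT_UA)):
        resp = requests.get(url, headers={"User-Agent": ua}, timeout=10)
        print(f"{label:>17}: status {resp.status_code}, {len(resp.content)} bytes")

compare("https://www.example.com/")  # placeholder URL
```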

For a full guide to mobile SEO, jump to this post.

 

10. Review your sitemap.

You surely know how important your sitemap is. This file tells the search engines about your site structure and lets them discover fresh content. (If you don't have a sitemap, you should really, really go and create one right now. You can do it in WebSite Auditor by simply starting a project for your site, jumping to the Pages dashboard, and hitting the Sitemap button.)

As you check your sitemap, make sure it is:

• Clean. Keep your sitemap free from errors, redirects, and URLs blocked from indexing; otherwise, you're at risk of search engines ignoring the sitemap like it isn't there.
• Up-to-date. Make sure your sitemap is updated every time content is added to your site (or removed from it) — this will help search engines discover new content fast.
• Concise. Google won't crawl sitemaps over 50,000 URLs. Ideally, you should keep it much shorter than that to ensure that your most important pages are crawled more often: technical SEO experiments show that shorter sitemaps result in more effective crawls.
• Registered in Search Console. Let Google know about your sitemap. You can either submit it manually to Google Search Console or specify its location anywhere in your robots.txt file in the following way: Sitemap: http://yourdomain.com/sitemaplocation.xml

Besides the technical site audit, WebSite Auditor's Webmaster Tools let you quickly generate your sitemap as well as your robots.txt file. For more on sitemaps, refer to Google's guide.
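To illustrate the "clean" requirement above, here's a minimal Python sketch that fetches every URL in your sitemap and flags entries that don't return a plain 200 (the domain is a placeholder; some servers reject HEAD requests, in which case switch to GET):

```python
# Fetch every URL in the sitemap and flag entries that don't return a clean 200.
import urllib.request
import xml.etree.ElementTree as ET

import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = [loc.text.strip()
                for loc in ET.fromstring(resp.read()).findall(".//sm:loc", NS)]
    for url in urls:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(status, url)  # redirects and errors show up here
    print(f"checked {len(urls)} sitemap URLs")

check_sitemap("https://www.example.com/sitemap.xml")  # placeholder domain
```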

 

11. Add structured data markup.

Structured data is a standardized format that tells search engines what your page is about. It's used for specifically organized content, such as reviews, listicles, and FAQs, so that Google can pick them up and show them in SERPs. Thanks to the enriched results in Google searches, it can bring more traffic, increase CTR, and thus improve the rankings of your site.

SERP results feature images on the page marked in structured data.

You can check your structured data in Google Search Console: the URL Inspection tool will report on your rich snippets and on structured data that couldn't be parsed. You can also find this information in WebSite Auditor, under the Pages > Open graph and structured data markup tab.

Structured data can be described with vocabularies such as Schema.org and implemented in formats like JSON-LD or Microdata. For more details about schema markup, jump to our brief SEO guide on structured data.
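As an illustration, here is a minimal sketch that builds Article markup as a Python dict and serializes it to the JSON-LD script block you would place in a page's head; the field values are placeholders:

```python
# Build Article markup as a dict and serialize it to a JSON-LD script block.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist: 12 Steps to a Technically Perfect Site",
    "author": {"@type": "Person", "name": "Masha Maksimava"},
    "datePublished": "2020-09-14",  # placeholder date
    "image": "https://www.example.com/images/cover.png",  # placeholder URL
}

print('<script type="application/ld+json">')
print(json.dumps(article, indent=2))
print("</script>")
```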

 

12. Ask search engines to re-crawl your site.

The above site audit will surely spotlight a few issues on your site that need fixing. Once you fix them, you can explicitly ask Google to re-crawl your pages to make sure the changes are taken into account as soon as possible.

You can send your updated URLs for recrawl from Google Search Console with the help of the URL inspection tool. Enter the URL of the page you want re-crawled and click Request indexing. You can also use Test Live URL (previously known as the Fetch as Google feature) to see your page in its current form, and then request indexing.

The URL inspection tool allows expanding the report for more details, testing live URL and requesting indexing.

Assess when you really need to force a recrawl: for example, after making major changes to the site (you moved from HTTP to HTTPS, introduced structured data, or did substantial content optimization), or when you need an urgent blog post to appear on Google quicker. There is believed to be a limit on the number of recrawl requests per month, so don't abuse it. A recrawl can take from 10 minutes to several weeks. A better alternative for large-scale changes is to submit them via the sitemap. Google doesn't guarantee to index all of your site's pages, but if the site is fairly small, it most probably will.
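One lightweight way to do that from a script is to ping the search engines with your updated sitemap URL; both Google and Bing have supported a simple GET-based ping endpoint (check the current documentation, as such endpoints do get retired). A minimal sketch with a placeholder sitemap URL:

```python
# Ping Google and Bing with the updated sitemap URL (endpoints may be retired;
# check the current documentation before relying on this).
import requests

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder

for engine, endpoint in (("Google", "https://www.google.com/ping"),
                         ("Bing", "https://www.bing.com/ping")):
    resp = requests.get(endpoint, params={"sitemap": SITEMAP}, timeout=10)
    print(f"{engine}: HTTP {resp.status_code}")
```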

There's a similar option in Bing Webmaster Tools, too. Just locate the Configure My Site section in your dashboard and click on Submit URLs. Fill in the URL you need re-indexed, and Bing will typically crawl it within minutes. The tool allows webmasters to submit up to 10,000 URLs per day for most sites.

Run site audits on a regular basis

Last but not least: a regular site audit should be a priority in your SEO strategy. Changes across the web can influence rankings in unpredictable ways, which is why it's wise to set up SEO analyzer tools for regular technical checks, so you can detect SEO problems and fix them as soon as they emerge. For example, you can automate site audits with the professional version of WebSite Auditor: add a task to get a monthly SEO audit report with the criteria you need to monitor, like traffic, backlinks, etc.

In Website Auditor, you can customize the site audit report to get the data you need to monitor regularly (indexing, broken links, on-page, etc.).

Those are the most important technical SEO tips in 2020. What are your thoughts on technical SEO? Which tactics have you found most effective recently? Join the discussion and share your thoughts in the comments below.

