Technical SEO Checklist: 12 Steps to a Technically Perfect Site


SEO can be roughly divided into three branches: on-page, off-page, and technical. On-page SEO deals with content and HTML tags. Off-page SEO is mostly about backlinks. And technical SEO is about whether search engines can access and crawl your website properly.

Off-page, on-page, and technical SEO

And although you need all three to rank well, one can argue that technical SEO actually comes first. You have to build a strong technical foundation for your website before you load it with content and promote it with backlinks. Otherwise it will all come crashing down.

In today’s SEO guide, we’ll focus on how to audit your website for technical issues, optimize your site structure, and improve its speed and accessibility.

Step 1. See if your website is indexed

An obvious place to start is to see whether your website is indexed at all. And by indexed, I mean whether or not it appears in search. The first thing to do here is to google your domain name and look for your website among the search results.

Is your website among search results? Good. Now let’s see how much of your website has actually made it into the search index. A good place to check this is the Coverage report in Google Search Console:

 

The report shows how many of your pages are currently indexed, how many are excluded, and what indexing issues your website has.

Another place to monitor your indexing is the Domain Strength report in WebSite Auditor. The tool shows the number of pages indexed not just in Google, but in other search engines as well:

Site indexing check with WebSite Auditor.

Whichever tool you decide to use, the number of indexed pages should be close to the actual number of pages on your website. So, if you are running, say, an e-commerce website, then the number of indexed pages should correspond to the number of products you’ve uploaded.

Step 2. Fix indexing issues

Generally speaking, there are two types of indexing issues. One is when a page is not indexed even though it’s supposed to be. The other one is when a page is indexed even though it’s not supposed to be.

If you are using Google Search Console to audit your website, then the first type of indexing issue will usually be marked as an error:

 

Indexing errors happen when you’ve asked Google to index a page, but the page is blocked. For example, a page has been added to your sitemap, but is marked with the noindex tag or is blocked with robots.txt. To resolve the issue, check whether the page is supposed to be indexed. If it is, remove whatever is blocking it. If it’s not, remove the page from your sitemap.
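For illustration, here is what such a conflict could look like, using a hypothetical example.com/landing-page/ URL. The sitemap submits the page for indexing:

<url>
  <loc>https://example.com/landing-page/</loc>
</url>

...while robots.txt blocks crawling of the same URL:

User-agent: *
Disallow: /landing-page/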

The other type of indexing issue is when the page is indexed, but Google isn’t certain it was supposed to be indexed. In Google Search Console these pages are usually marked as valid with warnings:

 

These types of indexing issues usually happen when you’ve tried to block the page using robots.txt instructions. It’s a very common mistake to think that robots.txt can keep a page out of the index. In reality, robots.txt only controls crawling, and Google may still index a URL it discovers through links. If you want to reliably keep a page out of search, you have to use the noindex tag.
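If a page really should stay out of search, add a noindex directive to the page’s <head> (or send it as an X-Robots-Tag HTTP header for non-HTML files), and make sure the page is not blocked in robots.txt at the same time, otherwise Google won’t be able to see the tag:

<meta name="robots" content="noindex">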

For a full list of potential indexing issues, I recommend switching to the Site Audit report of the WebSite Auditor. There, you will find indexing issues sorted by type. And on the right is the list of all affected pages and resources:

Indexing issues report from WebSite Auditor

Step 3. Audit your site structure

A shallow, logical site structure is important for users and search engine bots. Furthermore, internal linking helps spread ranking power (so-called link juice) among your pages more efficiently.

As you audit your internal links, check the click depth. Make sure your site's important pages are no more than three clicks away from the homepage. In WebSite Auditor, jump to Site Structure > Pages. Then sort the URLs by Click depth in descending order by clicking the column header twice.

Pagination of blog pages is necessary for discoverability by search engines, though it does increase click depth. Combine a simple structure with a powerful site search to make it easier for users to find any resource.

For a full SEO guide to making the most of internal linking, see this post.

Step 4. Amplify your crawl budget

Crawl budget is the number of pages that search engines crawl during a given period of time. Crawl budget isn't a ranking factor by itself, but it determines how frequently the pages of your site are crawled (and whether some of them are being crawled at all). 

Depending on how important your website is, Google will give you a certain amount of crawl budget. If your pages are light and your resources are easy to access, then this budget will go a long way. But if there are too many redirects and dead ends, then your budget will be spent before it even gets to important pages.

Here is what you can do to amplify your crawl budget:

Ignore low-priority resources

It is common to use gifs, memes, and videos to liven your pages up. These resources are meant for entertaining your users, but they do not actually matter to search engines. The good news is that you can ask Google to ignore these resources and go crawl something else.

To do that, edit your robots.txt file to disallow individual resources:

User-agent: *
Disallow: /images/filename.jpg

Or even certain file types:

User-agent: *
Disallow: /*.gif$

Avoid long redirect chains

When there are too many redirects in a row, Google will usually stop following the trail and move on to other parts of your website, which means some of your pages may never see the light of day. That’s on top of wasting a chunk of your crawl budget for nothing.

To stop this from happening, launch WebSite Auditor, go to Site Structure > Site Audit > Redirects and see if any of your pages have more than two redirects:

 

If you find any such pages, fix the redirects to go straight to the destination page.
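For example, assuming an Apache server and a hypothetical /old-page URL that currently hops through several intermediate redirects, the fix is a single rule pointing straight at the final destination (other servers use different syntax):

# .htaccess: skip the intermediate hops, redirect straight to the final page
Redirect 301 /old-page https://example.com/final-page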

Manage dynamic URLs

It’s common for content management systems to generate a ton of dynamic URLs. Basically, each of your pages can have a few different URLs depending on how you got there or the types of filters you’ve used. Google may view each of these URLs as a different page, even though the content is mostly the same.

To stop Google from crawling a bunch of nearly identical pages, you can ask it to ignore certain URL parameters. To do that, launch Google Search Console and go to Legacy tools and reports > URL Parameters:

 

Click Edit on the right and tell Google which parameters to ignore.
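Alternatively (or in addition), you can point parameterized URLs to their clean version with a canonical tag. A minimal sketch, assuming a hypothetical /shoes page that can also be reached as /shoes?color=red&sort=price, placed in the <head> of every parameterized variant:

<link rel="canonical" href="https://example.com/shoes">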

Take care of broken links

When a search bot hits a 4XX/5XX page, a unit of your crawl budget goes to waste. That's why it's important to find and fix all broken links on your site. You can get a full list of those in the WebSite Auditor's Site Audit > Links > Broken links.

Fix broken links.

Broken links not only waste your crawl budget, but also confuse visitors and eat up link juice. For a full list of all resources with a 4xx/5xx response code, check your site's resources in WebSite Auditor's All Resources dashboard. Click on Internal resources and sort the list by HTTP Status Code (by clicking on the column header). Now, click on any of the broken resources to see where they are linked from.

Find missing resources under HTTP Status Code, link tags.

Step 5. Clean up duplicate content

Duplicate content is not a huge problem for SEO, but it can get a little messy. If there are two or more nearly identical pages on your website, which one is to be indexed? What if Google decides that your duplicates are a sign of plagiarism or spam? It’s best not to risk it.

For a hint on duplicate content on your site, check the On-page section in WebSite Auditor's Site Audit dashboard. Pages with duplicate titles and meta description tags are likely to have nearly identical content as well.

Resolve duplicate content.

If you find duplicate pages, you can either remove them altogether or hide them from search engines (with a canonical tag or the noindex tag, for example).

Step 6. Test and improve page speed

Google expects pages to load in two seconds or less. Meanwhile, the average page speed across the web is around 15 seconds. So, chances are, you have to improve your page speed as well.

Page speed is not an easy thing to measure though, and Google has struggled with this metric for a long time. Finally, it has arrived at Core Web Vitals: three metrics designed to measure the perceived speed of any given page. The metrics are Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS).
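If you want to see one of these numbers for yourself, here is a minimal sketch you can paste into the console of a Chromium-based browser; it uses the standard PerformanceObserver API to log the page's LCP:

new PerformanceObserver((list) => {
  // the last reported entry is the current Largest Contentful Paint candidate
  const entries = list.getEntries();
  console.log('LCP (ms):', entries[entries.length - 1].startTime);
}).observe({ type: 'largest-contentful-paint', buffered: true });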

Recently, all three metrics have been added to the WebSite Auditor. So, if you are using the tool, you can see your score on each metric, a list of page speed issues on your website, and a list of affected pages or resources:

Core Web Vitals report from WebSite Auditor

All you have to do is follow the recommendations on the right, and your page speed will improve in no time. What’s more, if you see many pages affected by the same issue, the issue is likely sitewide and can be resolved with a single fix. So it’s actually not as much work as it seems.

Step 7. Adopt a mobile-first approach

In 2018, after a year and a half of careful experimentation and testing, Google started migrating sites to mobile-first indexing. This means search engines now crawl the mobile versions of websites instead of their desktop versions. And that, in turn, means the mobile version of your pages determines how they rank in both mobile and desktop search results.

Google's mobile-friendly test includes a selection of usability criteria, such as viewport configuration, use of plugins, and the size of text and clickable elements. It's also important to remember that mobile friendliness is assessed on a per-page basis, so you'd need to check each of your landing pages for mobile friendliness separately.
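For instance, the viewport criterion boils down to having a responsive viewport declaration in the page's <head>; the standard snippet looks like this:

<meta name="viewport" content="width=device-width, initial-scale=1">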

To assess your entire website, it’s better to switch to Google Search Console. There, under the Experience tab, you will find the Mobile Usability report for all of your pages:

 

Under the graph, there is a table with the most common issues affecting your mobile pages. You can investigate further by clicking on any of the issues.

Step 8. Check on your HTTPS content

Google started using HTTPS as a ranking signal in 2014. Since then, HTTPS migrations have become increasingly common. Today, according to Google Transparency Report, 95% of websites across Google use HTTPS.

If your site hasn't yet gone HTTPS, you may want to consider an HTTPS migration. If you do decide to go secure, feel free to use the framework from the case study of our own migration to HTTPS at link-assistant.com.

If your site is already using HTTPS (either partially or entirely), it is important to check on the common HTTPS issues as part of your SEO site audit. In particular, remember to check for:

Mixed content

Mixed content issues arise when an otherwise secure page loads some of its content (images, videos, scripts, CSS files) over a non-secure HTTP connection. This weakens the security of the page and might prevent browsers from loading the non-secure content or even the entire page. To check your site for mixed content issues, in your WebSite Auditor project go to Site Audit > Encoding and technical factors.

Redirect HTTP content on HTTPS pages to secure protocols
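In practice, mixed content is just a plain-HTTP resource referenced from an HTTPS page, and the fix is to reference the secure version instead (example.com is a placeholder here):

<!-- mixed content: the image is requested over plain HTTP on an HTTPS page -->
<img src="http://example.com/images/photo.jpg" alt="Photo">

<!-- fixed: the same resource requested over HTTPS -->
<img src="https://example.com/images/photo.jpg" alt="Photo">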

Canonicals and redirects

First, you’ve got to check for duplicate HTTP vs HTTPS versions of your pages, in the Site Audit > Redirects section. If the HTTP and HTTPS versions of your website are not set correctly, they both can get indexed by search engines simultaneously. This will cause duplicate content issues which may harm your website rankings. Ideally, all links on your HTTPS site, as well as redirects and canonicals, should point to HTTPS pages straight away.

Make sure to set up a proper redirect from HTTP to HTTPS, and watch out for issues with the www and non-www versions of your site.

Second, even if you have the HTTP to HTTPS redirects implemented properly on the entire site, you still don't want to take users through unnecessary redirects — this will make your site appear much slower than it is. Such redirects may be a problem for crawling, too, as you will waste a little of your crawl budget every time a search engine bot hits a redirect.
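If your site runs on Apache with mod_rewrite, a site-wide HTTP to HTTPS redirect in a single hop can look roughly like this (nginx and other servers use different syntax):

# .htaccess: send every HTTP request to the HTTPS version in one step
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]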

Step 9. Add structured data

Structured data is essentially HTML code used to tag specific elements on your page. For example, if your page is an apple pie recipe, you can tell Google which text lists the ingredients, which is the cooking time, the calorie count, and so forth. Google can then use the tags to create rich snippets for your pages in SERP:

 

You can check your structured data in Google Search Console under the Enhancements tab. Google will display the enhancements you’ve tried to implement on your website and tell you if you’ve succeeded:

If you haven’t implemented Schema yet, here is our SEO guide on structured data. If your website is using a CMS, structured data may be implemented by default or you could implement it by installing a plugin. If you have a custom-built website, you can use a helper tool to tag your pages.
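To stick with the apple pie example from above, here is a rough sketch of what Recipe markup could look like in JSON-LD (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Apple Pie",
  "recipeIngredient": ["6 apples", "1 pie crust", "3/4 cup sugar"],
  "cookTime": "PT1H",
  "nutrition": {
    "@type": "NutritionInformation",
    "calories": "320 calories"
  }
}
</script>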

Step 10. Review your sitemap

Your sitemap tells search engines about your site structure and lets them discover fresh content. If you don't have a sitemap, you should really, really go and create one right now. You can do it in WebSite Auditor by simply starting a project for your site, jumping to the Pages dashboard, and clicking the wrench icon to create a sitemap. 

As you check your sitemap, make sure it is:

Clean. Keep your sitemap free from errors, redirects, and URLs blocked from indexing; otherwise, you're at risk of search engines ignoring the sitemap like it isn't there.

Up-to-date. Make sure your sitemap is updated every time new content is added to your site (or removed from it) — this will help search engines discover new content fast.

Concise. Google won't crawl sitemaps over 50,000 URLs. Ideally, you should keep it much shorter than that to ensure that your most important pages are crawled more often: technical SEO experiments show that shorter sitemaps result in more effective crawls.

Registered in Search Console. Let Google know about your sitemap. You can either submit it manually to Google Search Console or specify its location anywhere in your robots.txt file in the following way:
Sitemap: http://yourdomain.com/sitemaplocation.xml
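For reference, a bare-bones sitemap file is just an XML list of URLs, roughly along these lines (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yourdomain.com/sample-page/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>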

Step 11. Ask Google to re-crawl your site

The above site audit will surely spotlight a few issues on your site that need fixing. Once you fix them, you can explicitly ask Google to re-crawl your pages to make sure the changes are taken into account asap.

You can send your updated URLs for a recrawl from Google Search Console with the help of the URL Inspection tool. Enter the URL of the page you want re-crawled and click Request indexing. You can also Test Live URL (previously known as the Fetch as Google feature) to see your page in its current form, and then request indexing.

The URL Inspection tool lets you expand the report for more details, test the live URL, and request indexing.

You’ve got to assess when you really need to force a recrawl: for example, after introducing serious changes to the site (you moved from HTTP to HTTPS, introduced structured data, or did essential content optimization), or when you want an urgent blog post to appear on Google quicker. There is believed to be a limit on the number of recrawl actions per month, so don’t abuse it. A recrawl can take anywhere from 10 minutes to several weeks. A better alternative for massive changes is to submit them via the sitemap. Google doesn’t guarantee to index all of your site’s pages, but if the site is fairly small, it most probably will.

There's a similar option in Bing Webmaster Tools (http://www.bing.com/toolbox/webmaster), too. Just locate the Configure My Site section in your dashboard and click on Submit URLs. Fill in the URL you need re-indexed, and Bing will typically crawl it within minutes. The tool allows webmasters to submit up to 10,000 URLs per day for most sites.

Step 12. Audit your site on a regular basis

Last, but not least: make regular site audits a priority in your SEO strategy. Whatever changes appear on the web, they might influence your rankings in the most unpredictable ways. That is why it is wise to set up SEO analyzer tools for regular technical checks, so that problems are detected and fixed as soon as they emerge. For example, you can automate site audits with the professional version of WebSite Auditor: add a task to get a monthly SEO audit report with the criteria you need to monitor, like traffic, backlinks, etc.

In WebSite Auditor, you can customize the site audit report to get the data you need to monitor regularly (indexing, broken links, on-page, etc.)

What are your thoughts on technical SEO? Which tactics have you found the most effective recently? Join the discussion and share your thoughts in the comments below.