
JavaScript & SEO: Everything You Need To Know About JavaScript Rendering

By: Bartosz Goralewicz
June 19th, 2018

With JavaScript's seemingly infinite possibilities, ordinary websites based only on HTML and CSS are becoming a thing of the past. But while JavaScript allows for a dynamic experience for users, it's also creating a minefield for developers. Because of this, JavaScript SEO has become something impossible to ignore, particularly when it comes to making sure websites are properly rendered by Googlebot.

How Does JS Work?

For those who are unfamiliar with JavaScript, here's a quick introduction.

Alongside HTML and CSS, JavaScript is one of the three core web development technologies. HTML is used to create static pages (meaning the code is displayed in the browser as is, and cannot change based on the user's actions), whereas JavaScript makes a web page dynamic. A programmer can use JavaScript to change the values and properties of an HTML tag as the user clicks a button or chooses a value from a drop-down box. JavaScript is placed on the page together with the HTML code and works in conjunction with it.
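As a minimal sketch of that idea (the element ids here are hypothetical, not from any real site), JavaScript can react to a click and rewrite part of the page that plain HTML alone could never change:

```javascript
// Pure logic, independent of the DOM and easy to test.
function greetingFor(name) {
  return `Hello, ${name}!`;
}

// Browser-only wiring: when the (hypothetical) button is clicked,
// the page text changes without reloading the page.
if (typeof document !== 'undefined') {
  document.getElementById('say-hi').addEventListener('click', () => {
    const name = document.getElementById('name-input').value;
    document.getElementById('output').textContent = greetingFor(name);
  });
}
```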

Client-side and server-side rendering

There are two concepts we need to cover when talking about JavaScript, and it is very important to understand the difference between them: server-side rendering and client-side rendering.

Traditionally, as is the case with static HTML pages, the code is rendered on the server (server-side rendering). When visiting a page, Googlebot gets the complete, "ready" content, so it doesn't need to do anything but download the CSS files; the browser simply displays the information.

JavaScript, on the other hand, is generally run on the client machine (client-side rendering). This means that Googlebot initially gets a page with little or no content, and JavaScript then builds the DOM (Document Object Model) that contains the rendered content. This happens every time a page is loaded.
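To make the difference concrete, here is a minimal sketch (the markup, ids, and data are made up for illustration): the server ships an almost empty page, and a script builds the visible content in the browser.

```javascript
// What the server sends for a client-side rendered page: an empty shell.
// Googlebot sees this until it successfully executes the JavaScript below.
const initialHtml = '<html><body><div id="app"></div></body></html>';

// Pure rendering logic: turns data into markup.
function renderApp(data) {
  return `<h1>${data.title}</h1><p>${data.body}</p>`;
}

// Browser-only step: the script fills the empty shell with real content.
if (typeof document !== 'undefined') {
  document.getElementById('app').innerHTML =
    renderApp({ title: 'Hello', body: 'Rendered on the client.' });
}
```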

Obviously, if Googlebot is unable to execute and render your JavaScript properly, it can't see the page and the content you wanted it to see. This is where most JavaScript SEO problems arise.

Now let's look at how you can avoid these problems.

How to Check if Your Website is Rendered Properly

Getting Googlebot to correctly render your site requires you to focus on properly rendering both the links and content on your website. If you don't render links, Googlebot will have difficulty finding its way around your site. And if you don't properly render the content on your website, Googlebot won't be able to see it.

Here are some options to help you out in this area.

1. The site: command

First of all, the site: command will show you how many pages of your website are currently indexed by Google. If many of them aren't in the index yet, it might be a sign of rendering problems with internal links.

Second, you might want to check if your JavaScript-loaded content itself is already indexed by Google.

To do that, you need to find a line of text that is not present in your initial HTML code and is only loaded after JavaScript is executed. Then search for this line of text in the Google index, in quotation marks, combined with the site: command.
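For example, assuming a hypothetical domain and a hypothetical JavaScript-loaded sentence, the query would look like this:

```
site:yourdomain.com "a sentence that only appears after JavaScript runs"
```

If the page shows up in the results, Google has indexed the JavaScript-loaded content; if it doesn't, the content may not be getting rendered.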

Note that this won't work with the cache: command, since the cached versions of your pages contain only the original, not yet rendered code.

2. Chrome 41

In August of 2017, Google updated their Search Guides and announced that they are using Chrome 41 for rendering. This was a game changer for SEO, because you could now check how Google actually renders a page instead of guessing and hoping for the best.

Now you can just download Chrome 41 and check how a website or a webpage is rendered and seen by Googlebot.

3. Chrome DevTools

Some parts of your JavaScript code might be programmed to execute upon a certain user action – a click, a scroll, etc. However, Googlebot is not a user. It's not going to click or scroll down, so it simply won't see the content you load upon these actions.

The quickest and easiest way to check if all the JavaScript-based elements are loaded without any user actions is by using Chrome DevTools:

  1. Open your website in Chrome
  2. Open the Elements tab in DevTools
  3. See how your page was rendered by looking through the actual page's DOM built by the browser – make sure all the crucial navigational and content elements are already present there.

I recommend checking it in Chrome 41. This will provide you with confidence that the element can technically be seen by Googlebot.

You can also, of course, check it in your current Chrome version and make some comparisons.
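To check step 3 without eyeballing the entire DOM, you can paste a snippet like this into the DevTools console (a sketch; the helper name is mine, and you would adjust it to whatever elements matter on your site):

```javascript
// Keeps only hrefs Googlebot can actually follow: drops empty values,
// bare '#' fragments, and javascript: pseudo-links.
function crawlableHrefs(hrefs) {
  return hrefs.filter(h =>
    h && !h.startsWith('#') && !h.startsWith('javascript:'));
}

// Browser-only part: list the links present in the rendered DOM
// without any clicks or scrolling having happened.
if (typeof document !== 'undefined') {
  const hrefs = Array.from(document.querySelectorAll('a[href]'))
    .map(a => a.getAttribute('href'));
  console.log(`${crawlableHrefs(hrefs).length} crawlable links in the rendered DOM`);
}
```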

4. Google Search Console's Fetch and Render

Another useful tool, which can provide us with an idea of how Google renders our website, is the Fetch and Render function in Google Search Console.

First, paste your URL into the tool. Then click "Fetch and render" and wait a while. This will show you whether Googlebot can render your page and see the related articles, copy, and links on it.

Here you can also use Google's Mobile-Friendly Test, which will show you JavaScript errors and the rendered page's code.

5. Server logs analysis

The last thing you can resort to in order to verify how Googlebot crawls your website is server log analysis. By taking a hard look at the server logs, you can check which specific URLs were visited by Googlebot and which sections were or were not crawled.

There are many elements you can analyze thanks to the server logs. For example, you can check if Googlebot visits your older articles — if not, there is probably a problem with the links, which could be caused by problems with JS.

You can also check if Googlebot sees every section of your website — if not, this could also be caused by rendering problems.

Server logs won't show you how Googlebot sees the pages. You can only check if the pages were visited and what response codes were sent. Additional analysis will be needed to determine whether or not any problems were caused by JavaScript.

Furthermore, by looking into the server logs, you can check if Googlebot requested your crucial JavaScript files, or completely ignored them.
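A quick way to pull Googlebot hits out of your logs is a small script like the sketch below (Node.js, assuming the common Apache/Nginx "combined" log format; the function name and sample line are mine):

```javascript
// Returns { url, status } for a log line whose user agent claims to be
// Googlebot, or null for everything else. Note: user agents can be spoofed,
// so for a rigorous audit you would also verify the IP via reverse DNS.
function parseGooglebotHit(line) {
  if (!/Googlebot/.test(line)) return null;
  const m = line.match(/"(?:GET|POST) ([^ ]+) HTTP[^"]*" (\d{3})/);
  return m ? { url: m[1], status: Number(m[2]) } : null;
}

const sample = '66.249.66.1 - - [19/Jun/2018:10:00:00 +0000] ' +
  '"GET /blog/old-article HTTP/1.1" 200 5120 "-" ' +
  '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"';

console.log(parseGooglebotHit(sample)); // { url: '/blog/old-article', status: 200 }
```

Run over a full log file, this lets you see which sections Googlebot reached, which response codes it got, and whether it ever requested your JavaScript files.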

Possible Problems in Rendering Your Website

Even if your page is rendered correctly in Fetch and Render in the Search Console, this does not mean you can sit back and relax. There are still other problems you need to keep an eye out for.

Let's start with one of the biggest problems you will have to deal with...

1. Timeouts

Although exact timeouts are not specified, it's said that Google won't wait for a script longer than 5 seconds. Our experiments at Elephate confirm this rule: if a script takes more than 5 seconds to load, Google generally will not index the content it generates.

Fetch and Render will show you if the page can be rendered by Google, but it will not include timeouts. It's important to remember that Fetch and Render is much more forgiving than a standard Googlebot, so you'll need to take the extra step to make sure that the scripts you are serving are able to load under 5 seconds.
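As a small sketch of the risk (the 5-second figure is the unofficial threshold discussed above, not a documented limit), content injected only after a long delay will likely fall outside the rendering window:

```javascript
// Unofficial rendering budget, based on the experiments cited above.
const RENDER_BUDGET_MS = 5000;

function likelySeenByGooglebot(injectionDelayMs) {
  return injectionDelayMs < RENDER_BUDGET_MS;
}

// A widget that waits 6 seconds before writing its content, e.g.
//   setTimeout(() => { container.innerHTML = html; }, 6000);
// would probably never be indexed:
console.log(likelySeenByGooglebot(6000)); // false
```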

2. Browser limitations

As I mentioned earlier, Google uses a somewhat outdated version of its browser to render websites: the three-year-old Chrome 41. And since JavaScript technology has evolved and is continuing to evolve very fast, some of its newest features that work in the latest versions of Chrome may not be supported in Chrome 41.

Therefore, the best solution is to download the Chrome 41 browser (the exact version that Google uses for web rendering) and familiarize yourself with it. Check the console log to see where any errors occur, and have your developers fix them.

3. Content requiring user interaction to load

I know I said this already, but it really needs repeating: Googlebot does not act like a user. Googlebot does not click on buttons, does not expand "read more", does not enter tabs, does not fill any forms... it only reads and follows.

This means that the entire content you want to be rendered should be loaded immediately to the DOM and not after the action has been performed.

This is particularly important with regard to "read more" content and menus, for example, in a hamburger menu.
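One common pattern that satisfies both users and Googlebot is to keep the full text in the DOM from the first render and use the click only to toggle visibility. A minimal sketch (element ids and class name are hypothetical):

```javascript
// A CSS class that hides content visually but keeps it in the DOM.
const HIDDEN_CLASS = 'is-collapsed';

// Pure helper: add or remove the hiding class from a list of class names.
function toggledClasses(classList) {
  return classList.includes(HIDDEN_CLASS)
    ? classList.filter(c => c !== HIDDEN_CLASS)
    : [...classList, HIDDEN_CLASS];
}

if (typeof document !== 'undefined') {
  // The "read more" text is already in the DOM, so Googlebot can read it;
  // the click merely reveals it to the user.
  const more = document.getElementById('read-more-text');
  document.getElementById('read-more-btn').addEventListener('click', () => {
    more.className = toggledClasses(more.className.split(' ')).join(' ');
  });
}
```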

What Can We Do to Help Googlebot Render Websites Better?

Googlebot's rendering of websites isn't a one-way street. There is plenty developers can do to facilitate the process, both to shine a spotlight on the things you want Googlebot to render and to give themselves a good night's sleep.

Avoid OnClick links

Search engines do not treat onclick="window.location=" as ordinary links, which means that in most cases they won't follow this type of navigation. And search engines most definitely won't treat them as internal link signals.

It is paramount that links are in the DOM before any click is made. You can check this by opening DevTools in Chrome 41 and verifying that the important links have already been loaded, without any extra user actions.
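To make the contrast concrete, here is the same navigation written two ways (the markup is a made-up example; only the second version gives search engines a real link to follow):

```javascript
// Googlebot won't follow the first; only a real <a href> counts as a link.
const badMarkup  = '<span onclick="window.location=\'/products\'">Products</span>';
const goodMarkup = '<a href="/products">Products</a>';

// A rough check usable on extracted markup: does it contain a real href?
function hasCrawlableLink(html) {
  return /<a\s[^>]*href=/.test(html);
}

console.log(hasCrawlableLink(badMarkup), hasCrawlableLink(goodMarkup)); // false true
```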

Avoid # in URLs

The # fragment identifier is not supported by Googlebot and is simply ignored. So instead of #-based URL structures, try to stick to clean URL formatting.
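If your app changes views on the client, the History API lets you keep clean URLs instead of fragments. A small sketch (the URL mapping below is purely illustrative):

```javascript
// Illustrative mapping from a '#'-based URL to a clean one:
// '/products#shoes' -> '/products/shoes'
function toCleanUrl(hashUrl) {
  return hashUrl.replace('#', '/');
}

if (typeof history !== 'undefined') {
  // Instead of: location.hash = 'shoes';
  history.pushState({}, '', toCleanUrl('/products#shoes'));
}
```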

Unique URLs for unique pieces of content

Each piece of your content must be located "somewhere" for a search engine to index it. This is why it's important to remember that if you dynamically change your content without changing a URL, you're preventing search engines from accessing it.

Avoid JS errors

HTML is very forgiving, but JavaScript is definitely not.

If your website contains errors in JS scripts, they will simply not be executed, which may lead to your website content not being displayed at all. One error in the script can cause a domino effect resulting in a lot of additional errors.

To check the code and keep your JavaScript free from errors, you can once again use the Chrome DevTools and review the Console tab to see what errors occurred and in which line of the JavaScript code.

Don't block JS in robots.txt

Blocking JS files is a very old practice, but it still happens quite often. It even sometimes occurs as a default in some CMSs. And while the goal is to optimize the crawl budget, blocking JS files (and CSS stylesheets) is considered a very bad practice. But don't take it from me, here's what Google has to say about the topic:

"We recommend making sure Googlebot can access any embedded resource that meaningfully contributes to your site's visible content or its layout..."

So don't do anything like this:
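As a hypothetical sketch of the anti-pattern, rules like these in robots.txt would keep Googlebot away from your scripts:

```
# robots.txt — an example of what NOT to do:
User-agent: *
Disallow: /js/
Disallow: /*.js$
```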


Use prerendering

When you discover that Google has a problem rendering your JavaScript website, you can use prerendering.

Prerendering is providing a ready-made HTML snapshot of your website. This means that Googlebot does not receive JavaScript, but pure HTML. At the same time, the user who visits the site gets the same JavaScript-rich version of your page.

The most popular solution is to use an external prerendering service that is compatible with all of the most important JS frameworks.

Using this solution is easy — you just need to add middleware or a snippet to your web server.
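The middleware idea can be sketched in a few lines (Node.js; the bot detection, the Express wiring, and `getSnapshotFor` are all hypothetical, shown only to illustrate the flow):

```javascript
// Rough user-agent check: crawlers get the HTML snapshot, users get the app.
// (Prerendering services typically do something more robust than this.)
function isRenderingBot(userAgent) {
  return /Googlebot|Bingbot|Slurp/i.test(userAgent || '');
}

// With Express, the middleware would look roughly like this:
// app.use((req, res, next) => {
//   if (isRenderingBot(req.headers['user-agent'])) {
//     res.send(getSnapshotFor(req.url)); // getSnapshotFor is hypothetical
//   } else {
//     next(); // serve the normal JavaScript-rich page
//   }
// });

console.log(isRenderingBot('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
```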


JavaScript SEO is the most dynamic topic in the SEO world and is definitely worth your attention, especially as it is developing so rapidly. The issues described in this article are only a small part of what you can investigate to make sure that Googlebot is correctly rendering your website. If you're interested in the bigger picture, check out Elephate's Ultimate Guide to JavaScript SEO.

By: Bartosz Goralewicz

Bartosz Goralewicz is the co-founder of Elephate, which was recently awarded "Best Small SEO Agency" at the 2018 European Search Awards. Elephate believes in paving new ways in the SEO industry through bold, precise SEO campaigns for clients, SEO experiments and case studies. Bartosz leads a highly specialized team of technical SEO experts who work on the deep technical optimization of large, international structures. Technical SEO is not only Bartosz's job, but one of his biggest passions, which is why he enjoys traveling around the world to share this enthusiasm with like-minded SEO folks.