
JavaScript web technologies are increasingly being used due to the level of functionality they offer, allowing brands to deliver next-gen browser experiences to their users. However, because they go beyond the traditional method of serving content that we’ve been used to for all these years, issues can arise in how search engines view and understand that content.

How do JavaScript websites differ?

The traditional method of serving content utilises a request-response format, whereby a series of HTTP requests is sent by the client (e.g. a browser or search engine) and responded to by the server. Each request usually operates independently of the others, which is evident when browsing a site: navigate to a new page and the full page refreshes, because the browser has sent a new request and the server has responded with the new page.

With JavaScript technology, once the initial request has been sent, further content is only served when required, and the rest of the page can stay in place as it was. This is achieved through AJAX (Asynchronous JavaScript and XML).
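
As an illustration, here is a minimal, hypothetical sketch of that pattern using the browser’s fetch API; the endpoint and element IDs are invented for the example:

    // Minimal sketch of AJAX-style loading: when the button is clicked,
    // only the new content is requested; the page never fully reloads.
    // The endpoint (/api/listings) and element IDs are hypothetical.
    document.querySelector('#load-more').addEventListener('click', async () => {
      const response = await fetch('/api/listings?page=2');
      const html = await response.text();
      // Append the new fragment; everything else on the page stays put.
      document.querySelector('#results').insertAdjacentHTML('beforeend', html);
    });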

(Diagram source: https://www.seoclarity.net/blog/angularjs-seo)

What problems can arise?

Google has been open about the fact that it cannot always render all JavaScript technologies correctly (or quickly). Because many of the key elements of a page can be served via JavaScript, any of the following could be missed entirely by Googlebot, impacting performance (a short sketch of how this can happen follows the list):

  • Title tags and meta descriptions
  • Main page content
  • Canonical tags
  • Meta robots tags
  • Links
  • Structured data
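
As a hypothetical sketch of how this happens, a single-page app might set several of these elements entirely in the browser; if Googlebot fails to execute the script, none of them exist in the HTML it evaluates:

    // Hypothetical single-page app setting key SEO elements client-side.
    // If this script never runs for Googlebot, the page has no title,
    // canonical tag or robots directive as far as Google is concerned.
    document.title = 'Red Trainers | Example Store';

    const canonical = document.createElement('link');
    canonical.rel = 'canonical';
    canonical.href = 'https://www.example.com/trainers/red';
    document.head.appendChild(canonical);

    const robots = document.createElement('meta');
    robots.name = 'robots';
    robots.content = 'index, follow';
    document.head.appendChild(robots);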

When content is being served via JavaScript, it’s important to be able to crawl and analyse a website correctly, as well as diagnose issues. Luckily, there are a number of free tools that can help you understand whether Google is rendering your content as expected.

Tools for diagnosing issues

Google Search Console

The URL Inspection tool in Google Search Console should always be your first go-to when assessing how Googlebot sees a page. It can be accessed via the search bar at the top of any property page.

Entering a URL and then clicking “VIEW CRAWLED PAGE” shows you the fully rendered HTML of the page, after all JavaScript has been executed. The code you will see here is different to what you see when viewing the raw source code in the browser.

If any links or other key page elements aren’t present within the rendered HTML in Search Console, then it is unlikely that Google is finding and utilising them.

Chrome DevTools

The downside of Google Search Console is that you will need to have access to the site’s property in order to analyse it, which isn’t always possible. That’s where something like Chrome DevTools can come in handy. Whereas the raw source code of a page only displays the HTML that was present upon initial request-response, DevTools will execute JavaScript and show the full rendered HTML that it generates.
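
If you want to extract the rendered HTML for comparison, one simple approach is the console’s built-in copy() utility (a DevTools Console feature, not available to page scripts):

    // Run in the DevTools console once the page has finished loading.
    // copy() is a DevTools Console utility that places the fully
    // rendered HTML on the clipboard, ready to diff against the raw
    // source (view-source: in the address bar).
    copy(document.documentElement.outerHTML);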

We can look at https://www.airbnb.co.uk/ as an example. According to the page source, it only contains 50 lines of code; however, one of these lines is an enormous piece of JavaScript which will eventually generate a large amount of crucial HTML.

Let’s see how Google is reading this content. The homepage contains links to four locations under a section called “Inspiration for your next trip”.

But when viewing the page source, these links don’t appear to be present.

However, once the JavaScript has been executed and the full rendered HTML is visible in DevTools, the section can be found.

So it’s highly likely that Google is finding and crawling these links, even though they are generated by JavaScript.
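
This kind of raw-source check can also be scripted. Below is a minimal Node.js sketch (Node 18+, which ships a global fetch; save it as check.mjs so top-level await works). Note that some sites serve different HTML depending on the user agent, so treat the result as indicative:

    // Fetch the raw HTML - what a crawler's first request returns,
    // before any JavaScript runs - and look for a rendered-only phrase.
    const url = 'https://www.airbnb.co.uk/';
    const phrase = 'Inspiration for your next trip';

    const response = await fetch(url);
    const rawHtml = await response.text();

    console.log(`Raw source: ${rawHtml.split('\n').length} lines`);
    console.log(
      rawHtml.includes(phrase)
        ? `"${phrase}" is present before rendering`
        : `"${phrase}" only appears once JavaScript has run`
    );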

Google Mobile-Friendly Test

This is another Google tool, but one that you can use on any webpage – even those that you don’t have access to via Search Console. It’s normally used to diagnose issues with mobile usability on a webpage, but handily it shows you the full rendered HTML along with a screenshot and any issues encountered when loading the page.

Entering any URL and clicking “VIEW TESTED PAGE” opens a pop-out on the right-hand side of the screen, where you can search through the rendered HTML for key elements, just as in Search Console.

We can see that in this instance, the Inspiration section has been detected as expected, and Google is able to crawl the links that sit within it.
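
At the time of writing, Google also exposes this test programmatically through the Search Console API’s URL Testing Tools endpoint, which is handy for checking pages in bulk. A rough sketch follows (run as an ES module like the sketch above; it requires a Google API key, and field names should be verified against Google’s current API reference):

    // Rough sketch of calling the Mobile-Friendly Test API.
    // YOUR_API_KEY is a placeholder; verify the endpoint and response
    // fields against Google's current Search Console API documentation.
    const endpoint =
      'https://searchconsole.googleapis.com/v1/urlTestingTools/' +
      'mobileFriendlyTest:run?key=YOUR_API_KEY';

    const response = await fetch(endpoint, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ url: 'https://www.airbnb.co.uk/' }),
    });

    const result = await response.json();
    console.log(result.mobileFriendliness); // e.g. "MOBILE_FRIENDLY"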

Methods for resolving

If any of the above checks reveal that Google is having difficulty executing JavaScript and discovering the content behind it, there are some recommended approaches to resolving this. The primary approach recommended by Google is “Dynamic Rendering”, which involves serving a fully rendered HTML page to Google whilst continuing to offer the same JavaScript experience to users (a simplified sketch of the approach follows the list of tools below).

Source: https://developers.google.com/search/docs/advanced/javascript/fix-search-javascript

There are a few leading dynamic renderers suggested by Google, such as:

  • prerender.io
  • Puppeteer
  • Rendertron
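
To make the idea concrete, here is a heavily simplified Express middleware sketch using Puppeteer: bot user agents receive server-rendered HTML, while everyone else gets the normal JavaScript app. A production setup would cache rendered pages and reuse a single browser instance (or use one of the services above) rather than launching a browser per request; the domain and bot list are illustrative:

    import express from 'express';
    import puppeteer from 'puppeteer';

    const app = express();
    const BOTS = /googlebot|bingbot|baiduspider/i; // simplified bot list

    app.use(async (req, res, next) => {
      // Normal users fall through to the standard JavaScript app.
      if (!BOTS.test(req.headers['user-agent'] || '')) return next();

      // Bots get the fully rendered HTML instead.
      const browser = await puppeteer.launch();
      try {
        const page = await browser.newPage();
        await page.goto(`https://www.example.com${req.originalUrl}`, {
          waitUntil: 'networkidle0', // let JS-driven requests settle
        });
        res.send(await page.content()); // serialised post-render HTML
      } finally {
        await browser.close();
      }
    });

    app.listen(3000);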

Once dynamic rendering is in place, use any of the previously mentioned tools to verify how the content is being crawled and rendered!

Summary

If you have important, indexable content generated by JavaScript, you can use one of a handful of free tools to see how it is being rendered by Googlebot. If key elements such as links, canonical/meta robots tags or page content are not being read, then a dynamic rendering service is one solution. This will enable you to serve a fully rendered HTML page to Google, whilst still allowing the same level of interactivity for users.


Find out how our SEO services can help improve your organic performance here.