JavaScript 101 For SEOs

September 10, 2019
The Botify Team

In recent years, SEOs have been hearing more and more about JavaScript and its impact on search performance, leading many to wonder whether they need to become JavaScript developers to do their jobs effectively.

Thankfully, that's not the case.

While SEOs may not need to learn how to code in JavaScript, there are benefits to understanding how it's used and when it could have an impact on search performance.

Why are so many SEOs talking about JavaScript these days?

Although JavaScript has been around for a while, it's begun to come up with increasing frequency in SEO circles - why is that?

JavaScript has become the popular choice for coding websites for a few reasons:

  • Product teams prefer JavaScript because it delivers a more interactive experience
  • Developers prefer JavaScript because it is much easier than plain HTML to build and maintain for large, complex websites
  • Infrastructure and finance teams prefer JavaScript because shifting rendering work onto users' browsers reduces server costs

So while some JavaScript can make an SEO's life more difficult, it's usually implemented with good intentions (e.g. increased interactivity, saved time, saved money).

Is all JavaScript bad for SEO?

Definitely not! JavaScript can be used in a number of different ways, so the impact it will have on search will differ depending on exactly how it's used.

Here are the three main types of JavaScript websites and whether they affect SEO (a short example follows the list):

  • Partial JavaScript (JS-generated elements don't matter for SEO): These sites use JavaScript to load elements that don't impact SEO. For sites like these, an HTML-only analysis should be all that's needed for checking the structure and content of the website. That said, it's still worth considering whether the JavaScript elements on these sites negatively impact Google's crawl.
  • Partial JavaScript (JS-generated elements do matter for SEO): These sites use JavaScript to load SEO-critical elements like links and content. It's a good idea to compare the HTML-only and JavaScript-rendered versions of each page and note the differences in things like linking and content quality. Like any site with JavaScript, it's also a good idea to analyze overall JavaScript performance.
  • Full JavaScript (entire site is generated with JavaScript): SEOs managing sites that are built completely in JavaScript should analyze the impact of JavaScript on site performance. For sites that use a pre-rendering solution, check the server-side versus the client-side rendering.
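To make the second case concrete, here's a minimal, hypothetical example (the /api/products endpoint is invented for illustration) of SEO-critical links that exist only after JavaScript runs. A crawler reading just the raw HTML would see an empty grid and no product links at all:

```javascript
// Raw HTML as served: <div id="product-grid"></div> -- no links yet.
// This script, running in the browser, is what creates them:
fetch('/api/products')
  .then((res) => res.json())
  .then((products) => {
    const grid = document.getElementById('product-grid');
    products.forEach((p) => {
      const link = document.createElement('a'); // SEO-critical <a> tag
      link.href = p.url;
      link.textContent = p.name;
      grid.appendChild(link);
    });
  });
```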

If JavaScript can be rendered by Google, why does it still cause issues?

In short, rendering delays.

Google indexes in two waves. The first wave happens right away because Google is only looking at the page's HTML. Google will then come back for a second wave of indexing where it renders your JavaScript, but this could be a few days or even a few weeks later.

In a recent Google webmaster hangout, Martin Splitt said that, while two-wave indexing is still a reality, crawling, rendering, and indexing will occur closer together in the future. He went on to explain that rendering JavaScript is cheaper than Google initially thought, and therefore rendering delays should play less of a role going forward.

For now though, Google still continues to index JavaScript websites in two phases, which means that they may initially miss any content or links that you're loading with JavaScript. If you have pages where the content is updated frequently, it means that by the time Google sees the content on your page, it may have changed already! This is an especially important consideration for publishers who are posting timely content such as breaking news, or e-commerce websites that have a constantly changing inventory.

It's also important to remember that, while Google holds the majority of search engine market share, other major search engines like Bing have a much harder time rendering JavaScript.

Client-side, server-side, pre-rendering, and dynamic rendering: a crash course

There are four main ways to render the content of a website: client-side, server-side, pre-rendering, and dynamic rendering.

Client-side rendering: the least SEO-friendly option

Client-side rendering is the least SEO-friendly of all the rendering options, but when done properly, it can have some advantages when it comes to user experience.

Client-side rendering means all the burden of rendering the content is on the "client" - that means you! Instead of the page being assembled at the server and then sent to your browser, the page is sent to your browser disassembled, leaving your browser to do all the work of loading the content. This means that your browser has to make multiple round trips to a server to grab and then load all of the content on the page.
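Here's a bare-bones sketch of what that looks like (a generic single-page app; the /app.js file and /api/page-data endpoint are invented for illustration):

```javascript
// The entire HTML the server sends, for every URL on the site:
//   <html><body><div id="root"></div><script src="/app.js"></script></body></html>

// app.js -- the browser must download and execute this, then make
// additional round trips, before any real content exists on the page.
async function renderApp() {
  const res = await fetch('/api/page-data'); // extra round trip to the server
  const data = await res.json();
  document.getElementById('root').innerHTML = `
    <h1>${data.title}</h1>
    <p>${data.body}</p>`;
}

renderApp();
```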

Client-side rendered content is subject to Google's "rendering budget," which means there will be a delay before Google accesses it, and it's nearly inaccessible to other search engines. Even so, many businesses (particularly their infrastructure and finance teams) prefer this option because it reduces the load on their own servers.

Server-side rendering: the SEO-preferred option

Most SEOs prefer server-side rendering because all the meaningful content gets rendered on the website's server, which means it's not subject to the two waves of indexing. Both users and bots will receive a fully-loaded page, without the need to request additional resources.

Server-side rendering takes the burden of rendering a page off of you (browser) and places it on the website's server.
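For illustration, here's a minimal server-side rendering sketch (it assumes a Node.js server using Express; getProduct is an invented data helper, not a real library call):

```javascript
const express = require('express');
const app = express();

app.get('/product/:id', async (req, res) => {
  // The server fetches the data and assembles the page itself...
  const product = await getProduct(req.params.id); // hypothetical data helper

  // ...so users and bots alike get complete HTML in a single response.
  res.send(`
    <html><body>
      <h1>${product.name}</h1>
      <a href="/category/${product.categorySlug}">See related products</a>
    </body></html>`);
});

app.listen(3000);
```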

Pre-rendering: a workaround option

Pre-rendering can be a viable option for some websites, but it's somewhat of a workaround. It works like this:

  • Website receives a request (or a specified period of time elapses)
  • A service renders the page
  • Only search engines receive the pre-rendered version of the page

Because pre-rendering is a solution for search engines, there's no real benefit for users. Third-party pre-rendering services like prerender.io can also be expensive, and they're prone to breaking on occasion.

Pre-rendering also may not be a great solution for websites with pages that change often. For example, we worked with an e-commerce business whose pre-rendering solution was causing product prices to get out of sync: Google was seeing one price while users were seeing another. Pre-rendering works best on sites that rarely change or are completely static.

Dynamic rendering: treating users and bots differently

Dynamic rendering is a method that uses a different rendering process depending on who's trying to access the page. Search engine crawlers receive fully server-rendered HTML, while human visitors receive the initial HTML and make all the additional requests on their end.

While this solution helps search engines see your fully-loaded page instantly, it still places the entire burden of loading the page on the user.
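In practice, dynamic rendering usually comes down to a user-agent check at the server. Here's a simplified sketch (the bot list is abbreviated, and renderWithHeadlessBrowser is a placeholder for whatever rendering service you'd actually use):

```javascript
const express = require('express');
const app = express();

const BOT_UA = /googlebot|bingbot|baiduspider|yandex/i;

app.get('*', async (req, res) => {
  if (BOT_UA.test(req.headers['user-agent'] || '')) {
    // Bots get HTML that was fully rendered on the server side
    const html = await renderWithHeadlessBrowser(req.url); // placeholder function
    res.send(html);
  } else {
    // Humans get the client-side app and do the rendering themselves
    res.sendFile('index.html', { root: 'public' });
  }
});

app.listen(3000);
```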

What does it mean to "render" a page?

If you've gotten to this point and are still unclear on what rendering means, that's understandable! Rendering is a somewhat foreign concept to non-developers, but easy enough to understand once you break it down.

Rendering is the process your phone, computer, tablet, or other device's browser has to go through in order to actually "build" a web page.

Most of the time, this requires your computer to go out and get hundreds of different resources in order to make the page work the way the site intended it to.

The "rendering" process can take a long time, depending on the size and quantity of those different resources your browser has to go and fetch.

This comes at a cost to you (battery, speed, data, etc.) as well as search engines.

How to crawl your JavaScript for SEO analysis

Remember, not all JavaScript websites are created equal. In a lot of cases, websites that use JavaScript have plenty of resources that are not necessarily used to render content or links.

When it comes to crawling your site for SEO analysis, there are certain resources you'd want to ignore, such as:

  • CSS - Stands for "cascading style sheets." CSS is used to style elements of your website, such as colors and fonts.
  • Tracking tags - These are used for tracking users and monetization purposes. Google Tag Manager and Google Ads conversion tracking tags are good examples of this.
  • JavaScript enhancements - This type of JavaScript is used to make things look nice and smooth as you navigate the site, but does not change the content or links.

When you're crawling a site for the purpose of SEO analysis, you'll primarily be interested in simulating a search engine's experience. That's why it's a good idea to configure your crawler so that it's looking at content (text) and internal links to other pages.
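What this configuration looks like depends entirely on your crawler. The sketch below is purely illustrative (the option names are invented, not Botify's actual settings); the point is the split between content-bearing scripts and everything else:

```javascript
// Hypothetical crawler configuration -- option names are illustrative only
const crawlConfig = {
  executeJavaScript: true,
  // Ignore resources that never produce content or links
  ignoredResources: [
    /\.css$/,                    // stylesheets: visual only
    /googletagmanager\.com/,     // tracking tags
    /google-analytics\.com/,
    /\/js\/animations\./,        // cosmetic JavaScript enhancements
  ],
  // Keep the scripts that actually inject content and <a> tags
  executedResources: [/\/js\/app\..*\.js$/],
};
```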

Configuring a JavaScript SEO crawl

At Botify, we often work with customers to decide which (if any) of the resources that load on their pages are needed for a proper SEO analysis. By eliminating unnecessary resources, we ensure we're only looking at the elements that matter for SEO, not wasting time crawling things that aren't important for your analysis.

For example, on one site, our JavaScript crawl configuration took the crawl from 354 resources down to just 13 that were critical for SEO evaluation!

How does a JavaScript crawl for SEO work?

Botify can crawl your full website, including JavaScript, with JavaScript Crawl!

What does JavaScript Crawl do?

  • Execute all JavaScript code (both internal and external)
  • Allow you to select which JavaScript resources you want to execute or ignore (as outlined above)
  • Enable caching of external resources
  • Comply with your website's robots.txt instructions (though these can be overridden with other instructions that you specify)
  • Check load times

Why is this functionality important? Because it allows you to identify, at scale, the most common SEO issues caused by JavaScript.

Because JavaScript Crawl can do all these things, Botify will:

  • Crawl links (<a> tags) that are inserted into your site using JavaScript
  • Find tags that have been inserted into your site using JavaScript
  • Show you what page content has been inserted using JavaScript
  • Reveal how JavaScript resources are impacting performance across key page segments
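If you want to spot-check a single URL the same way outside of a full crawl, one rough approach is to compare the raw HTML with the rendered DOM. Here's a sketch using the open-source puppeteer library (not Botify's implementation; it assumes Node 18+ for the built-in fetch):

```javascript
const puppeteer = require('puppeteer');

async function compareLinks(url) {
  // 1. Count <a> tags in the raw HTML, before any JavaScript runs
  const rawHtml = await (await fetch(url)).text();
  const rawCount = (rawHtml.match(/<a[\s>]/gi) || []).length;

  // 2. Count <a> tags after a headless browser has rendered the page
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const renderedCount = await page.$$eval('a', (links) => links.length);
  await browser.close();

  console.log(`${url}: ${rawCount} links in raw HTML, ${renderedCount} after rendering`);
}

compareLinks('https://example.com/');
```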

Can't get enough of JavaScript SEO? We don't blame you! Check out some of our other articles on the topic.

Want to learn more? Connect with our team for a Botify demo!