I've been working closely with my colleague, Robin Eisenberg, VP of Engineering at Botify, on how Botify can help SEOs navigate in the age of JavaScript. Robin recently presented to a packed room at brightonSEO on what these new complexities mean for SEOs and how to succeed in this new world.

Bottom line, and much to our chagrin, budgets exist. Most of us just don't have unlimited resources to spend on whatever we want. It's no different for search engines.

Search engines don't have all the resources and time in the world to crawl everything every day. To solve for this, search engines asked the question, "How can we get the best picture of the web in the cheapest and fastest way possible?" The answer became crawl budget, which Google's Gary Illyes defines as "the number of URLs Googlebot can and wants to crawl."

SEO professionals are pretty familiar with this concept by now. We know that we need to understand what search engines are and aren't seeing on our websites, and why.
Why do we need to care about budgets?
Budgets matter because they affect our bottom line. If search engines aren't finding content, the content won't be indexed, and content that the search engines don't index can't rank and earn traffic. When content isn't visited by searchers, they can't become customers.

You could also look at this from a search engine's point of view. If Google and others can't find and index the valuable content on the web, their search engine will be less relevant, and ultimately less used.

Although we're familiar with this concept already, the web today is more complex than ever before. Developers are increasingly building more interactivity and customization into websites, and they're using JavaScript to do it. This poses new challenges to SEOs because of how search engines handle JavaScript, meaning we need to take a second look at how we think about budgets.
The Rise of Render Budgets: How search engines handle JavaScript
In order for search engines to see content loaded by JavaScript in the browser, they have to render it, and because rendering pages at the scale of the web requires a lot of time and computational resources, search engine bots defer rendering JavaScript until they have the resources available. Google refers to this as a "second wave of indexing." At Botify, we are calling this Render Budget.

This means that some details about your page might be missed. You may have even experienced the negative ramifications of this reality for yourself.

If Google can render JavaScript and index that content, like it claims, then why does this happen? There are three main reasons.
1. The progression of JavaScript pages
Take a look at the image below. When it comes to an HTML page, Google is only dealing with the first and last states - the page is either loading or loaded. When it comes to a JavaScript page, on the other hand, loading is a multi-phase process. How does Google know at which point in this progression to index the content?
2. Relative rendering
This also means that the time it takes for the content to be "ready" is relative. Is some content enough? Is all the content needed? This can differ from page to page, so Googlebot doesn't always know when it should stop rendering and index the page.
3. JavaScript debugging
It used to be fairly easy for SEO professionals to debug on-page issues - just view the source and read the HTML! But now, we have JavaScript, which is much harder to read, understand, and debug.

The complexity of this new reality may leave you wanting to run for the hills, but as SEOs, we need to stick with it if we want to stay competitive in the future. After all, for every page of your website that is too hard to render, Google will render three of your competitor's pages.

But how can we do that?
Analyzing your JavaScript
You can master JavaScript SEO without becoming a JavaScript expert! It just requires a slight mentality shift.

Before developers were using JavaScript to construct websites, how search worked was fairly straightforward:
SEOs learned this well, and mastered the ability to optimize websites for crawling and indexing. Now that JavaScript has entered the scene, the search process looks a little different.

But just as SEOs mastered HTML SEO without becoming HTML experts, so too can they master JavaScript SEO without becoming JavaScript developers. You already know the principles. They've just changed slightly to account for JavaScript.
SEO metrics before and after JavaScript
Being effective in this new environment requires a departure from the way SEOs are used to thinking.

For example, we used to measure crawl rate on our websites by analyzing our log files and counting the number of search engine bot hits on our unique URLs. While this was valid in HTML-only days, in the JavaScript age there are now multiple requests per page. We need to count unique referrers.

We used to count the time it took for the server to send the page. This was a totally valid way of thinking when dealing with HTML-only pages that only had two states: loading or loaded. But now, in a JavaScript world, the browser has to do some work before the page is ready. We need to consider how much time it takes the crawler to fully render the page.

And last but not least, we used to measure search engine bot activity by adding server-side SEO tracking. This worked fine in HTML-only days when there was only a single request for each page. Today it's similar, except the magic happens in the browser instead of on the server. We need to add crawler-side JavaScript SEO tracking.
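To make the first of those shifts concrete, here's a rough sketch of counting crawl activity by unique referrer rather than by raw URL hit. The log fields and the bot check below are assumptions for illustration only; real access logs need their own parsing and proper bot verification.

```typescript
// A rough sketch, assuming each access-log entry has already been parsed into
// url / userAgent / referrer fields; real log formats will differ.
interface LogEntry {
  url: string;
  userAgent: string;
  referrer: string; // the page that triggered this request
}

function crawlStats(entries: LogEntry[]) {
  // Naive bot filter for the example; production log analysis should verify bots properly.
  const botEntries = entries.filter((e) => e.userAgent.includes("Googlebot"));

  // HTML-only thinking: count the unique URLs the bot requested.
  const uniqueUrlsRequested = new Set(botEntries.map((e) => e.url)).size;

  // JavaScript-age thinking: one rendered page triggers many resource requests,
  // so the number of unique referrers is closer to "pages being rendered".
  const uniqueReferrers = new Set(botEntries.map((e) => e.referrer)).size;

  return { uniqueUrlsRequested, uniqueReferrers };
}
```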
So what is "render budget"?
If crawl budget is the number of URLs a search engine bot can and wants to crawl, render budget is the number of pages a search engine bot is able to execute (render). Think about it this way: if crawl budget is the search engine opening the envelope, then render budget is the search engine reading the letter.

Like crawl budget, render budget acknowledges that search engines simply do not have time to render everything, every day. As a result, there are two important questions SEOs need to be able to answer:
- Is my important JavaScript content being seen by search engines?
- How much of my website is getting the Google "second pass"?
Is my important JavaScript content being seen by search engines?
SEOs should be vigilant about testing their URLs to see if search engines have indexed important content. One of the most basic ways to do this is to use search engines' "site:" advanced search operator. Just input site:example.com "(snippet of that page's important content)" into the search bar.

Does that content come up in the search engine's index? If it doesn't, it might be a good idea to dig deeper and further assess JavaScript rendering for that website.
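One way to dig deeper is to compare what the server sends (the raw HTML) with what a browser sees after JavaScript has run (the rendered DOM). Here's a minimal sketch using a headless browser; it assumes Node.js 18+ with the puppeteer package installed, and the URL is a placeholder:

```typescript
import puppeteer from "puppeteer";

// Compare the raw HTML response with the DOM after JavaScript has executed.
async function compareRawAndRendered(url: string): Promise<void> {
  // Raw HTML: what "view source" (and a non-rendering crawler) sees.
  const rawHtml = await (await fetch(url)).text();

  // Rendered DOM: what a browser (or a rendering crawler) sees after JavaScript runs.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: "networkidle0" });
  const renderedHtml = await page.content();
  await browser.close();

  console.log(`Raw HTML length:     ${rawHtml.length} characters`);
  console.log(`Rendered DOM length: ${renderedHtml.length} characters`);
  console.log(`Added by JavaScript: ~${renderedHtml.length - rawHtml.length} characters`);
}

compareRawAndRendered("https://example.com").catch(console.error);
```

If an important phrase shows up only in the rendered output, that content depends on JavaScript execution, and therefore on render budget.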
How much of my website is getting the Google "second pass"?
One way to find out how much of your content Google has rendered is to calculate your rendering ratio. This will reveal what percentage of your content Google is truly seeing.

First, find an expression that you know is in the HTML-only version of your pages.

Next, find an expression you know is in the rendered version of your pages. This text should not be present in the initial page response from the server, but rather only present after rendering.

Using the same site: advanced search operator, search for the HTML-only phrase. This will show you how many total pages Google has indexed. Next, search for the HTML-only + rendered phrase. This will show you how many pages Google has rendered and run through its second phase of indexing.

By dividing the number of rendered pages by the number of indexed pages, you'll arrive at your rendering ratio. In the example below, only about 80% of the pages are being rendered and going through the second phase of indexing.
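The arithmetic itself is simple. Here's a minimal sketch using hypothetical counts from the two site: searches described above:

```typescript
// Hypothetical counts from the two site: searches above.
const indexedPages = 1000; // site:example.com "HTML-only phrase"
const renderedPages = 800; // site:example.com "HTML-only phrase" "rendered-only phrase"

// Rendering ratio = rendered pages / indexed pages.
const renderingRatio = renderedPages / indexedPages;
console.log(`Rendering ratio: ${(renderingRatio * 100).toFixed(0)}%`); // Rendering ratio: 80%
```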
Three practical steps to embracing JavaScript
These complexities certainly make SEO harder, but not impossible.

As Google's John Mueller has said, JavaScript isn't going away. SEOs need to embrace it and learn to work with it rather than against it. So how can we do that?
Step 1: Work with your developers
Your development team is your best ally when it comes to grasping how your website behaves, so facilitate an open line of communication with them. Let them know about the SEO impact of their website changes, and be willing to listen to the issues they're trying to tackle as well.

Many engineers aren't programmed to think about SEO, so come alongside them and solve these issues together.
Step 2: Add new targets
It's important to set appropriate targets in this new landscape. SEOs need to start measuring their success by metrics such as first contentful paint, time to interactive, and the size and number of requests.

SEO has changed, so our measurements for success need to change with it.
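As one sketch of what tracking these targets can look like, here's a minimal in-browser snippet that logs first contentful paint and the number and size of requests using the standard Performance APIs. (Time to interactive isn't exposed directly by these APIs; lab tools like Lighthouse report it.)

```typescript
// Minimal in-browser sketch: report first contentful paint and request stats.
// Run this in the page itself (e.g. via your bundle or a tag manager).
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === "first-contentful-paint") {
      console.log(`First contentful paint: ${Math.round(entry.startTime)} ms`);
    }
  }
}).observe({ type: "paint", buffered: true });

// Number and total transferred size of requests made so far.
const resources = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
const totalBytes = resources.reduce((sum, r) => sum + (r.transferSize || 0), 0);
console.log(`Requests so far: ${resources.length}, transferred: ${Math.round(totalBytes / 1024)} KB`);
```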
Step 3: Optimize JavaScript at scale
The larger your website, the harder it gets to optimize for JavaScript unless you have a platform that helps you scale these activities. Botify was created precisely for this purpose.

With Botify's JavaScript-inclusive crawl, you can do things like:
- View JavaScript rendered text on your website
- Follow JavaScript rendered links on your website
- Follow JavaScript redirects
The ability to measure and monitor JavaScript content is critical in today's landscape because it could mean the difference between your content being found or ignored by search engines. When revenue is on the line, that's something you can't leave up to chance.

If you're interested in learning more about JavaScript SEO, there are plenty of resources out there worth exploring.
When it comes to improving crawl and render budgets on your own websites, we recommend tools such as:
- Web.dev: Review page performance and learn how to improve.
- Lighthouse: Run on any page to audit for performance, accessibility, and more.
- PageSpeed Insights: Get recommendations that make your pages faster.
And if you want to see for yourself how Botify can help you optimize for render budget in today's JavaScript world, request a demo!