When you think of page speed, what comes to mind?
For most people, page speed conjures up images of trying to access a page, only to be met with a frustratingly blank screen that's loading… loading… loading… for more seconds than you're willing to wait.
That's certainly a huge component of page speed, but did you know that speed impacts your bot visitors as well as your human visitors?
Keep reading to learn what page speed really is and how it impacts both humans and bots, or jump to a specific section.
- What is page speed?
- Why does page speed matter for SEO?
- How page speed affects users
- How page speed affects bots
- How to know if your pages load slowly
What is page speed?
Page speed is a metric that measures how fast a page loads. Multiple factors impact load time, such as server response time, large images, your content delivery network (CDN), and JavaScript, to name a few. User-side factors, like a poor internet connection or an underpowered device, can also play a role.
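Want to see those phases for yourself? Modern browsers expose them through the standard Navigation Timing API. Here's a minimal sketch (in TypeScript) you can adapt and paste into a browser console -- it assumes the page has already finished loading:

```ts
// Minimal sketch: break a page load into phases with the Navigation Timing API.
// Run after the page has fully loaded, or loadEventEnd will still be 0.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  // All values are milliseconds relative to the start of the navigation.
  console.log(`Time to first byte: ${nav.responseStart.toFixed(0)} ms`); // server response
  console.log(`DOM content loaded: ${nav.domContentLoadedEventEnd.toFixed(0)} ms`);
  console.log(`Full page load:     ${nav.loadEventEnd.toFixed(0)} ms`);
}
```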
Why does page speed matter for SEO?
Slow page speeds can have negative ramifications for users and for bots, and both can hurt your organic search rankings and traffic.
Let's dive into why that is.
How page speed affects users
Pages that load very slowly can cause users to get frustrated and leave your site, leading to higher bounce rates and lower conversions.
But what does that have to do with SEO?
According to Google's Martin Splitt, "You don't want to frustrate your users, and we as a search engine don't want to have users frustrated. So for us, it makes sense to consider fast websites as a little more helpful to users than very slow websites."
Because Google wants to provide a good experience to their users (searchers, AKA your potential website visitors), they consider speed a factor in their ranking algorithms.
💡 When did Google announce speed as a ranking factor?
- April 9, 2010 (Desktop) "Today we're including a new signal in our search ranking algorithms: site speed."
- January 17, 2018 (Mobile) "Today we're announcing that starting in July 2018, page speed will be a ranking factor for mobile searches."
How important is page speed for rankings?
When it comes to algorithms, not all signals are created equal. Page speed, for example, is less important than the relevance of a page's content.
Hearing again from Martin Splitt, "If you have bad content but you're the fastest website out there, then that won't help you."
In other words, what good is a fast page if it doesn't contain what the user is looking for?
Google's John Mueller puts it this way: "We try to differentiate between sites that are significantly slow and sites within a normal range. When we're looking at things that are really, really slow… that's where algorithms might take action as far as how they show it in the search results. If you're within the reasonable range… a couple of seconds or even half a minute, tweaking that isn't going to have a direct effect on your rankings."
Google also recently announced that they would be improving the way they factor page experience into rankings by creating a new signal that combines page experience factors like mobile-friendliness with Core Web Vitals metrics (Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift).
💡 Related Resource: Google's Page Experience Ranking Factor: How To Improve Web Vitals For Better SEO & UX
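If you're curious where your own pages stand on those metrics, browsers expose them through the standard PerformanceObserver API. Here's a minimal sketch -- Google's open-source web-vitals library wraps these same APIs with more robust edge-case handling, so treat this as the idea rather than a production implementation:

```ts
// Minimal sketch: observe two Core Web Vitals with the PerformanceObserver API.

// Largest Contentful Paint: the browser emits a new entry each time the
// largest element changes; the last entry before user input is the final LCP.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const latest = entries[entries.length - 1];
  console.log(`LCP candidate: ${latest.startTime.toFixed(0)} ms`);
}).observe({ type: "largest-contentful-paint", buffered: true });

// Cumulative Layout Shift: sum the scores of layout shifts that weren't
// triggered by user input. (Layout-shift entries aren't in TypeScript's
// built-in DOM types yet, hence the cast.)
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log(`CLS so far: ${cls.toFixed(3)}`);
}).observe({ type: "layout-shift", buffered: true });
```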
How page speed affects bots
Although we tend to focus on the impact slow page speeds have on real human visitors, and on how Google factors that into their ranking algorithms, there's another side to the page speed coin.
Page speed impacts search engine bots too.
Just like human visitors, bots make requests to view your pages. This process is known as crawling, and it's a necessary step if you want your content to get added to the search engine's index where it can be found and clicked on by searchers.
Learn more in "What Is The SEO Funnel?"
But search engines have limited time and resources. They can't crawl all the billions of pages on the web all the time. That's why they give each site a crawl budget, which is the amount of time they can and will spend on your site in a given session. (Important note: crawl budget typically isn't a problem for smaller sites.)
Google determines your crawl budget based on factors like crawl rate limit and crawl demand. Learn more about crawl budget.
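If you want a rough sense of how Googlebot is spending that budget on your site, your server access logs are a good place to start. Here's a minimal sketch (Node/TypeScript) that counts Googlebot requests per URL in a combined-format log. The log path is a placeholder, and keep in mind that user-agent strings can be spoofed, so verify hits via reverse DNS before drawing serious conclusions:

```ts
// Minimal sketch: count Googlebot requests per URL in an access log.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

async function countGooglebotHits(logPath: string): Promise<void> {
  const counts = new Map<string, number>(); // URL path -> request count
  const lines = createInterface({ input: createReadStream(logPath) });

  for await (const line of lines) {
    if (!line.includes("Googlebot")) continue; // crude user-agent match
    const match = line.match(/"(?:GET|HEAD) (\S+)/); // request line in combined format
    if (match) counts.set(match[1], (counts.get(match[1]) ?? 0) + 1);
  }

  // Print the 20 most-crawled URLs.
  const top = [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 20);
  for (const [path, hits] of top) console.log(`${hits}\t${path}`);
}

// Placeholder path -- point this at your own server's log.
countGooglebotHits("/var/log/nginx/access.log").catch(console.error);
```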
Because there's a limit on how much time Googlebot will spend on your site, slow page load times eat into your crawl budget: fewer pages get crawled per session, which means new pages may not be discovered and existing pages may not be recrawled often enough to keep up with actual page changes.
We've run tests on this before, and can definitely confirm that page load times impact how Google crawls your site. In the example below, you can see that page load time has a drastic impact on crawl ratio, particularly for sites with more than 10,000 pages.
Charts depict crawl ratio vs. load times in milliseconds. "Not crawled" pages are in red, while "crawled" pages are in blue. This data comes from an analysis of 413 million pages and 6 billion Googlebot requests. Read more: Do Slow Page Load Times Negatively Impact How Google Crawls Your Site?
💡 Page Speed & The Mobile-First Index: Based on our research, we've also seen a positive correlation between page speed and the mobile-first index (MFI). Google has been transitioning very slow sites to the MFI at a lower rate than faster sites. You can read more on our MFI research here.
One of the biggest culprits in this department is JavaScript, which can add seconds of load time to your pages and has only gotten more popular over the years.
According to Martin Splitt, "There's so many different factors [that cause pages to load slowly]. Sometimes it's your servers, or sometimes your servers respond really quickly but there's a ton of JavaScript that has to be processed first. JavaScript is a very expensive resource because it has to be fully downloaded, parsed, and then executed."
How does JavaScript affect bots?
Bots and users have different needs.
For example, JavaScript can add interactivity and functionality that makes for a great page experience. However, since search engine bots don't interact with pages, that JavaScript just adds unnecessary load time.
JavaScript can also add important tags to your site -- advertising iframes, analytics and metrics scripts, A/B testing scripts -- but these typically don't add content you'd need the search engine bot to see.
Removing those things might make pages faster for bots, but could compromise the human experience. So, how do we reconcile the two?
Dynamic rendering: a unique experience for both humans and bots
One Google-approved solution to this problem is dynamic rendering.
Dynamic rendering sends fully-rendered content to search engines while serving human visitors with normal, client-side rendered content. It's pre-rendering for search engine bots, and Google likes it because they get the same content that you're sending to human visitors -- just in a format that's easier and faster for them to view! In fact, when done well, Googlebot wouldn't even be able to detect that you're using dynamic rendering.
An example of how a page with JavaScript would look to a bot without prerendering vs. how the bot would see it if the page prerendered the JavaScript.
Bing is also a fan of dynamic rendering. In their recently updated webmaster guidelines, they say that dynamic rendering can help you avoid being negatively impacted by the search engine's limitations in processing JavaScript at scale.
💡 Read more: What Is Dynamic Rendering & How Does It Impact SEO?
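To make the concept concrete, here's a minimal sketch of dynamic rendering as Express middleware (TypeScript). The bot pattern and renderer endpoint are illustrative placeholders -- production setups (a self-hosted Rendertron, or a service like SpeedWorkers) are considerably more robust:

```ts
// Minimal sketch: serve prerendered HTML to bots, the normal client-side app to humans.
import express from "express";

const BOT_PATTERN = /googlebot|bingbot|yandexbot|duckduckbot|baiduspider/i;
const RENDERER = "https://render.example.com/render"; // hypothetical prerender service

const app = express();

app.use(async (req, res, next) => {
  if (!BOT_PATTERN.test(req.headers["user-agent"] ?? "")) return next();

  try {
    // Bot detected: fetch a fully rendered snapshot of the requested page.
    const target = `${req.protocol}://${req.get("host")}${req.originalUrl}`;
    const rendered = await fetch(`${RENDERER}/${encodeURIComponent(target)}`);
    res.status(rendered.status).type("html").send(await rendered.text());
  } catch {
    next(); // renderer unavailable: fall back to the client-side app
  }
});

// Human visitors (and fallbacks) get the normal client-side rendered app.
app.use(express.static("dist"));

app.listen(3000);
```

The key point is that the prerendered snapshot contains the same content human visitors ultimately see -- only the delivery format differs.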
But does it work for preventing speed-related crawl budget issues? Based on real data from our SpeedWorkers customers, yes!
By taking the burden of rendering JavaScript off the bot, our customers have been enjoying 10-30x improvements in speed. We've seen even bigger gains, too -- for example, pages that took 10+ seconds to fully render were served in 300-400ms after SpeedWorkers.
The end result? More unique pages crawled, pages refreshed in the index more often, new pages discovered sooner, and increases in organic traffic and revenue.
One example of the downstream benefits of dynamic rendering. More pages crawled resulted in more ranking keywords, more active URLs, and more overall clicks.
How to know if your pages load slowly
There are lots of tools on the market that can help you evaluate your pages' performance.
Tools like Lighthouse and PageSpeed Insights measure lab data (collected in a controlled test environment, rather than from actual users) as well as field data (gathered from real-world visits). Both types feed into search, but Google has clarified: "We're not using the Lighthouse score for ranking. We're bucketing sites into ones that are really problematic, ones that are OK, and fast ones. You can see that in the speed report as well in Google Search Console."
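If you'd rather pull those numbers programmatically, the PageSpeed Insights API (v5) returns both the lab and the field data for a URL. A minimal sketch (TypeScript; works without an API key for light use):

```ts
// Minimal sketch: fetch lab and field data from the PageSpeed Insights API v5.
async function checkPageSpeed(url: string): Promise<void> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(url)}&strategy=mobile`;

  const result = await (await fetch(endpoint)).json();

  // Lab data: the Lighthouse performance score (0 to 1).
  console.log("Lighthouse score:", result.lighthouseResult?.categories?.performance?.score);

  // Field data: real-user metrics from the Chrome UX Report, when available.
  const field = result.loadingExperience?.metrics;
  console.log("Field LCP (ms):", field?.LARGEST_CONTENTFUL_PAINT_MS?.percentile);
}

checkPageSpeed("https://www.example.com/").catch(console.error);
```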
Botify's performance reports can also show you load time distribution across your entire site.
Learn more about Botify's performance reports.
You can also use Google Analytics to see which devices your visitors are using and what load times look like on each. You may even want to identify the most common device among your visitors and regularly test your site on it.
💡 Related Resource: Find out how you stack up to new industry benchmarks for mobile page speed -- Think With Google
SEO is all about making both humans & bots happy
In the end, search engine optimization is all about providing the best experience for both your human visitors and search engine bots.
By being mindful of your page load times, and taking measures like dynamic rendering to make it faster for bots to view your pages, you'll be in good shape for SEO success.