Organic search is the foundation for driving website traffic. While it can be tempting to take it for granted, optimizing it can have a serious impact on your revenue and improve the performance of your other channels. Search engine optimization (SEO) can be thought of as a funnel, and it doesn't have to be complicated. Understanding it and focusing on the fundamentals can drive real business impact.
With the introduction of GenAI and Google's Search Generative Experience (SGE), the push for a cookieless world, and the rising cost of paid advertising, there is no better time to regain control of your SEO. This article will guide you through a few easy steps as you embark on your SEO journey.
When it comes to SEO, there are often two distinct camps:
- Content SEOs are focused on identifying new traffic opportunities and trends and developing the assets to capitalize on them. Their goal is to satisfy the consumer's search needs.
- Technical SEOs are focused on optimizing the technical health of a website, such as its crawlability and speed. Their goal is to make it easier for search engines to crawl a website and discover the content being produced.
The most creative and effective SEO teams can identify and act upon opportunities in both domains.
On the one hand, a healthy website can't serve its purpose if it does not have engaging content to crawl. On the other hand, if search engines aren't crawling your content, it will not be visible to searchers and won't garner any clicks, conversions, or revenue impact.
Yet, while technical SEO is just as important as content SEO, grasping its finer elements can be daunting. But it doesn't have to be! In this article, we'll discuss an easy way to pinpoint technical SEO issues and share five impactful techniques that marketers can use to improve their website's technical SEO health.
An Easy Way To Think About Technical SEO
As mentioned earlier, a website must be easy for search engines to crawl, and content must be well-targeted and engaging to drive serious traffic. This 5-step Botify SEO funnel will help SEO teams identify where to focus their efforts.
Step 1: Crawl
Before your content can rank, it must be found by search engine bots. Website content is discovered in two primary ways:
- Crawling the URLs listed in your XML sitemap(s).
- Crawling through links in your website's navigation and content.
Even if all your pages can be found through your XML sitemaps and/or internal linking, doing so for the entire web requires tremendous time and computing power that no search engine has. That's where the notion of crawl budget comes in.
According to Google, "The amount of time and resources that Google devotes to crawling a website is commonly called the website's crawl budget." While search engines would love to discover all of your great content, realistically, they need to be mindful of the time they spend on your website.
The longer it takes for search engines to discover your content, the more likely it is they won't reach some of your strategic content before giving up and moving on to another website, effectively leaving these pages out of the search index. This can mean lower traffic, conversions, and revenue. Optimizing your XML sitemaps, improving internal linking, minimizing unnecessary parameters, and improving page speed are four easy steps you can take today to maximize the number of pages search engines crawl.
Step 2: Render
If search engines have to render a significant amount of JavaScript to see important content and links on your website, this can have a few negative SEO impacts, including:
- Lower Rankings: If JavaScript is required to render important content, search engines may not evaluate your page for that content in a timely fashion, leading to lower rankings than you would otherwise achieve.
- Slow Page Discovery: If JavaScript is required to render internal links, search engines may not discover those pages quickly.
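For example, a link that exists in the initial HTML response is discoverable on a search engine's first pass, while a link injected by JavaScript only becomes visible after rendering. Below is a simplified, hypothetical illustration (the URLs are placeholders):

```html
<!-- Discoverable immediately: the link exists in the raw HTML response -->
<a href="/category/running-shoes">Running Shoes</a>

<!-- Only discoverable after rendering: the link is injected by JavaScript -->
<div id="related-links"></div>
<script>
  // A search engine has to execute this script before it can see the link.
  document.getElementById('related-links').innerHTML =
    '<a href="/category/trail-shoes">Trail Shoes</a>';
</script>
```

If important links or content are generated this way, one common mitigation is server-side rendering or pre-rendering so that they appear in the initial HTML.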
Step 3: Index
Up to this point, we've wanted to ensure maximum website crawlability. However, there are certain directives and signals that pages may have that will keep them from being included in Google's and Bing's index. These signals are typically in place to control for low-value and/or duplicate content, including:
- 3XX Redirects: If search engines discover a page that ends up redirecting to another page, they will typically only index the final destination of the redirect. Therefore, an easy first step to improving the indexation of your website is to clean up redirect chains.
- 4XX Error Pages: Search engines will typically not index a page that serves a 4XX error, as they contain no helpful content for users.
- "NoIndex" Pages: The "NoIndex" tag tells search engines not to include this page in their index.
- Canonicalized Pages: If there are multiple duplicate/highly similar versions of the same URL (very common on e-commerce websites), website owners can employ the canonical tag to tell search engines which version to include in their index.
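In practice, the last two signals are simple tags placed in a page's <head>. A minimal illustration (the URL is a placeholder):

```html
<!-- Tells search engines not to include this page in their index -->
<meta name="robots" content="noindex">

<!-- Tells search engines that another URL is the preferred (canonical) version -->
<link rel="canonical" href="https://www.example.com/category-page-1">
```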
Step 4: Rank
The first three steps of the SEO funnel fall largely under the domain of technical SEOs. Step 4 is where content SEO managers start to shine!
"Rank" is exactly what it sounds like. This is all about the indexable pages that are currently ranking and receiving impressions in the SERPs. If we see a concerning difference between pages being indexed and pages getting organic visits, this is where we can re-evaluate content for factors such as length, quality, freshness, demand, and keyword targeting.
Step 5: Convert
Once your content is ranking, it's important to understand how well it's doing for its desired outcome. Is it driving orders? New leads? More traffic or sign-ups? This is where we answer the question, "Is our content doing what we intended it to do?"
Tips to Improve Your Website's Technical SEO Health
If you've spent any time reading SEO blogs or talking with more technically focused SEOs, you know there are countless techniques, tactics, and tools available to address your website's technical SEO health. Today, we're going to focus on five approaches that, done at scale, can have a great impact on your website's overall crawlability.
- Optimize Your XML Sitemaps
As one of the main ways that search engines discover new content, your website's XML sitemap(s) needs to be as up-to-date as possible.
Your XML sitemap should adhere to the following guidelines:
- Pages to Include: Only indexable HTML pages that respond with a 200 (OK) status code. These should be the canonical versions of your content.
- Pages NOT to Include: Non-indexable pages. This includes pages that redirect to other pages (3XX pages), error pages (4XX pages), pages with a "NoIndex" tag, and pages that contain a canonical tag pointing to a different page.
Additionally, if your XML sitemap contains more than 50,000 pages, break it up into multiple XML sitemap files and link to them all in an XML Sitemap Index file.
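For reference, a sitemap index file is itself a small XML file that simply lists the individual sitemaps. A minimal sketch (the file names and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-1.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
</sitemapindex>
```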
Finally, ensure you are linking to your XML sitemap(s) in your robots.txt file - this will make it easier for search engines to find the sitemaps. Add a line like the following, typically at the end of the file (change the URL to fit your website):
Sitemap: https://www.example.com/sitemap-location.xml
If you're already a Botify user, consider speaking to your Customer Success Manager about Botify's automated XML sitemap creation tool.
- Improve Internal Linking
Building internal links to and from important pages is the other primary way search engines discover your content, as it helps improve crawlability and, ultimately, organic performance.
- Prioritize Strategic Pages: If you have strategic pages -- say, category or product pages -- that are not being crawled regularly, ensure they are being linked to or from other relevant, strategic pages. When creating these internal links, diversify the anchor text you use -- this will help increase the number of long-tail keywords these pages rank for.
- Avoid Orphan Pages: Orphan pages are pages that exist on your website but do not have any internal links pointing to them. Create a linking strategy to minimize orphan pages, especially if they are strategic (a rough way to spot them is sketched after this list).
- Fix Broken Links: At scale, internal links that lead to a non-existent or 4XX error page waste valuable crawl budget (and user patience!). Ensure these links point to correct, working URLs.
- Address Crawl Depth Issues: The further from the home page your strategic pages are, the more likely they are to have crawl issues. In many cases, these pages receive less organic traffic as well. These types of issues typically happen when there is an excess of low-value (often near-duplicate) pages for search engines to crawl. Crawl budget issues can be created by pages with a number of parameters, heavy use of filters and facets, long paginated link lists, and more.
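If you don't have a dedicated crawler handy, one rough way to surface potential orphan pages is to compare the URLs listed in your XML sitemap against the URLs you can reach by following internal links from the home page. The sketch below is a simplified, single-threaded illustration; the start URL, sitemap location, and page limit are placeholders, and a real audit should use a purpose-built crawler:

```python
# Rough orphan-page check: sitemap URLs that are never reached by
# following internal links from the home page. Illustrative only.
import xml.etree.ElementTree as ET
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

START_URL = "https://www.example.com/"               # placeholder
SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
MAX_PAGES = 500  # keep the sample crawl small


class LinkParser(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)


def sitemap_urls(url):
    """Return the set of <loc> URLs listed in a standard XML sitemap."""
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    root = ET.fromstring(requests.get(url, timeout=10).text)
    return {loc.text.strip() for loc in root.iter(f"{ns}loc")}


def reachable_urls(start):
    """Breadth-first crawl of same-host pages reachable via internal links."""
    seen, queue = set(), deque([start])
    host = urlparse(start).netloc
    while queue and len(seen) < MAX_PAGES:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        try:
            response = requests.get(page, timeout=10)
        except requests.RequestException:
            continue
        parser = LinkParser()
        parser.feed(response.text)
        for href in parser.links:
            absolute = urljoin(page, href).split("#")[0]
            if urlparse(absolute).netloc == host and absolute not in seen:
                queue.append(absolute)
    return seen


if __name__ == "__main__":
    orphan_candidates = sitemap_urls(SITEMAP_URL) - reachable_urls(START_URL)
    for url in sorted(orphan_candidates):
        print(url)
```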
If you are a Botify user, you can assess the full scope and impact of any internal linking issues by using the "Inlinks," "Search Engine," and "Distribution" modules.
As a Botify customer, you can quickly release internal linking optimizations (and more) -- without solely relying on your engineering team -- with our SEO optimization deployment solution.
- Clean Up Redirect Chains
Redirect chains -- when a URL redirects multiple times before reaching its final destination -- are commonplace on large websites that have been active for a while. It's not unheard of to see URLs that have to redirect 3 to 5 times on larger, legacy websites.
While it's perfectly fine to implement redirects, redirect chains like this can waste a search engine's crawl budget. Think about it: every time a search engine discovers a link with a redirect, it has to hit each page in the chain before getting to the final page it's supposed to crawl. Do this across hundreds of thousands of links, and you can start to waste a lot of valuable crawl budget.
When cleaning up your redirect files, look for any redirect with more than one "hop." Modify the redirect file so these URLs only redirect to the final destination URL.
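As an illustration, assuming an Apache-style redirect file (the paths are placeholders), collapsing a chain looks like this:

```apache
# Before: a crawler following /old-page has to make three hops
Redirect 301 /old-page /newer-page
Redirect 301 /newer-page /newest-page
Redirect 301 /newest-page /current-page

# After: every legacy URL points straight to the final destination
Redirect 301 /old-page /current-page
Redirect 301 /newer-page /current-page
Redirect 301 /newest-page /current-page
```

The same principle applies to whatever system manages your redirects, whether that's a web server configuration, a CDN rule set, or a CMS plugin.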
- Improve Page Speed
Improving page speed is one of those SEO issues that takes up a lot of mental real estate for both technical and content SEOs.
Technical SEOs are focused on load times that slow down how quickly search engines can crawl and render pages, while content SEOs are concerned that slow load times degrade the user experience. Both are valid concerns and should be addressed.
Here are a few actionable ways to positively impact your page speed:
- Slow Server Response Time: The longer it takes for a server to respond to a page request from a user or search engine, the longer the overall page load process will take. We recommend a server response time of less than 500 ms. Work with your infrastructure team to ensure your web server responds quickly.
- Large File Size and Poorly Written Code: HTML, CSS, and JavaScript files are often larger than necessary due to excessive white space, unused or erroneous code, or otherwise inefficiently written code. Minifying these files and removing unused code reduces the amount search engines and browsers need to download.
- Image Size: Images are typically a large portion of a page's size, taking longer for users and search engines to fully load the page. Optimize your images by compressing them, using web-friendly image formats, and/or serving images from a content delivery network (CDN).
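For instance, serving a compressed, web-friendly format with explicit dimensions, and lazy-loading images that sit below the fold, can meaningfully reduce what a browser has to download up front. A simplified illustration (the file names and CDN hostname are placeholders):

```html
<picture>
  <!-- Smaller, modern format served from a CDN when the browser supports it -->
  <source srcset="https://cdn.example.com/images/related-product.webp" type="image/webp">
  <!-- Compressed fallback with explicit dimensions to avoid layout shift;
       lazy loading is appropriate here because the image is below the fold -->
  <img src="https://cdn.example.com/images/related-product.jpg"
       alt="Related product" width="600" height="400" loading="lazy">
</picture>
```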
If you're a Botify client, the "Performance" and "EngagementAnalytics" modules provide you with in-depth information about how your pages load for both search engines and consumers. Otherwise, the "Page Experience" report in your Google Search Console account, the Google Lighthouse tool, and Chrome DevTools will also provide thorough page speed and rendering information.
- Minimize Unnecessary Parameters
URL parameters are often used for analytics tracking, internal search, and managing different facets of e-commerce category pages. They are useful for internal analytics and providing an easy-to-understand user experience. However, they can also create a large duplicate content problem for search engines.
For example, if there is a canonical URL and the same URL with a parameter tacked onto the end, search engines see that as two different URLs to crawl but as the same content. If we scale this out to using multiple parameters and hundreds of thousands of pages, we're creating a serious crawl budget issue. We're effectively asking bots to crawl multiple versions of the same page!
Canonical URL: https://www.example.com/category-page-1
Parameter URL: https://www.example.com/category-page-1?price_range=100-200
The first step is to ensure your website is not using unnecessary parameters. It is quite common to see URLs with parameters that no longer serve a purpose. Work with your engineering team to eliminate these to help search engines crawl more unique and strategic pages.
Additionally, ensure that all parameter URLs have a canonical tag that points to the correct/canonical version of the page. This helps search engines swiftly identify which version of your page should be indexed and consolidates link equity on that version.
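Using the category page example above, the parameter URL would carry a canonical tag pointing back to the clean version:

```html
<!-- Placed in the <head> of https://www.example.com/category-page-1?price_range=100-200 -->
<link rel="canonical" href="https://www.example.com/category-page-1">
```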
If parameter pages are taking up a large portion of your crawl budget, Botify can help. Contact us to discuss SpeedWorkers, our search engine bot experience manager.
Wrapping Up
When combined well, content SEO and technical SEO create a virtuous cycle, making it easy for search engines to discover all of your high-quality, engaging content.
While technical SEO can seem daunting, utilizing the Botify SEO funnel methodology can help you better understand where in the "Crawl - Render - Index - Rank - Convert" process your attention is most needed to ensure maximum content visibility.
- Ready to take it to the next level? Make sure you track your way to success with these 10 Key SEO KPIs to measure.