Do you know what your website analytics are telling you? I mean really telling you?

Website analytics are great for revealing things like pages with high bounce rates, top landing pages, and average session duration. However, they stop short of telling you why. Why is this the top organic landing page? Why isn't this page getting any traffic? Why are so many people leaving this page without navigating further into the site?

The truth is, without context, we don't have much more than conjecture.
Do better than your best guess
If we want to know what's really behind our engagement metrics, we need to do better than guesses, opinions, and anecdotal evidence. But how?

As SEOs, we know that a lot happens before searchers interact with our pages. Before a searcher can visit a page, they have to be able to find it. For them to be able to find it, Google has to have stored it in its index. Before Google can store something in its index, it has to crawl it (and render it, if it's loaded with JavaScript).

To understand what's happening with our engagement metrics, then, we have to look further back. We have to be able to look at our website analytics in conjunction with the metrics that explain the other steps in the process.

Pairing log files with your analytics data

Log files are a treasure trove of information for SEOs, showing exactly how Google is crawling the website. If a page isn't getting any visits from Google/Organic, it may stem from a deeper issue: maybe Google isn't even crawling the page. If Google hasn't crawled the page, it won't be indexed, and if it isn't indexed, it won't show up in search results when people are looking for it, which easily explains a lack of organic search visits.

Analyzing your log files in conjunction with your website analytics can reveal deeper issues that are cause for further exploration. For example, you may uncover crawl budget issues that are causing Google to ignore a good chunk of your important content, such as product pages.

Pairing crawl data with your analytics data

A web crawler can also collect a vast amount of metadata about each of your pages, such as word count, title tag, H1, NoIndex tags, HTTP status codes, and more. If your pages have poor engagement metrics, like a high bounce rate or low visits, the reason could be hiding in this data.

For example, by pairing data from a crawl of your website with data from your website analytics, you could find that pages with low session duration all have low word counts, or that pages with high bounce rates don't have the primary keyword in the H1 tag.

Pairing keyword data with your analytics data

Do you know which real searcher queries your website is getting impressions for? How about the average position you're ranking in, or your visibility on mobile vs. desktop? The position your page occupies on the SERP and the relevance of your page to the searcher's query can have a huge impact on analytics data like traffic and bounce rate.

Some of this will be obvious, like finding that the pages with the lowest organic search traffic occupy lower positions in the SERPs. However, some of the findings may be surprising. For example, you might find that some of your pages are getting low organic traffic despite top SERP positions. This is likely where you'd want to layer on even more data, such as title tag information from your site crawl.

Stitching together log files, crawl data, keywords, and analytics

The more context you add, the more correlations you find. That's why we recommend pairing all this data together so you can see the entirety of the search process, from crawling to conversions, giving you a much clearer picture of what's actually going on with your website.

This is where it gets really interesting.

For example, stitching all this information together could allow you to see that a good bounce rate correlates with high SERP positions, that high SERP positions correlate with shallow page depth, and that shallow page depth correlates with high crawl frequency. When it comes to organic search performance, everything is linked.
Layering your data together allows you to see those links.
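To make that concrete, here's a minimal sketch of how you might stitch the four data sources together yourself, assuming you've exported each one to CSV. It isn't how Botify does it under the hood, and every file and column name below is a placeholder for whatever your log processor, crawler, Search Console export, and analytics tool actually produce.

```python
import pandas as pd

# Hypothetical exports -- swap in your own files and column names
logs = pd.read_csv("log_file_summary.csv")        # url, googlebot_hits_30d
crawl = pd.read_csv("site_crawl.csv")             # url, word_count, title, h1, status_code, page_depth
keywords = pd.read_csv("search_console.csv")      # url, impressions, avg_position
analytics = pd.read_csv("analytics_export.csv")   # url, organic_visits, bounce_rate, avg_session_duration

# Outer-join everything on URL so pages missing from any one source still show up
df = (
    analytics
    .merge(crawl, on="url", how="outer")
    .merge(keywords, on="url", how="outer")
    .merge(logs, on="url", how="outer")
)

# Pages with zero organic visits: are they even being crawled?
never_visited = df[df["organic_visits"].fillna(0) == 0]
print(never_visited[["url", "googlebot_hits_30d", "status_code", "page_depth"]].head())

# Quick look at how the metrics move together (bounce rate vs. position vs. depth vs. crawl frequency)
print(df[["bounce_rate", "avg_position", "page_depth", "googlebot_hits_30d"]].corr())
```

Even a rough correlation matrix like this can point you toward which relationships deserve a closer look.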
Don't just rely on your SEO assumptions
It's not uncommon for an SEO specialist to recommend website actions because "it's SEO best practice." While Google has given us many recommendations to follow, and countless industry studies have produced interesting correlations, it's important to remember that in the world of SEO, there really aren't universal best practices.

Outside of the basics, like making sure you don't have a NoIndex tag on a page you want people to find in search results, you have to look at your own website data in order to make informed decisions about which actions are going to make the biggest impact. When we only follow conventional SEO wisdom, we may end up spending our time and our coworkers' time on changes that don't actually move the needle.

For example, conventional SEO wisdom tells us that duplicate content is bad. If we never looked any further, we might spend days, weeks, or even months trying to eradicate all the duplicate content on our website. By stitching together data from your own website, however, you might find that your inactive URLs (pages that haven't received an organic visit in the past 30 days) have no correlation with duplicate content (see the sketch at the end of this post for what that kind of check could look like).

As tempting as it is, if we're going to make the biggest impact possible, we have to replace "How are other people doing it?" with "What works for my website?"

Manually stitching together all your data is possible, but it's a lot of work. If you'd like to see how Botify automates this so you can find correlations and SEO wins more quickly, book a demo with us! We'd love to show you around the platform.
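And here's the duplicate-content check mentioned above, again as a rough sketch with hypothetical file and column names; the duplicate flag is assumed to come from whatever duplicate-content detection your crawler offers.

```python
import pandas as pd

crawl = pd.read_csv("site_crawl.csv")            # url, is_duplicate (hypothetical crawler flag)
analytics = pd.read_csv("analytics_export.csv")  # url, organic_visits (last 30 days)

df = crawl.merge(analytics, on="url", how="left")

# "Inactive" = no organic visit in the last 30 days
df["inactive"] = df["organic_visits"].fillna(0) == 0

# Compare duplicate-content rates between active and inactive pages.
# Similar rates in both groups would suggest duplication isn't what's
# holding the inactive pages back.
print(df.groupby("inactive")["is_duplicate"].mean())
```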