Why Your Filtered Google Search Console Data Doesn’t Add Up

Google Search Console is a fantastic tool that surfaces first-party data Google Analytics doesn’t provide. While you should use Analytics and Search Console in tandem to make informed, data-driven decisions about how you optimize your website’s content, you also need to pay close attention to how Search Console produces its data outputs.
You may notice that when you start playing around with filters in the Performance Report, your data changes and doesn’t always make sense. By looking at the different ways to filter in Search Console, examining the data discrepancies you can encounter, and summarizing how filtering Google Search Console data affects the outputs, you won’t have to second-guess the decisions you make with your data.
Google Search Console provides unique data points at the page level that you can use to gauge the performance of an individual page, a group of pages, or an entire property. When you go to the Performance Report within Search Console, the metrics that can be plotted on the graph are total clicks, total impressions, average CTR, and average position.
Simply click the data point you want to see graphically, and it will show up on the graph.
Adding a dimension to the data points you wish to filter can make your data look much different. Underneath your graph, you can see the following dimensions listed: queries, pages, countries, devices, search appearance, and dates.
Lastly, you can analyze data by how people are searching on the web. These filters include the search type: web, image, video, and news results.
After looking at all of the different ways to filter, you might be thinking that you have almost too much data to parse through. And once you start combining filters, the data can get messy quickly. As you begin to analyze filtered data, you’ll notice things don’t always add up.
The main thing to keep in mind with Search Console is that Google is all about getting as much accurate data to you as quickly as possible. To get the information quickly, Google needs to take a few shortcuts. 
There are two main reasons for this data discrepancy. The first is that Google at times provides incomplete data, which manifests itself in Search Console’s data limits, reporting lag, and anonymized queries. The second is how Search Console aggregates its data, which can result in mismatched totals, figures that don’t add up, and double-counted data after filters are applied.
Google Search Console has too much data to go through, and it simply can’t provide everything all at once. This incomplete data shows itself in a few ways:
The most glaring issue that marketers run into when exporting query data is that Google Search Console only provides the first 1,000 data points. There are instances where you will have more data than Search Console allows access to, which prevents you from analyzing the information to its fullest. If you connect to the Search Console API, you’ll be able to export more, but otherwise, you have to stick with the first 1,000 rows of data.
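If you need more than the interface’s 1,000-row export, the Search Console API will page through the full report. Here is a minimal sketch using the google-api-python-client library with a service account; the credentials file, property URL, and date range are placeholders you would replace with your own.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials and property URL -- swap in your own.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2024-01-01",
            "endDate": "2024-01-31",
            "dimensions": ["query"],
            "rowLimit": 25000,      # the API maximum per request
            "startRow": start_row,  # paginate past the 1,000-row export cap
        },
    ).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 25000:  # last page reached
        break
    start_row += 25000

print(f"Exported {len(rows)} query rows")
```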
Also, if you look at the queries report, you’ll notice that there are specific long-tail keywords that do not show up for certain pages, even if you search that query and your page shows up in the SERPs. Google reports that 15% of the nearly 8.5 billion searches on its site every day are unique, so it simply doesn’t have the bandwidth to properly report on a query that may only be searched once or twice over time before leading to a click.
There is usually a two-day lag before data appears in Google Search Console. For monthly reporting, you typically need to wait until the 3rd of the month for a full month-over-month keyword analysis, and the lag makes real-time performance analysis impossible.
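If you automate reporting, it helps to build that lag into your date math so you never compare a complete day against a partial one. A small sketch, assuming a two-day lag (the exact delay can vary):

```python
from datetime import date, timedelta

# Assume roughly a two-day processing lag before Search Console data is complete.
LAG_DAYS = 2
latest_complete_day = date.today() - timedelta(days=LAG_DAYS)
print(f"Treat query data as reliable only up to {latest_complete_day:%Y-%m-%d}")
```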
More incomplete data comes in the form of anonymized queries. There are some searches that, for privacy reasons, get left out of Search Console reports to protect the searcher. Anytime a query filter is applied, the anonymized queries will fall out of the filtered report, even though the data will still be included in the total data set.
The second discrepancy comes in the form of data aggregation. Data aggregation, or how the data is summarized, happens in two different ways in Google Search Console. Data can either be aggregated at the property (or domain) level or at the page level.
When you initially have a data set without any filters applied, the data by default is aggregated by property. Since Search Console prioritizes the speed at which it can provide you data, it gives you what it considers to be the most important data set. Because of this, it sometimes underreports click and impression data.
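The difference between the two aggregation modes is easiest to see with a single search that returns two of your URLs: counted by property it is one impression, counted by page it is two. A toy illustration with hypothetical URLs:

```python
from urllib.parse import urlparse

# One search results page happens to show two URLs from the same site.
serp_results = ["https://www.example.com/guide", "https://www.example.com/blog/tips"]

# Aggregated by property: the site is counted once for this search.
impressions_by_property = len({urlparse(url).netloc for url in serp_results})

# Aggregated by page: each URL is counted separately.
impressions_by_page = len(set(serp_results))

print(impressions_by_property, impressions_by_page)  # 1 2
```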
In order to filter the full data set quickly and efficiently, Google uses Bloom filters as it toggles between property- and page-level data. Bloom filters are probabilistic data structures that encode a data set so membership can be checked very quickly, trading a small amount of accuracy for speed. This process can result in some discrepancies (both increases and decreases) and can sometimes leave you with more questions than answers.
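For context, a Bloom filter is a probabilistic set: adding an item flips a few hashed bit positions, and a lookup checks those same positions, so it can answer “probably in the set” (with occasional false positives) but never misses an item that was actually added. Below is a simplified, generic sketch of the idea, not Google’s implementation:

```python
import hashlib

class BloomFilter:
    """A tiny illustrative Bloom filter: fast membership tests with false positives."""

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive several bit positions from salted hashes of the item.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # True can be a false positive; False is always correct.
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("blue running shoes")
print(bf.might_contain("blue running shoes"))  # True
print(bf.might_contain("red hiking boots"))    # almost certainly False
```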
These swings are common because of how Bloom filters work. When the filtering changes the aggregation from property to page, the total can go up because Search Console is pulling in more data points and trying to provide more accurate information. Ultimately, if you’re looking to analyze page-level data, this is more helpful, though it can be confusing at first, particularly when you see an increase after you thought you were removing data points. Conversely, you can see your results decrease when you unfilter, going from a single page back to the main data set (from page aggregation to property aggregation).
When you apply filters to include and exclude the same data point (either pages or queries), you may notice that sometimes the data doesn’t add up. For example, you could filter a query out of a page’s data to get a figure for how many clicks there are without that query. If you then add that query back into the full set and filter out every other query, adding those two numbers together should match the total clicks on that page. However, this doesn’t always happen. This is mostly attributable to anonymized queries as well as the limit on how many rows of data Google can provide.
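One way to sanity-check this is to compare the page’s unfiltered total against the sum of the two complementary query filters; whatever is left over is roughly the anonymized and truncated portion. The numbers below are made up purely for illustration:

```python
# Hypothetical click counts pulled from three views of the same page's report.
total_clicks_for_page = 1240  # no query filter applied
clicks_without_query  = 1050  # "Queries not containing" your target query
clicks_for_query_only = 150   # "Queries containing" your target query

unaccounted = total_clicks_for_page - (clicks_without_query + clicks_for_query_only)
print(f"Clicks not explained by either filter: {unaccounted}")  # 40
```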
With all of these quirks in how Search Console provides data, you might think it’s not a reliable tool to use when making decisions on how to optimize pages. However, it’s important to realize that no matter which tool you’re using, there are always factors in how the data is gathered that affect what your data looks like.
Despite some of the reporting headaches, it’s still worth using Search Console data to make decisions. Because the data comes directly from Google itself, it’s much more trustworthy than third-party ranking tools. If you’re going to make data-driven decisions about Google, it’s vital to get the information from the source.
Search Console also gives you a much more thorough picture of why people are arriving at your pages. GA4 can show you organic sessions by landing page, but being able to dissect which keywords drive that traffic via Search Console is a huge plus. If your content is not performing well for certain keywords, Search Console can identify those to help guide you as you optimize your pages. If a page does perform well for its targeted keyword, you can look for content gaps and other ways to improve the page even further.
While the data may not always make sense and may not always be complete, Google Search Console is an essential tool to use to make informed decisions in organic search. By understanding how Search Console provides data, you can couple your insights with other data sources in order to get the full picture of how to best optimize your content.