Four Reasons Why You See Discrepancies Between Your Google Analytics Data & Programmatic Reporting

Ethic Advertising Agency’s Demand Side Platform (DSP) of choice for display, video pre-roll, and OTT/CTV advertising is Simpli.fi. Its targeting, reporting, and capabilities are ideal for hyper-targeted programmatic advertising strategies. Although we utilize various DSPs within our portfolio of services, we couldn’t ask for a better relationship, simply because Simpli.fi’s commitment to transparency, value, and quality aligns with our own values. We’ve worked closely with their team to get to the bottom of a very complex question:

 

Why Do There Appear to Be Discrepancies Between Google Analytics Data and Simpli.fi (or Any DSP) Data?

As advertising gurus, we have an obligation to push the boundaries and dive into some hard questions. One of the hardest questions we’ve faced from clients over the past few years is: Why do there appear to be discrepancies between Google Analytics data and Simpli.fi data? As we dug into the possible reasons behind this concern, we found that Google Analytics (GA) really does show reporting discrepancies when compared to Simpli.fi data (and to all programmatic digital advertising reporting, for that matter). So we scoured articles and videos, consulted with experts, and ran internal experiments on our own advertising campaigns to try to find a formulaic approach for cross-referencing these two sets of data.

Let us tell you upfront that we failed. There is no simple, secret mathematical calculation that makes the Google Analytics and programmatic advertising data match. Even though we couldn’t compress the answer into an equation like we wanted to, we did learn a lot about how to read the data, picked up some simple data-cleaning tricks, and came to understand why this issue happens.

Note: This article talks specifically about the Simpli.fi DSP, but a lot of this information is applicable to other DSPs, too. 

 

Why the Data Does Not Match:

  1. Geographic & Attribution Inaccuracies: Google Analytics is a free tool. Originally launched in 2005, Google Analytics is a great resource, but it absolutely has flaws, and Google is the first to admit it (there are some great resource links at the bottom of this article about this topic and other relevant subjects). What we consider to be the main flaw, in the context of this article, is that Google Analytics counts every user who reaches the website. Most would say, “Isn’t that a good thing?” In our opinion: no. It means Google Analytics is also counting non-human users (e.g. bots and test clicks from ad campaigns, to name two). Non-human users affect every Google Analytics data point and can skew the information.

 

In addition, Google Analytics often doesn’t actually know where users are coming from, either geographically or by channel; but because it counts every “user,” its solution is to make an educated guess about which location and channel to assign the user to. This means that users’ geographic location and channel of entry are often misattributed. For example, if GA knows that website traffic is coming from the United States but is unsure where in the U.S., it will often give Coffeyville, KS the geographic credit for the user, simply because Coffeyville sits at the center of the country. Other geographic inaccuracies also occur that cause users’ location data to be incorrect in Google Analytics. Here is a great, short Forbes article on the subject.

 

Users can also be placed in an inaccurate attribution channel. For example, a user of unknown attribution origin may get credited to the programmatic advertising campaign when they really came through another channel.

 

Privacy regulations also make it very difficult to capture all data accurately within Google Analytics. For instance, Safari and some apps make it extremely difficult to track activity due to their privacy-focused setups. Despite these privacy limitations, GA still records all on-site activity and then makes a best guess about where that traffic came from if it does not know. This leads to a lot of guessing and false Google Analytics data from advertising campaigns, considering that Safari is the #1 web browser on Apple products and apps can easily account for more than 50% of all display advertising traffic.

 

Simpli.fi is on the other end of the spectrum when it comes to data reporting: it excludes all data attributed to test clicks, fraud prevention, or missing information, and aims to report only on what it believes to be true human interactions. Clients are not charged for non-human impressions. In short, Google Analytics is not picky about the data it reports on, while Simpli.fi is much more conservative and will only report on data it can stand behind.

 

  2. Test Clicks: Simpli.fi removes test clicks from its data reporting; Google Analytics does not. Simpli.fi and its publishers implement test clicks to ensure the quality and functionality of the campaigns on their platform. Test clicks are performed throughout the campaign’s life for quality assurance.

Test clicks alone cause several discrepancies between the two sources of reporting, the obvious one being inflated user and new-user numbers reported in Google Analytics. In our most recent deep-dive test (as of April 2021), we found that 24.97% of the GA users attributed to Simpli.fi campaigns were actually test clicks, inflating the number of users reported in Google Analytics. And because test clicks register 0 seconds spent on a webpage, they also skew Google Analytics’ bounce rate and average session duration, making both look worse than they really are.

Let’s run the math on an average landing page session. Say we have 75 real users on a webpage, averaging 45 seconds per session. Now say Google Analytics reports 25 more users on that page (totaling 100 reported users in Google Analytics), but these 25 additional users are actually test clicks from programmatic ads that averaged 0 seconds on the page. Google Analytics will tell you that you had 100 users (which is technically false) who spent an average of 33.75 seconds on the webpage (also false).
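If it helps to see that calculation written out, here is a minimal sketch of the same hypothetical scenario (all numbers are the example figures above, not real campaign data):

```python
# How zero-second test clicks drag down the reported average session duration.
real_users = 75          # genuine visitors
real_avg_seconds = 45    # true average session duration for those visitors
test_clicks = 25         # zero-second test/preview clicks that GA counts as users

reported_users = real_users + test_clicks
reported_avg_seconds = (real_users * real_avg_seconds + test_clicks * 0) / reported_users

print(reported_users)        # 100 "users" reported by GA
print(reported_avg_seconds)  # 33.75 seconds, down from the true 45
```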

Additionally, user traffic from the ad preview screen within a campaign will also be counted in Google Analytics, but not in Simpli.fi reporting. This happens when a human ad ops professional (e.g. an Ethic Advertising Agency team member) tests the ad from the DSP’s dashboard during campaign setup and optimization. These clicks tend to be very quick, which also hurts the bounce rate and average-time-on-site data in Google Analytics. They also tend to be filed under “Referral” traffic in Google Analytics with a source like eastads.simpli.fi, westads.simpli.fi, or centralads.simpli.fi…which leads us to our first Google Analytics data scrubbing tip.

Data Scrubbing Tip #1: If you are running ads on Simpli.fi and see “Direct” user traffic in Google Analytics from a UTM code with the word “scrub” in it, or “Referral” user traffic from a source like eastads.simpli.fi, westads.simpli.fi, or centralads.simpli.fi, then those sessions are test clicks or ad preview clicks that occur during campaign setup and should be factored out of your Google Analytics data analysis. A quick filtering sketch follows below.
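If you export your GA traffic report to a CSV, something like the following can do that factoring for you. This is a rough sketch, not an official workflow: the file name ga_sessions.csv and the Source and Campaign column names are assumptions, so adjust them to match whatever your export actually contains.

```python
import pandas as pd

# Hypothetical GA export; substitute your real file and column names.
sessions = pd.read_csv("ga_sessions.csv")

# Referral sources used for Simpli.fi test/preview clicks, per the tip above.
test_sources = ["eastads.simpli.fi", "westads.simpli.fi", "centralads.simpli.fi"]

is_test_click = (
    sessions["Source"].isin(test_sources)
    | sessions["Campaign"].str.contains("scrub", case=False, na=False)
)

cleaned = sessions[~is_test_click]
print(f"Removed {is_test_click.sum()} test/preview rows; {len(cleaned)} rows remain.")
```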

Another important note about test clicks on Simpli.fi: the number of test clicks increases significantly whenever new creative is introduced to a campaign. If you are running a lot of creative and/or swapping it out more frequently, your test clicks will be higher than in a campaign with fewer creative elements and changes. This is because Simpli.fi, publishers, and Ethic Advertising Agency all conduct extra test clicks when new creative is applied to the campaign. These test clicks ensure the quality and functionality of the campaign, but they will result in even more false data reported in Google Analytics.

 

  3. Cookie Complexities: Google Analytics relies on cookies for data tracking. We’re not going to dive into how cookies and Google Analytics work with digital ads, because efficiency is one of Ethic’s core values and we found this great article with images to do that for us. What we will say about the subject is this: a large portion of digital programmatic advertising (primarily banner ads) appears in apps, including weather apps, games, news apps, and so on. While apps are a fantastic advertising resource, they can be a nightmare for cookie-based tracking because apps are cookie-free environments. Despite these limitations, in-app traffic is still recorded in Google Analytics no matter what, and GA makes an educated guess as to where that traffic came from.

Cookies can cause additional issues as well, such as double-counting users. If Google Analytics “cookies” a user from a UTM code and that user later comes back to the website, they will be tied to the last known campaign and counted twice in Google Analytics. This leads us to our second Google Analytics data scrubbing tip…

Data Scrubbing Tip #2: While you have the option to set up your digital programmatic campaigns to exclude in-app inventory, we don’t recommend it. We tested this with a multi-targeted banner ad campaign to see if it would help clean up data discrepancies in Google Analytics, and it didn’t make a significant difference. It did, however, hinder the success of the campaign because of the reduced inventory: the Cost Per Thousand (CPM) increased slightly (by 0.7%), the Cost Per Click (CPC) increased dramatically (by 132.9%), and the Click Through Rate (CTR) decreased (by 56.8%). As we like to say at our agency, “the juice was not worth the squeeze” when we removed in-app inventory.

 

 

  4. Data Language Barriers: By nature, Google Analytics and Simpli.fi speak two different data languages. Front-end DSPs like Simpli.fi focus primarily on performance before a user gets to a website (impressions, clicks, and view-through rates, for example). You are able to track some conversion data on the DSP side as well, but the rest of the DSP data focuses on pre-website activity. Google Analytics’ data, on the other hand, primarily focuses on what happens once a user is already on the website. A click doesn’t equal a user and vice versa, so you often cannot compare the two sources of data apples to apples.

 

Why Ad Clicks Are Not the Most Important Metric of Success, Anyway

We hope this write-up serves as a helpful guide to reading between the data lines. That said, when it comes to programmatic advertising, consider this: most people don’t even click on the ads! But that does not mean the advertising didn’t work. How? Let’s put it into perspective:

The national banner ad CTR is 0.05% to 0.08% (self-promoting sidenote: Ethic Advertising Agency tends to run 2x to 4x higher than this average). Using the national averages, this means that 99.92% to 99.95% of the time an ad is served, no one clicks on it. So, if we serve one million impressions and 999,500 of those impressions do not result in a click, does that mean those 999,500 impressions had no value? Of course they had value! Otherwise, any non-clickable format of advertising would be considered to have no value (e.g. TV, radio, print, out-of-home, transit, and even other digital options like OTT, digital audio, and more). It’s understandable that clients get so caught up in the click data that they forget the majority of the value happens outside of the direct clicks; but remember: having your message seen thousands or millions of times has loads of value!
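For anyone who wants to verify that math, here is a quick sketch using the low end (0.05%) of the national average CTR cited above:

```python
# Back-of-the-envelope math on the national banner ad CTR cited above.
impressions = 1_000_000
ctr = 0.0005  # 0.05%, the low end of the national average

clicks = impressions * ctr            # 500 clicks
unclicked = impressions - clicks      # 999,500 impressions seen but never clicked

print(f"{clicks:.0f} clicks, {unclicked:,.0f} impressions with no click")
```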

You may say, “But clicks are all we have to base the success of the campaign on.” We say: Bologna! Straight up Oscar Mayer bologna! There are many fancy tracking tools out there, but we’re going to give you two key (and free) data points to look at when assessing programmatic digital advertising success.

  • Paid and organic search traffic to your website. It is normal for users who see a digital ad to skip clicking it and instead do a Google search for the product or brand. We tend to see a 15%-50% increase in search traffic when we launch a digital programmatic campaign. Measuring and comparing the search (and direct) traffic tells a much bigger story.
  • Sales and lead trends (patterns and spikes). It is very rare that a customer will tell you they saw your banner ad, video pre-roll ad, OTT ad, or digital audio ad. While that is nice information to have when it happens, we don’t recommend customer polling as your only means of source tracking. What we really mean is to look for correlating spikes in website and store traffic, leads, and sales compared to the start of a digital (or any) advertising campaign, media shift, and/or creative change. You need to compare this information over a longer period to determine meaningful patterns. If you track this macro data and collaborate with your advertising partner on the information, a story of what is and isn’t working should emerge.

Additionally, while we always encourage clients to look at sales data as one key performance indicator, it’s important to note that closed sales are not always the best indicator of advertising success. Traffic and leads are truer metrics, because advertising does not sell a product or service; it merely gives the company an opportunity to sell its product or service (whether that’s through its website, sales teams, or other means). We find that if leads and website traffic are up but sales are not, then something deeper in the sales funnel, rather than the advertising, is usually what isn’t working.

 

Quick Case Study: Why Ad Clicks Aren’t the Most Important Metric of Success, Anyway

We have a new B2B client that we are running a geofencing display campaign for. They are super buttoned up and know their stuff, so the Google Analytics versus programmatic data was something they dove into right away. Here’s some of their campaign data, which further supports the notion that ad clicks aren’t the most important metric of success:

 

Essentially, they compared prospects who were not being targeted with the geofencing display campaign against prospects who were being targeted with our display campaign, and found that the prospects receiving our display ads were 4x more likely to turn into a lead. Here’s the kicker: they also saw a 60% increase in paid/organic search traffic to their website (+1,334 users) when comparing the duration of our campaign to the prior month. The comparison to the same period last year was even larger, with a 123.3% increase in total search traffic (+1,965 users). That is quite a large jump, and it correlates directly with the timing of our display campaign. It’s also a very similar story to other campaigns we’ve run.

 

Ethic Advertising Agency’s sole purpose is to get the best results possible for our clients. With that in mind, we work tirelessly to research and onboard some of the best advertising and creative services around. Feel free to reach out to us by emailing info@ethic-ads.com if you have any questions about this article.

 

Additional Resources:

 

 

Google Support Links:

Troubleshoot Google Ads clicks vs. Analytics sessions: https://support.google.com/analytics/troubleshooter/7400792