
Why GA4 Shows 13% of Your EU Traffic

8 min read · By Rafa Jimenez

Key Takeaways

  • GA4 captures approximately 13% of real EU traffic after three layers of data loss: consent rejection (55%), ad blockers (40%), and browser restrictions.
  • Even among the 45% who accept cookies, 65% accept on the second page view — after the landing page where the traffic source is captured. Only ~16% of visitors have correct attribution.
  • The cascade is multiplicative: 100 real visitors become ~45 after consent, ~27 after ad blockers, and ~13 after browser restrictions like Safari ITP.
  • Google Consent Mode v2 models missing data but cannot recover what was never collected — it estimates, not measures.
  • Cookieless analytics avoids all three layers by operating without cookies, third-party requests, or consent dependency.

Open GA4 right now and look at yesterday’s sessions. The number on your screen is not wrong, exactly. It is real data from real visitors. The problem is what it leaves out: roughly 87% of the people who actually visited your site.

This is not a bug. It is not a misconfiguration. It is the structural result of how cookie-based analytics works in the European Union in 2026. And understanding the math behind it is the first step toward fixing it.

Three layers of data loss

GA4 does not lose your data in one place. It loses it in three successive layers, each compounding on the last. The term for this cumulative erosion is data loss in analytics, and in the EU it follows a predictable cascade.

Start with 100 real visitors arriving at your site. By the time GA4 has processed them, you are left with approximately 13. Here is how each layer works.

Layer 1: Consent rejection removes 55%

Under GDPR and the ePrivacy Directive, any website using cookies for analytics must obtain explicit user consent before firing tracking scripts. The average consent rejection rate across EU markets is approximately 55%. In Germany, it regularly exceeds 65%. In the Netherlands, 60%.

When a visitor clicks “Reject” on your cookie banner, GA4 never loads. That visitor does not exist in your analytics. No pageview, no session, no event. They are invisible.

The deeper problem is that consent rejection is not random. Privacy-conscious users tend to be more tech-savvy, often have higher purchasing power, and are more likely to use premium devices. Losing them introduces a systematic bias into your data.

After this first layer: 100 visitors become approximately 45.

The hidden layer: cookies that arrive too late

There is a detail that makes the attribution problem even worse than the headline numbers suggest. Of the 45% of visitors who accept cookies, research shows that 65% accept starting from the second page view — not the first.

Why does this matter? Because the landing page is where the traffic source is recorded. The referrer URL, the UTM parameters, the campaign data — all of it is captured on the first page view. If cookies are not active on that page, the traffic source is never attributed.

The math: 45 visitors accept cookies. 65% of them (29 visitors) accept on page two or later. Only 35% of 45 — roughly 16 visitors out of every 100 — have cookies active on the landing page and therefore have their traffic source correctly attributed.

This means that even if you focus only on visitors who accept cookies, your attribution data is correct for just ~16% of total traffic. The rest are either invisible (55%) or visible but with unknown traffic origin (29%).
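To make the landing-page dependency concrete, here is a minimal, hypothetical sketch of first-pageview attribution: UTM parameters win, then the referrer domain, else "direct". The `traffic_source` helper and its parameter handling are illustrative only, not GA4's actual logic.

```python
from urllib.parse import urlparse, parse_qs

def traffic_source(landing_url: str, referrer: str) -> str:
    """Derive a traffic source the way a first-pageview tag would:
    UTM parameters win, then the referrer domain, else 'direct'."""
    params = parse_qs(urlparse(landing_url).query)
    if "utm_source" in params:
        return params["utm_source"][0]
    if referrer:
        return urlparse(referrer).netloc or "direct"
    return "direct"

# Consent granted on page 1: the campaign source is captured.
print(traffic_source("https://example.com/?utm_source=newsletter", ""))

# Consent granted on page 2: the landing URL and original referrer
# are no longer available, so the same visit collapses to 'direct'.
print(traffic_source("https://example.com/pricing", ""))
```

If the tag only starts firing on the second page view, it sees the second URL — stripped of UTM parameters, with an internal or empty referrer — and the visit is attributed to "direct" no matter which campaign actually brought it in.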

Layer 2: Ad blockers remove another 40%

Of the 45 visitors who accepted cookies, roughly 40% are running browser extensions that block analytics scripts. uBlock Origin, AdBlock Plus, Brave’s built-in shields, and dozens of similar tools all target gtag.js and the Google Analytics collection endpoint.

Unlike consent rejection, ad blocking is silent. The visitor accepted your cookie banner, they are browsing your site, they may even be converting — but GA4 never fires because the script was blocked before it could load.

Ad blocker adoption varies by market but continues to climb year-over-year. In technical audiences — software, SaaS, developer tools — the rate can exceed 60%.

After this second layer: 45 visitors become approximately 27.

Layer 3: Browser restrictions erode the rest

Safari’s Intelligent Tracking Prevention (ITP) blocks third-party cookies outright and caps client-side first-party cookies at 7 days — or just 24 hours when the visit arrives through a link decorated with tracking parameters. Firefox’s Enhanced Tracking Protection (ETP) applies similar restrictions. Together, these browsers account for approximately 35% of EU web traffic.

The effect is subtle but significant: returning visitors appear as new visitors because their identifier expired. Sessions fragment. Attribution chains break. A customer who visited your site five times over two weeks looks like five different people in GA4.

This does not eliminate visitors from your count entirely, but it distorts session data, inflates new-user metrics, and destroys multi-session attribution. Combined with the visitors already lost to consent and ad blockers, the remaining accurate data drops from 27 to roughly 13 out of every 100 actual visitors.
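The fragmentation effect is easy to simulate. Assume a visitor ID stored in a client-side cookie that expires after 24 hours (one of the ITP scenarios above, used here as a simplifying assumption), and count how many "users" the article's five-visits-in-two-weeks customer produces:

```python
from datetime import datetime, timedelta

COOKIE_TTL = timedelta(hours=24)  # assumed ITP cap on the client-side visitor ID

def count_apparent_users(visit_times):
    """Count how many 'new users' an analytics tool would report when
    the visitor ID expires COOKIE_TTL after it was last set."""
    apparent_users = 0
    expires_at = None
    for t in sorted(visit_times):
        if expires_at is None or t >= expires_at:
            apparent_users += 1          # ID expired: counted as a new user
            expires_at = t + COOKIE_TTL  # a fresh ID is set on this visit
        # else: the ID is still valid, so the visit is seen as returning
    return apparent_users

start = datetime(2026, 1, 5)
visits = [start + timedelta(days=d) for d in (0, 3, 6, 10, 13)]
print(count_apparent_users(visits))  # 5 — one real customer, five apparent users
```

Every gap longer than the cookie lifetime resets the identifier, so each of the five visits looks like a brand-new person and no multi-session attribution survives.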

The cascade

  • Real visitors: 100
  • After consent rejection (−55%): 45
  • Of those, with correct page-1 attribution: ~16
  • After ad blockers (−40%): 27
  • After browser restrictions: ~13 accurate

Approximate figures based on EU averages. Actual rates vary by market, audience, and consent banner design.
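The cascade is just chained multiplication, which a few lines of Python make explicit. The rates are the article's approximate EU averages; the final ~48% factor is not stated directly but is what the described 27 → 13 drop implies.

```python
def data_loss_cascade(real_visitors: int) -> dict:
    """Approximate EU averages from the article; actual rates vary by market."""
    after_consent = real_visitors * 0.45   # 55% reject the cookie banner
    page1_attrib = after_consent * 0.35    # only 35% of accepters consent on page 1
    after_adblock = after_consent * 0.60   # 40% of accepters block the script
    accurate = after_adblock * 0.48        # ITP/ETP erosion implied by 27 -> ~13
    return {
        "real": real_visitors,
        "after_consent": round(after_consent),
        "page1_attribution": round(page1_attrib),
        "after_adblock": round(after_adblock),
        "accurate": round(accurate),
    }

print(data_loss_cascade(100))
# after_consent: 45, page1_attribution: 16, after_adblock: 27, accurate: 13
```

Because each rate applies to the survivors of the previous layer, swapping in your own market's figures (say, Germany's 65% rejection rate) shrinks the final number faster than intuition suggests.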

Why this is not solvable inside GA4

Google’s answer to consent-based data loss is Consent Mode v2. When a visitor rejects cookies, Consent Mode sends “cookieless pings” to Google, which then uses machine learning to model the missing data and fill in the gaps.

This sounds promising until you examine what it actually produces. Consent Mode does not measure the visitors who rejected cookies. It estimates what those visitors probably did based on the behavior of visitors who accepted. The resulting numbers are modeled data, not measurement.

Modeled data is acceptable for high-level trends. It is not acceptable for campaign-level attribution, conversion path analysis, or revenue decisions. When Google tells you that “estimated conversions” from a campaign are 47, that number is a statistical projection, not a count of real events.

And Consent Mode does nothing about ad blockers or browser restrictions. If gtag.js never loads, no ping is sent — modeled or otherwise.

What complete data looks like

The alternative is analytics that does not depend on cookies, does not load client-side scripts that can be blocked, and does not require consent for basic measurement.

SealMetrics uses a cookieless, server-side approach. A lightweight first-party script (under 1 KB) collects events through your own domain. No cookies are set. No third-party requests are made. Because the data collection method does not fall under cookie consent requirements, it captures 100% of traffic — including the 87% that GA4 misses.
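As a rough illustration of the general pattern — not SealMetrics' actual implementation — a first-party collection endpoint can record a pageview and respond without ever setting a cookie. The `p`/`r` query parameters are invented for this sketch:

```python
from urllib.parse import parse_qs

def collect(environ, start_response):
    """Minimal WSGI collector sketch: logs a pageview from query
    parameters, sets no cookies, and returns an empty 204 response."""
    params = parse_qs(environ.get("QUERY_STRING", ""))
    hit = {
        "path": params.get("p", ["/"])[0],      # page path
        "referrer": params.get("r", [""])[0],   # document referrer
    }
    print("pageview:", hit)                     # stand-in for real storage
    start_response("204 No Content", [])        # no Set-Cookie header anywhere
    return [b""]
```

Because the endpoint lives on the site's own domain and the response carries no identifier, there is nothing for an ad blocker's third-party filter lists or a browser's cookie caps to act on — which is the structural point of the approach.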

You can see how the architecture works or calculate your own data loss based on your market and consent rates.

Is this unique to GA4?

To be fair: no. Any cookie-based analytics tool — Adobe Analytics, Piwik PRO in its default configuration, Matomo with cookies enabled — faces the same three-layer problem. GA4 is not uniquely bad. It is the most widely used tool that demonstrates a structural limitation shared by the entire category.

The detailed comparison between SealMetrics and GA4 covers pricing, data ownership, and compliance alongside data completeness. The 13% figure is the starting point, but it is not the only difference.