Most ecommerce teams use GA4 to answer three questions: How many sessions did we get? What was the conversion rate? How much revenue did we make? That’s fine. But it’s roughly 10% of what GA4 can tell you.
The real power of GA4’s event model isn’t in the headline KPIs. It’s in the behavioural layer underneath them — the micro-interactions that reveal exactly why customers are or aren’t converting, often months before that shows up in revenue numbers.
I recently completed a deep-dive GA4 audit for a major UK baby and nursery retailer.
What we found changed the entire framing of their performance decline — and surfaced a set of product and UX fixes that no amount of revenue dashboarding would have found.
Here’s what we learned, and why it matters for any ecommerce team thinking seriously about GA4.
The Gap Between Sessions and Story
When you look at a site showing sessions up 3.9% but revenue down 2.6%, the instinct is to say “conversion problem” and brief the UX team. That’s not wrong — but it’s not the full picture. GA4 event data lets you decompose that conversion problem into specific, actionable failure points. Instead of “conversion is down,” you can say:
Customers are reaching product pages but not clicking on anything (-77.7% PDP interactions year on year)
Customers are adding to cart in record numbers but abandoning before delivery information (+268% view_cart events with declining purchases)
Customers are reading reviews nearly four times as often as last year because there aren’t enough reviews to satisfy them before buying (+277% readReviews events)
The checkout is actively failing at more than 13× the rate of the previous year (+1,240% orderFailed events)
Each of those is a different fix. Aggregated into “conversion rate,” they’re invisible.
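That decomposition is mechanical once the event counts are in hand. A minimal sketch, using made-up event names and counts (the real audit pulled these from GA4 exports; none of these figures are the client's actual data):

```python
# Hypothetical sketch: turn raw YoY event counts into the per-event
# deltas quoted above. Counts are illustrative, not real audit data.
def yoy_delta(prev: int, curr: int) -> float:
    """Percentage change year on year."""
    return (curr - prev) / prev * 100

events_2025 = {"pdp_interaction": 90_000, "view_cart": 25_000, "readReviews": 8_000}
events_2026 = {"pdp_interaction": 20_070, "view_cart": 92_000, "readReviews": 30_160}

deltas = {name: round(yoy_delta(events_2025[name], events_2026[name]), 1)
          for name in events_2025}
print(deltas)  # per-event YoY change, e.g. view_cart +268.0%
```

The output is a per-event diagnosis rather than a single blended conversion rate: one number per failure point, each pointing at a different owner and a different fix.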
What Behavioural Events Actually Tell You
The beauty of GA4’s event model is that every interaction can be tracked: every filter applied, every review opened, every video watched, every wishlist add, every failed payment. When you analyse these in aggregate, you stop guessing at customer intent and start reading it directly.
Here’s what five specific event patterns told us, and what action each one implied:
Event 1 – filterProducts +237% and sortProducts +244%
When customers are overriding the default PLP sort and applying filters at dramatically higher rates, they’re telling you the default experience isn’t showing them what they want. This isn’t a minor UX preference — it’s evidence of a discovery failure at scale. In this case, it pointed directly to a misconfigured search layer (Algolia was installed but not set up with synonym groups, query rules, or a furniture-relevant default sort). The fix was configuration, not development. The event data made it visible.
Event 2 – readReviews +277%
This is one of the most diagnostic events you can track. When customers are reading reviews at nearly four times the rate of the previous year, it means purchase confidence is lower — they need more social proof before they’ll commit. In the furniture category we were auditing, 86% of SKUs had zero Feefo reviews. One nursery set, with 96 reviews, was the best-converting furniture SKU on the site. The correlation between review count and conversion rate was direct and measurable. Without the readReviews event, this would have been invisible.
Event 3 – video_complete +399%
Customers who found video content on product pages watched it to completion at nearly five times the rate of the year before. The catch: almost no furniture PDPs had video content. This event data is simultaneously a signal of unmet demand and a roadmap item — brief the content team to produce nursery lifestyle video for the top ten hero SKUs. The audience is already there, waiting.
Event 4 – wishlist adds up, wishlist removals up even more
This combination — more wishlisting, much more un-wishlisting — is the behavioural signature of deferral. Customers are saving products, returning to reconsider, and removing them without purchasing. This is classic high-AOV purchase behaviour: the decision cycle is long, the consideration is real, but something is preventing commitment at the final step. For a nursery furniture retailer, this pattern points directly to the need for guided selling tools and nurture email flows that support the consideration journey rather than treating every session as a one-shot conversion opportunity.
Event 5 – orderFailed +1,240% (184 events in 2025, 2,465 in 2026)
This one is the most urgent finding in any event audit. A more than thirteenfold increase in checkout failure events isn’t a UX problem — it’s a systemic technical failure that was introduced at some point during 2026 and went undetected because it doesn’t show up in the standard conversion rate dashboard in a way that screams “emergency.” The revenue being lost to checkout failures every single day can be estimated directly from the orderFailed event count multiplied by average order value. Without this event, you’d know sales were soft. With it, you know exactly where they’re being lost.
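The revenue-at-risk arithmetic is simple enough to sketch directly. The event counts below are the ones from the audit; the average order value and the share of failed orders that would otherwise have completed are assumptions for illustration, not audited figures:

```python
# Back-of-envelope revenue-at-risk estimate from orderFailed counts.
# Event counts are from the audit; AOV and recovery rate are assumed.
order_failed_2025 = 184
order_failed_2026 = 2_465
aov_gbp = 250          # assumed average order value
recovery_rate = 0.6    # assume 60% of failed orders would have completed

excess_failures = order_failed_2026 - order_failed_2025
revenue_at_risk = excess_failures * aov_gbp * recovery_rate
yoy_increase_pct = (order_failed_2026 - order_failed_2025) / order_failed_2025 * 100
print(round(yoy_increase_pct), round(revenue_at_risk))
```

Swap in your own AOV and a recovery-rate assumption you can defend, and the output becomes the "£X in daily lost revenue" figure stakeholders need to prioritise the fix.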
The Tracking Gap That Was Costing More Than Any UX Problem
Buried in the audit was a finding that reframed the entire analysis.
The key event was firing on page load — meaning every session registered as a key event, producing a 100% key event rate across all sessions. To GA4, that looks like the business is performing extraordinarily. To Google’s Smart Bidding algorithms, which use key events as the signal for bidding optimisation, it means the algorithm is optimising toward sessions rather than purchases.
This single misconfiguration means every paid campaign on the account has been bidding on the wrong signal. The paid media budget has been optimising for arriving on the site, not for buying from it. This isn’t a GA4 problem — it’s a GTM configuration problem. But you only find it by interrogating the event data rather than accepting the headline metrics at face value.
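The check itself is trivial once you think to run it. A minimal sketch, with the 2–8% healthy range being a rule-of-thumb assumption for ecommerce rather than an official Google threshold:

```python
# Sanity check on key event rate. The 2-8% "healthy" band is a
# rule-of-thumb assumption for ecommerce, not an official threshold.
def key_event_health(sessions: int, key_event_sessions: int) -> str:
    """Flag a key event rate that suggests the tag fires on page load."""
    rate = key_event_sessions / sessions
    if rate > 0.10:
        return "misconfigured: key event likely firing on page load"
    if 0.02 <= rate <= 0.08:
        return "healthy"
    return "review: rate outside typical range"

print(key_event_health(1_000_000, 1_000_000))  # every session converts? no.
print(key_event_health(1_000_000, 35_000))
```

A 100% rate, as found here, is the single loudest signal in the whole audit: it means Smart Bidding has been fed "arrived on site" as the optimisation target.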
Similarly, on-site search showed zero revenue attributed to any search term across both years of data. Not low revenue — zero. Across all 75,707 unique search terms, generating over one million sessions between them, there was no connection between the search journey and the purchase event. This means the team had no visibility into whether search was converting, which search terms were driving revenue, or how to optimise the Algolia configuration for commercial outcomes. The tracking gap was larger than any individual UX problem.
The Four Questions GA4 Event Tracking Answers That Dashboards Don’t
Having gone through this process, here’s how I’d frame the value of proper GA4 event instrumentation for any ecommerce team:
Where exactly is the journey breaking? Session-to-purchase is a long chain. Events let you see precisely which link is snapping — not just that the chain broke. add_shipping_info declining -31% while begin_checkout is stable tells you the problem is at the delivery information step, not at cart creation or payment.
What are customers trying to do that the site won’t let them? High filter and sort usage, search term fragmentation, and readReviews spikes are all signals of unmet intent. The customer wants something; the site isn’t giving it to them. Events make this legible at scale.
What’s working that you should do more of? video_complete at +399% is an instruction: produce more video. High ATC rates on specific SKUs point to the content and pricing architecture that’s resonating. Events surface the things worth scaling, not just the things worth fixing.
Are there systemic failures hiding below the revenue line? orderFailed, gtm.pageError, and 100% key event rates are all invisible in revenue dashboards. They’re only visible in the event stream. Regular event audits — quarterly at minimum — should be standard practice for any ecommerce business running above a few million in annual revenue.
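The first of those four questions — which link in the chain is snapping — is also the easiest to automate. A sketch using GA4's standard ecommerce event names with invented counts (the -31% add_shipping_info figure in the example data mirrors the pattern described above; none of these counts are real):

```python
# Sketch: locate the weakest funnel step by YoY change per event.
# Event names are GA4 standard ecommerce events; counts are invented.
funnel_2025 = {"view_item": 500_000, "add_to_cart": 80_000,
               "begin_checkout": 40_000, "add_shipping_info": 29_000,
               "purchase": 20_000}
funnel_2026 = {"view_item": 510_000, "add_to_cart": 85_000,
               "begin_checkout": 40_200, "add_shipping_info": 20_000,
               "purchase": 15_000}

changes = {step: round((funnel_2026[step] - funnel_2025[step])
                       / funnel_2025[step] * 100, 1)
           for step in funnel_2025}
worst = min(changes, key=changes.get)  # sharpest YoY decline
print(worst, changes[worst])
```

Here begin_checkout is flat while add_shipping_info collapses, which localises the problem to the delivery information step — exactly the kind of precision "conversion is down" can never give you.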
What This Means for Roadmap Building
The most valuable outcome of a proper GA4 event audit isn’t a list of things that are broken. It’s a prioritised, evidence-backed roadmap where every action is anchored to a specific data signal.
“We should add more reviews” becomes “we should launch a Feefo incentive programme targeting the 30 highest-traffic zero-review furniture SKUs, because readReviews tripled and the best-reviewed product converts at 2× the category average.”
“We should improve checkout” becomes “we should diagnose and fix the orderFailed event which has risen 1,240% year on year, representing an estimated £X in daily lost revenue at our average order value.”
“We should invest in content” becomes “video_complete events rose 399% but only 3% of furniture PDPs have video — brief the content team against a priority list of the top ten SKUs by view volume.”
The difference isn’t just precision. It’s that stakeholders can see the evidence behind the recommendation, understand the commercial case for it, and make informed trade-offs about sequencing and investment. That’s a fundamentally different conversation than “our conversion rate is down, let’s fix the UX.”
A Note on What Good Looks Like
Before running this audit, I’d suggest any ecommerce team ask themselves four questions about their current GA4 setup:
Is your key event rate between 2% and 8%? If it’s above 10%, your Smart Bidding is likely misconfigured.
Does your on-site search data show revenue attributed to search terms? If not, you have a tracking gap that’s costing you optimisation visibility.
Are you tracking orderFailed as a custom event? If not, you’re flying blind on checkout technical failures.
Are you reviewing event volumes YoY, not just conversion rates? A 77% drop in PDP interaction events is invisible in a conversion dashboard but unmissable in an event audit.
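Those four questions compress naturally into a pre-audit checklist. A sketch with the thresholds and flag wording as assumptions of mine, not a standard:

```python
# The four setup questions above as a single checklist function.
# Thresholds and flag text are illustrative assumptions.
def ga4_setup_flags(key_event_rate: float,
                    search_revenue: float,
                    tracks_order_failed: bool,
                    reviews_event_volumes_yoy: bool) -> list[str]:
    flags = []
    if not 0.02 <= key_event_rate <= 0.08:
        flags.append("key event rate outside 2-8%: check Smart Bidding signal")
    if search_revenue == 0:
        flags.append("zero revenue attributed to search terms: tracking gap")
    if not tracks_order_failed:
        flags.append("orderFailed not tracked: blind to checkout failures")
    if not reviews_event_volumes_yoy:
        flags.append("no YoY event-volume review: large behavioural shifts unseen")
    return flags

print(ga4_setup_flags(1.0, 0.0, False, False))  # worst case: all four flags
```

An empty list means your setup clears the baseline; anything else is the starting scope for the audit itself.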
GA4’s event model is genuinely powerful. Most teams are using about a tenth of it.
This article draws on findings from a Q1 2026 GA4 audit conducted for a UK baby and nursery retailer. All statistics are drawn from anonymised GA4 data.