
The attribution lie
Every attribution model - last-click, data-driven, even Google's new "AI-powered" version - answers the wrong question. They tell you which touchpoints correlated with conversions, not which ones caused them.
The result: most teams overspend on bottom-funnel retargeting and brand search, and starve the top-funnel channels that actually drive new demand.
What works in 2026
Two methods, used together:
- Marketing Mix Modeling (MMM). Statistical modeling on aggregated weekly spend and revenue data. Privacy-safe, works without cookies. Open-source tools like Meta's Robyn and Google's Meridian have made it accessible to teams without a data science department.
- Geo-based incrementality tests. Switch off (or scale up) a channel in matched geographies and measure the lift. The gold standard for proving causal impact.
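The geo test's math is just a difference-in-differences: use the control geos' trend to project what the test geos would have done, then compare against what actually happened. A minimal sketch in Python - all numbers and names here are illustrative, not from any real test:

```python
def incremental_lift(test_pre, test_post, control_pre, control_post):
    """Difference-in-differences estimate of a channel's lift.

    Projects the test geos' post-period from the control geos' trend,
    then measures how far actual results fell short of (or beat) that
    counterfactual. Inputs are conversion counts per period.
    """
    # Counterfactual: what test geos would have done with no change,
    # assuming they follow the same trend as the matched control geos
    expected_test_post = test_pre * (control_post / control_pre)
    lift = test_post - expected_test_post
    return lift, lift / expected_test_post

# Hypothetical pause test: channel switched off in test geos.
# Control geos held flat; test geos dropped from 1000 to 850.
lift, pct = incremental_lift(
    test_pre=1000, test_post=850,
    control_pre=1200, control_post=1200,
)
print(f"Lift: {lift:.0f} conversions ({pct:.0%})")  # negative = channel was driving volume
```

A negative lift in a pause test is the channel's incremental contribution: here, roughly 150 weekly conversions (15%) disappeared when spend stopped. Real tests need matched-market selection and significance checks (tools like GeoLift handle this), but the core logic is this simple.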
A pragmatic measurement stack
For most growth-stage brands:
- GA4 + server-side tagging as the baseline event layer.
- MMM run quarterly (Robyn or a vendor) to set channel budget allocations.
- One incrementality test per quarter on the channel you're least sure about (usually Display, YouTube, or Influencer).
- Multi-touch attribution as a directional signal, not a budget decision tool.
What you'll discover
Most teams that run their first incrementality test find:
- Brand search: 60-80% of attributed conversions would have happened anyway.
- Retargeting: only 30-60% of conversions are truly incremental - far lower than platform-reported ROAS implies.
- YouTube and influencer: higher true contribution than last-click attribution shows.
This usually triggers a 20-40% reallocation of budget from bottom-funnel to mid- and top-funnel channels - and a step-change in growth.
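The reallocation logic follows directly: discount each channel's platform-reported ROAS by its measured incrementality before comparing channels. A hypothetical sketch, with made-up figures in the ranges above:

```python
# Illustrative only: channel figures are hypothetical examples, not benchmarks.
channels = {
    # channel: (platform-reported ROAS, measured incrementality fraction)
    "brand_search": (8.0, 0.25),   # most conversions would have happened anyway
    "retargeting":  (5.0, 0.45),   # inflated by view-through credit
    "youtube":      (1.5, 1.20),   # >1.0: last-click under-credits it
}

for name, (roas, incrementality) in channels.items():
    true_roas = roas * incrementality
    print(f"{name:12s} reported {roas:.1f}x -> incremental {true_roas:.2f}x")
```

On these numbers, brand search's apparent 8x collapses to 2x while YouTube's apparent 1.5x rises to 1.8x - the channels swap rank order, which is exactly the kind of finding that drives the 20-40% reallocation.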
Start small
You don't need a six-month MMM project. Start with one geo holdout test on the channel you suspect is the most overrated. Within four weeks you'll have a number that changes how you spend.


