Measurement feels harder than it used to, and it is. Signals are disappearing, deterministic data is shrinking, platforms are relying more heavily on modeling, and customer journeys are fragmented across web, app, and everywhere in between. Meanwhile, AI is increasingly making optimization decisions with whatever data it’s fed.
And yet, expectations for performance haven’t changed. Teams are still on the hook to move fast, prove impact, and spend efficiently, even as the systems they’re operating within have fundamentally changed.
We recently hosted a MarketingDive webinar, The New Measurement Reality, featuring Branch experts Brett Orlanski and Harish Thimmappa — two people who’ve spent decades inside performance marketing and advertising, working directly with teams navigating exactly this shift.

A few clear patterns emerged: what's tripping teams up, and what the teams adapting well are doing differently.
Here are five lessons that stood out.
1. What looks like a performance problem is often a signal problem
In a modeled environment, instability is usually a signal issue, not a strategy issue. Platforms need a certain volume and consistency of data to optimize effectively. When they don’t have it, campaigns stall in the learning phase, producing volatile costs per install (CPIs), fluctuating conversion rates, and uneven delivery. The system isn’t necessarily broken, but it doesn’t have enough information yet.
Where teams get into trouble is reacting too quickly to that volatility. They pull back spend, shift channels, or reset campaigns, and end up paying for the learning phase over and over again without ever reaching the performance phase.
The adjustment here isn’t necessarily to spend more. It’s to spend with more focus: fewer channels, more concentrated signal, and enough runway for the system to actually learn.
2. Early performance is a learning phase, not a verdict
There’s a natural instinct to evaluate campaigns almost immediately. Performance marketers often look at day-one or week-one results and start making adjustments. That instinct made sense in a more deterministic world, where feedback loops were faster and more precise.
In today’s environment, early spend serves a different purpose: information gathering. During this phase, platforms are building an understanding of your users, your conversion signals, and how to find similar audiences at scale. That process takes time and requires a degree of stability to be effective. Swapping creative, changing targeting, or reallocating budget too early just sets the system back to the beginning.
3. Not all measurement challenges are external
It’s easy to look at the current landscape of privacy changes, signal loss, and increased modeling and assume that measurement issues are largely outside of your control. But a significant portion of inconsistency is still internal.
Small operational gaps compound quickly: events mapped slightly differently across platforms, attribution windows that don’t align, naming conventions that create duplication, time zones or currency settings that introduce subtle discrepancies. Individually, these seem minor. Together, they create just enough inconsistency to erode trust in the data. And once that trust breaks, decision-making either slows down or becomes reactive.
Getting your measurement house in order, according to Harish, means correctly implementing your SDK, mapping events accurately, cleaning up naming conventions, aligning attribution windows, and standardizing time zones and currency reporting. None of this eliminates complexity, but it makes it manageable.
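As a rough illustration of the kind of internal audit this checklist implies, here’s a minimal sketch, assuming you can export each platform’s settings into a dictionary. The platform names, event names, and setting values are all hypothetical; the point is simply that mismatches are mechanical to find once the configs sit side by side:

```python
# Hypothetical per-platform measurement settings, as you might collect
# them from each dashboard. All names and values are illustrative.
platform_configs = {
    "Platform A": {"purchase_event": "purchase",
                   "attribution_window_days": 7,
                   "timezone": "UTC", "currency": "USD"},
    "Platform B": {"purchase_event": "Purchase_Complete",
                   "attribution_window_days": 28,
                   "timezone": "America/Los_Angeles", "currency": "USD"},
    "Platform C": {"purchase_event": "purchase",
                   "attribution_window_days": 7,
                   "timezone": "UTC", "currency": "EUR"},
}

def audit(configs):
    """Flag every setting that isn't identical across all platforms."""
    mismatches = []
    keys = next(iter(configs.values())).keys()
    for key in keys:
        values = {name: cfg[key] for name, cfg in configs.items()}
        if len(set(values.values())) > 1:  # more than one distinct value
            mismatches.append((key, values))
    return mismatches

for key, values in audit(platform_configs):
    print(f"Mismatch in {key}: {values}")
```

Run against the sample configs above, every one of the four settings comes back flagged, which is exactly the quiet inconsistency the section describes: no single setting is obviously wrong, but together they guarantee the numbers won’t reconcile.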
In many cases, what looks like “measurement chaos” is simply a lack of internal alignment. And that’s fixable.
4. Your dashboards aren’t broken, they’re just different
One of the most persistent sources of frustration right now is discrepancy between platforms. Numbers don’t line up, doubt creeps in, and teams hesitate or overcorrect.
Here’s what’s actually happening: Each platform is operating with a different view of the world. They have access to different signals, apply different attribution windows, and rely on different levels of modeling. Their reporting will never fully match, and some level of discrepancy is expected, even by design.
That doesn’t mean anything goes. Large variances — say, beyond 10% — can still indicate misalignment or technical issues worth investigating. But the goal isn’t perfect agreement. High-performing teams look for patterns across platforms rather than exact matches within them. They ask whether trends are moving in the same direction, whether cohorts behave similarly over time, and whether spend is being recouped within a meaningful window.
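The two checks described above, a rough variance tolerance and a trend-direction comparison, can be sketched in a few lines. The weekly install counts here are invented, and the ~10% threshold is the illustrative figure from the webinar, not a universal rule:

```python
# Invented weekly install counts reported by two platforms.
reports = {
    "Platform A": [1000, 1150, 1300],
    "Platform B": [940, 1080, 1210],
}

def variance_pct(a, b):
    """Percent difference between two totals, relative to their mean."""
    mean = (a + b) / 2
    return abs(a - b) / mean * 100

def same_direction(series_a, series_b):
    """True if week-over-week changes move in the same direction."""
    deltas = lambda s: [later - earlier for earlier, later in zip(s, s[1:])]
    return all((da > 0) == (db > 0)
               for da, db in zip(deltas(series_a), deltas(series_b)))

a, b = reports["Platform A"], reports["Platform B"]
gap = variance_pct(sum(a), sum(b))
verdict = "investigate" if gap > 10 else "expected modeling noise"
print(f"Total variance: {gap:.1f}% -> {verdict}")
print(f"Trends aligned: {same_direction(a, b)}")
```

With these numbers the totals differ by about 6.6%, inside tolerance, and both series trend upward, so a high-performing team would read this as agreement, even though no individual number matches.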
5. The real cost of weak measurement is compounding inefficiency
The impact of weak measurement rarely shows up as a single, obvious problem. It shows up as a series of small inefficiencies that compound over time — what Brett calls the 5 Hidden Taxes:
- The misattribution tax: Missing context across channels of influence can lead to over-investing in some channels and underfunding others.
- The broken journeys tax: Links that take users to the wrong place reduce conversion rates and interrupt the customer experience.
- The optimization lag tax: Measurement-to-response gaps delay learning and optimization, which causes budget shifts to happen weeks late and performance to plateau.
- The missing signals tax: Improper event mapping means important events (trial start, store visit, add-to-cart, subscription step) aren’t tracked or passed back to media.
- The misallocation tax: When touchpoints aren’t deduplicated, credit shifts to the wrong channels and budgets follow.
Individually, each of these might seem manageable. Together, they create a meaningful drag on performance — and none of them get solved by spending more. They’re solved by improving the quality and consistency of your measurement foundation.
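To make the misallocation tax concrete, here’s a minimal sketch of deduplicating conversion claims before assigning channel credit. The touchpoint log, channel names, and simple latest-timestamp (last-touch) rule are all illustrative assumptions, not how any particular attribution product works:

```python
from collections import Counter

# Invented claim log: each platform claims credit for a conversion.
# Conversion c2 is claimed twice; without dedup, both channels get credit.
claims = [
    {"conversion_id": "c1", "channel": "paid_social", "ts": 100},
    {"conversion_id": "c2", "channel": "paid_social", "ts": 200},
    {"conversion_id": "c2", "channel": "paid_search", "ts": 250},
    {"conversion_id": "c3", "channel": "email",       "ts": 300},
]

def credit_by_channel(claims, dedupe=True):
    """Count conversions per channel. With dedupe, keep only the
    latest-timestamp claim per conversion (a simple last-touch rule)."""
    if dedupe:
        latest = {}
        for claim in claims:
            cid = claim["conversion_id"]
            if cid not in latest or claim["ts"] > latest[cid]["ts"]:
                latest[cid] = claim
        claims = latest.values()
    return Counter(c["channel"] for c in claims)

print("Naive:  ", credit_by_channel(claims, dedupe=False))
print("Deduped:", credit_by_channel(claims, dedupe=True))
```

The naive count awards four credits for three conversions, inflating paid_social; after dedup, each conversion is credited once. Scaled up across channels and weeks, that over-count is precisely the drag that budget follows.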
The takeaway
Measurement hasn’t broken, but the assumption that you could rely on perfectly aligned, deterministic answers no longer holds. What’s replacing it is a system that requires a different approach: more patience, more structure, and a clearer understanding of what signals actually matter.
The teams adapting fastest are the ones who have made that shift. They’re focusing their budgets instead of fragmenting them, giving platforms time to learn, fixing what they can control internally, and making decisions based on directional confidence rather than perfect certainty.
