The Future of Measurement

Introduction

I was recently sitting with an exasperated technology executive at a well-known quick-serve restaurant chain when he said something I think sums up the frustration of anyone in digital marketing: “Why can’t we just measure things accurately?” Truthfully, I was at a loss. Beyond a pandering, evasive-sounding “It’s complicated,” how do you convey a decade of technology and policy evolution, along with the context needed to understand the signal degradation that is pulling measurement efficacy backward?

The truth is that privacy policies make marketing measurement harder and more opaque by design. Ever since that conversation, I’ve been ruminating on how to succinctly summarize the state of our industry. The closest I’ve heard is: “Measurement is the past; the future is signal.” But beyond pithy statements, executives need context to understand how to prepare for industry changes in marketing measurement. This article discusses current market trends and considers the likely future of digital marketing measurement.

Seven industry trends

1. The self-attributing network (SAN) starts to grade its own homework

Starting in 2017, Meta — then Facebook — rolled out a new concept of advertising reporting, naming itself a self-attributing network (SAN). As last-touch attribution became the default methodology for marketing measurement, Facebook decided it needed to show advertisers its value to acquisition beyond simply being the platform that last touched the consumer. So Facebook started to “self-report” whether it had influenced (i.e., shown an ad to) a user within a lookback window. Google and Twitter quickly followed suit. SAN reporting clearly has merit: Facebook can and does influence customer acquisition, even when it isn’t credited with the last touch. But self-reporting also comes with drawbacks. When these leading ad platforms — which make up approximately 80% of ad spend growth — report their own outcomes, you invariably have multiple platforms claiming credit for a single customer. That raises the question: How do you measure each platform’s efficacy if they all claim ultimate responsibility for the sale? This paradigm left advertisers with the dubious task of deciding how to weigh the output of SAN reporting.

2. User privacy creates the concept of platform attribution

Announced in 2020 and enforced beginning in 2021, Apple’s App Tracking Transparency (ATT) framework started the deprecation of the Identifier for Advertisers (IDFA) for deterministic user tracking on iOS. In 2024, Google Chrome started to deprecate third-party cookies, beginning a slow march that will eliminate deterministic tracking of users on the web. Android’s deprecation of the Google Advertising Identifier (GAID) will soon follow. This represents a fundamental shift in how digital advertising works: advertisers can no longer deterministically track a user across the digital landscape.

To meet the industry’s need for a new method of measurement, Apple and Google have introduced their own attribution methodologies: SKAdNetwork (SKAN) and Google Privacy Sandbox, respectively. In short, these methodologies return aggregate attribution results that preserve user privacy. Because the privacy controls obfuscate individual identifiers, the output is, by design, incomplete and imperfect, leaving the advertiser with broad, aggregated results that increase marketing measurement and optimization uncertainty.
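
To give a flavor of what aggregate, privacy-preserving attribution means in practice, here is a minimal sketch: conversions are grouped by campaign, exact values are coarsened into buckets, and any cohort below a crowd-anonymity threshold is suppressed entirely. The field names, threshold, and buckets are illustrative assumptions, not the actual SKAN or Privacy Sandbox specifications.

```python
# Illustrative sketch of aggregate, privacy-preserving attribution reporting.
# The field names, the crowd-anonymity threshold, and the coarse value buckets
# are assumptions for illustration, not the actual SKAN or Privacy Sandbox specs.
from collections import defaultdict

ANONYMITY_THRESHOLD = 25        # suppress campaigns with too few conversions
VALUE_BUCKETS = [0, 5, 20, 50]  # coarse revenue buckets instead of exact values

def bucket_value(revenue):
    """Return the largest bucket floor that does not exceed the exact revenue."""
    return max(b for b in VALUE_BUCKETS if b <= revenue)

def aggregate_report(conversions):
    """conversions: list of dicts like {"campaign": "c1", "revenue": 12.99}."""
    grouped = defaultdict(lambda: {"count": 0, "bucketed_revenue": 0})
    for c in conversions:
        row = grouped[c["campaign"]]
        row["count"] += 1
        row["bucketed_revenue"] += bucket_value(c["revenue"])
    # Small cohorts are dropped entirely -- the advertiser never sees them.
    return {
        campaign: row
        for campaign, row in grouped.items()
        if row["count"] >= ANONYMITY_THRESHOLD
    }
```

The advertiser sees only campaign-level counts and coarse values, which is precisely why optimizing against these reports is harder than it was against user-level data.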

3. SANs develop ML-driven advertising

In the wake of user ID deprecation, Facebook lost $12B in ad revenue, mainly because it lost efficacy in driving advertiser return on investment on the iOS platform. In just two years, Meta and Google — the undisputed leaders in the advertising space — rebounded by building targeted advertising driven by machine learning (ML) to overcome, or at least mitigate, privacy-centric issues. This was a fundamental shift for these companies; Meta’s Advantage+ Shopping Campaigns and Google’s Performance Max are different from traditional performance media products in that they require much less input, such as optimization and targeting, from advertisers.

A byproduct of these ML-driven products is the need for new data streams. Both Meta and Google have released APIs — Meta’s Aggregated Event Measurement and Conversions API, and Google’s GBRAID — that allow them to receive rich, privacy-centric data signals from the advertiser, helping their algorithms react and optimize campaigns. These APIs are game changers for the platforms and their customers, because they provide optimization signals that are far more effective than privacy-centric attribution methodologies such as SKAN and Privacy Sandbox.
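
To illustrate the general pattern behind these server-side signals, the sketch below normalizes and SHA-256 hashes a customer email before attaching it to a conversion event, so raw identifiers never leave the advertiser. The payload shape and field names are hypothetical placeholders, not the actual schema of Meta’s Conversions API or any Google API.

```python
# Hypothetical server-side conversion event, sketched to show the general
# pattern: identifiers are normalized and hashed before leaving the advertiser.
# The payload fields below are placeholders, not a real platform API schema.
import hashlib
import json
import time

def hash_identifier(value: str) -> str:
    """Normalize and SHA-256 hash a user identifier (e.g., an email address)."""
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def build_conversion_event(email: str, event_name: str, value: float, currency: str) -> dict:
    return {
        "event_name": event_name,
        "event_time": int(time.time()),
        "user_data": {"hashed_email": hash_identifier(email)},  # never the raw email
        "custom_data": {"value": value, "currency": currency},
    }

if __name__ == "__main__":
    event = build_conversion_event("Jane.Doe@example.com", "purchase", 42.50, "USD")
    print(json.dumps(event, indent=2))  # in production, POSTed to the ad platform
```

The key design point is that hashing happens on the advertiser’s side, so the raw identifier never crosses the wire.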

4. Advertisers increasingly use broad-stroke modeling

Another consequence of decreasing signal, increasing confusion, and multiple sources of truth is that advertisers are increasingly leaning on macro modeling to understand the holistic impact of their marketing efforts. These statistical models use broad inputs, like marketing spend and earnings, to provide comprehensive outcome predictions. Media mix modeling (MMM), while certainly not a new methodology, is the most widely explored example and represents an interesting companion to more granular, channel-specific methodologies.

Traditionally, these broad-stroke models were used primarily for budgeting and benchmarking, requiring heavy lifting and refinement on a yearly or quarterly cadence. But thanks to the ready availability of low-cost compute resources, and with privacy factors eroding measurement efficacy, these methods are receiving renewed attention for more practical performance applications. Leading companies use the outputs of these broad-stroke statistical models to supplement and sanity-check deterministic measurement methods as those methods erode.
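
To make the mechanics concrete, here is a minimal sketch of the kind of regression that sits underneath a media mix model: each channel’s spend is “adstocked” to capture lagged effects, then revenue is regressed on the transformed spend. The simulated data, the decay rate, and the plain least-squares fit are illustrative assumptions; production MMMs are considerably more sophisticated (saturation curves, seasonality, Bayesian priors).

```python
# Minimal media-mix-style regression: revenue ~ adstocked spend per channel.
# The simulated data, the adstock decay rate, and the plain least-squares fit
# are illustrative assumptions; production MMMs are far more sophisticated.
import numpy as np

def adstock(spend, decay=0.5):
    """Carry a decaying fraction of past spend forward to model lagged effects."""
    carried = np.zeros_like(spend, dtype=float)
    for t, x in enumerate(spend):
        carried[t] = x + (decay * carried[t - 1] if t > 0 else 0.0)
    return carried

rng = np.random.default_rng(0)
weeks = 104
spend = {
    "search": rng.uniform(10_000, 50_000, weeks),
    "social": rng.uniform(5_000, 40_000, weeks),
    "tv": rng.uniform(0, 80_000, weeks),
}

# Design matrix: an intercept column plus one adstocked column per channel.
X = np.column_stack([np.ones(weeks)] + [adstock(s) for s in spend.values()])
true_effects = np.array([200_000, 1.8, 1.2, 0.4])          # baseline + per-channel lift
revenue = X @ true_effects + rng.normal(0, 25_000, weeks)   # simulated outcome

coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)
for name, c in zip(["baseline"] + list(spend), coef):
    print(f"{name:>8}: {c:,.2f}")
```

Even this toy version shows why MMM appeals when user-level signal erodes: it needs only aggregate spend and outcome data, with no individual identifiers.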

5. Advertisers are developing in-house data analysis capabilities

Forward-looking organizations recognize the added market complexity and are developing new expertise within their marketing performance teams. The role of this so-called marketing economist is to synthesize multiple data sources and help arrive at a trusted outcome. This expertise isn’t developed easily, and it certainly doesn’t happen overnight. Much as teams focused solely on mobile user acquisition didn’t exist 15 years ago, this is a new paradigm. A few leading companies have already started investing in-house to build out teams with this expertise; the rest will outsource it. Over time, most players will develop the expertise in-house, while technology platforms will grow to offer tools and applications that support this emerging discipline.

6. The deprecation of cookies will combine web and mobile app measurement

Mobile web measurement has traditionally remained at arm’s length from mobile app measurement. Generally, marketing channels on the web and in mobile apps have stayed very distinct, often using different tools to measure marketing outcomes across channels.

However, with Google’s deprecation of cookies, measurement of these channels will slowly be combined. Google Privacy Sandbox, the technology and framework Google introduced as an alternative attribution methodology, provides a single API that functions across both platforms. This is the first large-scale unification of digital measurement across channels and an early harbinger of more holistic measurement.

7. Data clean rooms and other PETs emerge

Data clean rooms aren’t new; their purpose is to combine and share information in a controlled way (i.e., to match user overlap without revealing user identities). At its core, a data clean room is a data warehouse that restricts the granularity and types of information that can be accessed. In essence, the applications of a clean room are about as broad as those of a data warehouse, but in practice they are the cases where restrictions on data access are necessary.

The most common application is between an advertiser and a publisher, where neither party wants to reveal the identities of its users. A clean room allows an advertiser to understand, serve to, or report on overlapping users. In most cases, both the advertiser and the publisher have access to first-party data (e.g., an email address) and, through hashing, can match the overlap of these users without revealing their identities.
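
As a toy illustration of that matching step, the sketch below assumes both parties hash the same normalized identifier in the same way and compare only the hashes, so neither side sees the other’s raw emails. Real clean rooms layer much stricter controls (query restrictions, aggregation thresholds, audit trails) on top of this basic idea.

```python
# Toy illustration of the overlap-matching idea behind a clean room:
# both parties hash the same normalized identifier and compare hashes,
# so neither side sees the other's raw emails. Real clean rooms enforce
# far stricter controls (query restrictions, aggregation thresholds).
import hashlib

def hashed(emails):
    return {hashlib.sha256(e.strip().lower().encode("utf-8")).hexdigest() for e in emails}

advertiser_emails = ["ann@example.com", "bob@example.com", "cho@example.com"]
publisher_emails = ["Bob@Example.com", "dee@example.com", "cho@example.com"]

overlap = hashed(advertiser_emails) & hashed(publisher_emails)
print(f"Overlapping users: {len(overlap)}")  # reported as a count, never as identities
```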

A data clean room is the best known of many emerging privacy-enhancing technologies (PETs), whose shared goal is to maintain operational effectiveness while adhering to privacy policies. A clean room is therefore usually the best fit when overlapping datasets are needed but restrictions on that data are required.

Learn more in our blog: The Role of Data Clean Rooms in the World of Advertising

The arc of change

Let’s now explore how change is permeating our industry, and how it will likely continue to do so.

In 2020, the overwhelming theme was panic. Over time, that fear has been alleviated by technical solutions, but, as you can imagine, with seven significant and deeply complex trends in play, today’s overarching feeling in the industry is one of confusion. At this stage, virtually every marketing professional knows that the current state of privacy affects the industry. Unfortunately, the marketing executive needs to understand not only the conceptual and technical changes but also what to do about them.

Adoption of multiple sources of truth

While no measurement is perfect, the evolutionary winner of attribution over the last decade was last touch. This gave marketers an imperfect-but-precise methodology of measurement. And, where the marketer was more advanced, it allowed for a highly customized multi-touch methodology to be built. The deprecation of deterministic identifiers makes the former more difficult and the latter impossible. 

The advent of technologies to overcome these measurement restrictions means marketers are now faced with multiple sources of data, from multiple providers, that must be reconciled. Consider three coexisting sources: privacy-centric attribution from frameworks like SKAN, the subset of opted-in users who can still be tracked deterministically, and SAN self-reporting. The performance marketer must triangulate across all three to build a holistic view of performance.
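
Below is a deliberately simple sketch of what reconciling those three feeds can look like; the numbers, the assumed opt-in rate, and the naive averaging are illustrative assumptions, not a recommended blending model.

```python
# Deliberately simple reconciliation of three conversion feeds for one campaign.
# The example numbers, the assumed opt-in rate, and the naive averaging are
# illustrative assumptions, not a recommended blending model.
feeds = {
    "skan_aggregate": 1_150,      # privacy-centric, coarse and delayed
    "deterministic_opt_in": 310,  # precise, but only covers opted-in users
    "san_self_reported": 1_900,   # rich, but each platform grades its own homework
}

opt_in_rate = 0.25  # assumed share of users who can still be tracked deterministically

# One naive triangulation: scale the opted-in slice up to the full population,
# then compare it against the other two feeds and blend.
estimates = {
    "scaled deterministic": feeds["deterministic_opt_in"] / opt_in_rate,
    "skan aggregate": feeds["skan_aggregate"],
    "san self-reported": feeds["san_self_reported"],
}
blended = sum(estimates.values()) / len(estimates)

for name, value in estimates.items():
    print(f"{name:>22}: {value:,.0f}")
print(f"{'blended estimate':>22}: {blended:,.0f}")
```

In practice, how these feeds are weighted is exactly the judgment call teams must build expertise around.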

Exploration of alternative sources

Inevitably, when facing signal loss, marketers will explore alternative methodologies for measurement. 

On one end of the spectrum, we’ve seen this take the form of in-house teams building statistical models that refine other data sources, or are refined by them. One of the most common examples is running incrementality tests to help normalize multiple data sources. In a more direct approach, we’ve seen advertisers poll their end users at the point of sale to help refine and align attribution of marketing spend.
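
For a sense of the arithmetic behind an incrementality test, the sketch below compares conversion rates between an exposed group and a holdout using made-up numbers; real tests hinge on careful design (randomization, sufficient volume, matched geographies) that this omits.

```python
# Back-of-the-envelope incrementality arithmetic with made-up numbers.
# Real tests require careful design: randomization, power analysis, and
# enough volume to separate lift from noise.
test_users, test_conversions = 100_000, 2_300         # exposed to the campaign
holdout_users, holdout_conversions = 100_000, 1_800   # deliberately withheld

test_rate = test_conversions / test_users
holdout_rate = holdout_conversions / holdout_users

incremental_conversions = (test_rate - holdout_rate) * test_users
lift = (test_rate - holdout_rate) / holdout_rate

print(f"Incremental conversions: {incremental_conversions:,.0f}")
print(f"Relative lift: {lift:.1%}")
```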

An emerging trend to watch is mega retailers partnering with ad platforms through proprietary data connections for retargeting and measurement; these partnerships offer valuable insight into where the technology in this space may head next.

Building expertise

Many of these emerging trends require intellectual horsepower applied in-house, and many advertisers already have some in-house or agency-provided data analysis capability. However, industry leaders are considering how new measurement paradigms will shift existing budgetary planning and operational workflows. This often takes the form of an in-house data science team tasked with helping the marketing organization with budgeting and reporting. While this is an expensive and time-consuming undertaking, it represents an investment in a complex and uncertain world of marketing measurement.

What this means for your business

Unfortunately, the shift to privacy makes marketing measurement decidedly more difficult, and ignoring the problem won’t make it go away. To remain effective amid the changes, successful businesses will adapt so they can continue to understand where and how to market to their users.

The first step is to understand how the changes will affect your business. The key is to grasp how they will impact your measurement and marketing efficacy before they negatively impact your results.

The second step is planning how to make the appropriate changes with minimal impact on your business. No one buys privacy for privacy’s sake, but you do need to adhere to privacy rules while keeping your business running. This challenge isn’t limited to the marketing department; it spans the entire business. All across digital marketing, leaders are reporting to their companies that marketing measurement is getting less precise and that tracking return on ad spend (ROAS) will get worse, not better. Ideally, you’re having these conversations proactively, not explaining to the board after the fact why you can’t track ROAS against last year’s budget.

The third step, once you have a plan, is to remain flexible. Changes will continue to impact these emerging paradigms, so adaptation, experimentation, and testing will be key to gaining and retaining an edge in your marketing efforts.

And last, consider your resources. Companies like Branch put tremendously talented people to work on solving these problems for — and with — our customers. Lean on your vendors to understand what others are doing — and what you could be doing better.

The good news is that a paradigm shift serves as an opportunity for innovation and a catalyst for emerging leaders in digital marketing. How you adapt today will set your brand up for success — or failure — in the future. To the marketing executive asking, “Why can’t we just measure things accurately?” perhaps the best response is, “Neither can anyone else, so how do we use that to our advantage?”