Every time a new privacy bill drops, someone in ad tech writes a post about headwinds. I am going to write the other one. The privacy legislation environment in 2026 is the best operating environment an independent, methodology-first measurement company could ask for. Let me explain why.
Where the legislation actually stands
As of this writing, nineteen states have active comprehensive consumer privacy laws. California, Texas, Virginia, Colorado, Connecticut, Maryland, Minnesota, and a dozen others are all live. No two are identical. A patchwork of opt-out rights, data minimization requirements, and access obligations now applies to any company that touches personal data at scale.
At the federal level, the American Privacy Rights Act expired at the end of the 118th Congress without passing and has not been reintroduced as of early 2026. Two weeks ago, House Republicans introduced the SECURE Data Act — a comprehensive national framework that would establish data minimization requirements, consumer rights to access and delete, and opt-out rights for targeted advertising and data sales. Its explicit goal is to replace the state patchwork with a single federal standard. It is early in the legislative process and faces the same bipartisan friction that killed APRA, but the direction of travel is clear.
The consensus view is that this landscape creates compliance costs, restricts data flows, and narrows the targeting capabilities that ad-supported media has relied on for twenty years. That consensus is wrong for a specific class of company.
Why patchwork privacy law is a moat for methodology-first measurement
Every state privacy law that passes applies the same pressure to the same category of data: identifiable, device-level, individual-level behavioral records. The company that holds a list of device IDs matched to real-world movement patterns and sells it to brands has a compliance problem. The company that built a synthetic population model — a privacy-preserving statistical representation of the entire U.S. population, with no individual linkage — does not.
This distinction is not hypothetical. The Motionworks measurement stack does not rely on device-identifiable behavioral data sold to third parties. What we measure is population movement, modeled from aggregate signals and calibrated against independent ground truth. The output is a population-level estimate with a stated confidence interval — not a dossier on an individual. No CCPA deletion request touches our methodology. No state opt-out law narrows our coverage. No SECURE Data Act data minimization requirement applies to a model that was never built around individual tracking in the first place.
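The difference between an individual dossier and a population-level estimate can be made concrete. The sketch below is purely illustrative and is not the Motionworks stack: the signal values, the calibration constant, and the function names are my own invention. What it shows is the structural point — the pipeline's input is pre-aggregated tallies with no device identifiers anywhere, and its output is a point estimate with a bootstrap confidence interval.

```python
import random

# Hypothetical aggregate signals: visit tallies per day for one region.
# No device IDs, no individual records -- pre-aggregated counts only.
aggregate_counts = [1240, 1305, 1188, 1410, 1275, 1332, 1290]

# Calibration factor against independent ground truth (e.g. a census
# benchmark). Illustrative value, not a real coefficient.
CALIBRATION = 3.2

def population_estimate(counts, calibration, n_boot=10_000, seed=42):
    """Return (point estimate, 95% CI) via a bootstrap over daily tallies."""
    rng = random.Random(seed)
    point = calibration * sum(counts) / len(counts)
    boots = []
    for _ in range(n_boot):
        resample = [rng.choice(counts) for _ in counts]
        boots.append(calibration * sum(resample) / len(resample))
    boots.sort()
    lo = boots[int(0.025 * n_boot)]
    hi = boots[int(0.975 * n_boot)]
    return point, (lo, hi)

est, (lo, hi) = population_estimate(aggregate_counts, CALIBRATION)
print(f"estimated daily population: {est:.0f} (95% CI {lo:.0f}-{hi:.0f})")
```

Notice what a deletion request would have to target here: there is nothing to delete. The inputs were never individual records, so individual rights simply do not attach.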
While competitors are engineering around privacy constraints, we are not constrained. That is the moat.
The patchwork actually tightens the case for a neutral standard
Nineteen state laws with nineteen different frameworks is not a stable equilibrium. Brands planning national campaigns need a measurement currency that works in Texas and California and Minnesota simultaneously, with a consistent methodology that can be audited against the same standard in every market. A state-by-state patchwork of consent requirements and data restrictions makes that harder for any measurement provider that relies on raw behavioral data to produce its numbers.
It makes it easier for us. A synthetic population model is, by definition, a national model. It is not assembled from state-specific device panels. It does not have a California-flavored dataset and a Texas-flavored dataset that require separate compliance treatments. The same methodology, the same model version, the same confidence interval — everywhere.
When the SECURE Data Act or its successor eventually passes a national standard, it will define the floor. Companies built on methodology-first, population-level measurement will already be above it. Companies built on device-level tracking will be running to catch up.
Privacy pressure is accelerating the end of the impression as a measurement unit
This is the glass-half-full argument that I think gets missed in most privacy coverage. Opt-out requirements, data minimization, and consent frameworks all degrade the precision of individual-level targeting. When targeting precision drops, impression counts become less defensible as a proxy for outcomes.
This is good for measurement. The industry has to replace "I served you 10,000 impressions" with "here is what that audience actually looked like, here is the confidence interval on that estimate, here is where that audience went next." That is exactly the kind of measurement infrastructure we built. The privacy era does not shrink the market for rigorous audience measurement. It creates demand for a kind of measurement that privacy-preserving methodology is uniquely positioned to supply.
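To make the contrast with impression counting concrete, here is a toy sketch of what an outcome-style report looks like. The schema, field names, and numbers are all hypothetical, invented for illustration — the point is only that the unit of account is an estimated audience with an uncertainty bound and a downstream destination, not a raw served-impression tally.

```python
from dataclasses import dataclass

# Hypothetical report shape -- field names are illustrative, not a real schema.
@dataclass
class AudienceReport:
    estimated_audience: int        # modeled people reached, not impressions served
    ci_low: int                    # lower bound of the 95% confidence interval
    ci_high: int                   # upper bound of the 95% confidence interval
    downstream: dict               # where the audience went next, as shares

def summarize(r: AudienceReport) -> str:
    """Render the report as a single outcome-oriented sentence."""
    top = max(r.downstream, key=r.downstream.get)
    return (f"{r.estimated_audience:,} people reached "
            f"(95% CI {r.ci_low:,}-{r.ci_high:,}); "
            f"largest next destination: {top} ({r.downstream[top]:.0%})")

report = AudienceReport(
    estimated_audience=48_500,
    ci_low=44_200,
    ci_high=52_900,
    downstream={"retail corridor": 0.41, "transit hub": 0.33, "other": 0.26},
)
print(summarize(report))
```

An impression count answers "how many times did the server fire?" A report in this shape answers "how many people, with what certainty, doing what next?" — and only the second survives contact with a procurement team in a privacy-constrained market.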
The one thing that would be a genuine headwind
In the interest of not over-rotating: there is a version of federal privacy legislation that would genuinely concern me. If a national law treated aggregated, modeled population data with the same restrictions as identifiable individual behavioral data — if it applied individual opt-out rights to a synthetic population model that contains no individual records — that would be a classification error with real consequences.
I do not think that is where the legislation is heading. The SECURE Data Act, like APRA before it, is focused on personal data — data that identifies or is reasonably linkable to a specific individual. A population model built from synthetic persons does not fit that definition. But it is the argument worth watching. The quality of the methodology definitions in federal legislation will matter more than the headline restriction level.
The bottom line
Nineteen state privacy laws and a federal bill in play is a stress test. For measurement companies built on individual-level device data assembled without clear provenance, it is a real problem. For a company that built its model from the ground up on privacy-preserving population methodology, independently audited and openly documented — it is a differentiator.
We spent six years building the right way before it was obvious that the right way was also the defensible way. Privacy legislation is not a headwind. It is the market catching up to our architecture.
Sources and notes
The SECURE Data Act was introduced by House Republicans on April 22, 2026. Nineteen states had active comprehensive consumer privacy laws as of early 2026. APRA expired at the end of the 118th Congress in January 2025 and has not been reintroduced. The Motionworks methodology is publicly documented at www.mworks.com/methodology.