Rethinking marketing’s relationship with data

Years of optimization and personalization have left marketing chasing metrics over meaning. Clean data offers a way to rebuild trust.


Data was meant to make marketing smarter. Somewhere along the way, it made us forget what we were trying to understand in the first place.

The hollow machine 

For most of my career, I believed marketing was a bridge between creativity and commerce — between what brands make and what people truly want. I held senior roles in digital marketing, leading campaigns that reached millions. I had every tool: dashboards, KPIs, analytics and insight engines. I believed that if we measured, optimized and personalized enough, we could make marketing scientific.

But something was off. The dashboards looked impressive — full of graphs and metrics — but they were meaningless. We’d celebrate a 3% lift even when nothing really changed. The data didn’t connect to revenue, loyalty or human connection. Multimillion-dollar command centers ran on unverified or fabricated data. Campaigns on platforms like Facebook and Google relied on metrics that barely reflected real outcomes. We were guessing rather than knowing, pretending instead of understanding.

Over time, marketing shifted from asking why people feel something to how to make them feel something — usually urgency, envy or inadequacy. We called it engagement, but it was manipulation, systematized at scale. Inside the machine, departments competed for credit instead of alignment. Agencies chased awards instead of impact. We built campaigns to impress each other, not to serve the people we claimed to understand.

When I left corporate life to start my own consulting, I saw what my career had obscured: misaligned models, denial-driven cultures, products nobody needed and stories nobody believed. Beneath it all was the force no one wanted to name — dirty data — fueling fake precision and manufactured insight.

But the deeper problem was the corruption of incentives. And nowhere is that corruption clearer, or more dangerous, than in the dirty data economy, which sells the fantasy of control while quietly extracting everything you’re worth.

Dig deeper: Why the grift economy is killing both marketing and marketers

How the dirty data economy owns you

Have you ever downloaded one of those apps that promise to pay you for answering a few surveys? I recently tried one called Surveys On The Go, which, according to its homepage, is “the nation’s largest, highest-rated survey app.” The landing page is exactly what you’d expect: a collage of stock-photo smiles radiating the kind of generic happiness only a marketing department could buy.

At first glance, it seemed harmless — answer a few questions, earn a few dollars. But instead of just taking a survey, I ran the app’s Terms of Service and Privacy Policy through the Clean Data GPT I built to see what I was really agreeing to. What it found was staggering.

More than 25 conditions favor the company, including background data collection through geolocation and sharing it for behavioral analysis. Your opinions, habits and movements are monetized, giving you pocket change while creating valuable long-term behavioral profiles.

Here’s what the fine print you’ll never read — but should — really says:

  • Over 10,800 words of terms and 6,200 of privacy policy: Roughly 17,000 words in total, and you’re expected to absorb it all with a single tap.
  • “I agree” = Total surrender: You hand over survey answers, device ID, location, demographics, browsing history, app usage and even inferred behaviors.
  • A global, perpetual license: They can use, modify and sell your data forever without further approval.
  • Unnamed partners: Advertisers, data brokers and other third parties you’ve never heard of and can’t opt out of individually.
  • Rules can change anytime: Keep using the app and that counts as consent, even if you never saw the updates.
  • No real opt-out: Deleting the app doesn’t delete the data already collected or sold.
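The kind of audit described above can be roughly approximated with far cruder tools. The sketch below is a hypothetical keyword scan in Python, not the Clean Data GPT the author built; the red-flag patterns and the sample clause text are illustrative assumptions, loosely mirroring the clauses in the list above.

```python
import re

# Hypothetical red-flag patterns, loosely based on the clauses listed above.
# A real audit would need legal review; this is a simplified illustration.
RED_FLAGS = {
    "broad data collection": r"geolocation|location data|device id|browsing history",
    "no deletion on opt-out": r"retain|previously collected",
    "perpetual license": r"perpetual|irrevocab",
    "third-party sharing": r"third[- ]part(y|ies)|data broker|advertis",
    "unilateral changes": r"we may (change|modify|update) (these|this)",
}

def scan_policy(text: str) -> dict:
    """Return the word count and which red-flag categories the text matches."""
    lowered = text.lower()
    hits = {name for name, pattern in RED_FLAGS.items()
            if re.search(pattern, lowered)}
    return {"word_count": len(text.split()), "red_flags": sorted(hits)}

# Invented sample clauses for demonstration only.
sample = (
    "We may collect geolocation and device ID data and share it with "
    "third parties, including advertisers. You grant us a perpetual, "
    "worldwide license. We may modify these terms at any time, and we "
    "retain previously collected data after account deletion."
)
report = scan_policy(sample)
print(report["red_flags"])
```

Even a toy scanner like this surfaces every category above in a four-sentence sample, which is exactly the point: the red flags are not hidden, just buried under volume.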

This same system fuels Big Tech — Meta, Google, Amazon — and data brokers like Experian, Oracle and Acxiom. Their operating principle remains the same: overwhelm users with fine print, collect every possible data point and keep it indefinitely. Every retargeted banner and “recommended for you” prompt is industrialized intimacy.

  • Meta’s Terms of Service, for instance, give it the right to use your content and interactions to train its algorithms, target you with ads and infer your emotions even after you delete a post.
  • Google’s Privacy Policy allows it to combine your search history, location data and Gmail content to build a unified behavioral dossier that advertisers bid on in real time.
  • Amazon tracks not only your purchases but also your browsing hesitations (e.g., how long your cursor hovers over a product) to predict and influence buying decisions.
  • Data brokers quietly buy, blend and resell thousands of attributes about each person (income level, political leanings, relationships and health conditions) to anyone willing to pay. Most of this happens without consent, without awareness and without compensation.

Apps love to talk about rewards and user value, but it’s really data extraction dressed up as generosity. They want compliant users, not informed ones. That’s why consent is reduced to a single tap: “Accept.” Just like that, they own your clicks, location, moods and patterns. You’re nudged, pinged and gamified into submission. FOMO, scarcity timers and streaks are designed to hook users, not to help them.

Your data is quietly traded in a shadow economy you never opted into, fueling campaigns that feel more like surveillance than service. Personalization is just a polite word for profiling. People aren’t engaging because they feel understood. They’re reacting because they feel watched. These systems erode loyalty and breed fatigue, anxiety and the creeping sense that you’re not using the app — the app is using you.

Prioritizing transparency and fairness over extraction has never been more urgent. The current system is working exactly as designed: to collect, conceal and profit. Reversing that will take more than privacy patches or PR-friendly consent updates. It requires rethinking who owns data, who controls it and who benefits from it.

Dig deeper: Privacy is the new currency in digital marketing

Building the clean data economy

That realization led to the creation of the Marketing Accountability Council (MAC), meant to confront an industry that had lost its conscience. MAC was built to expose marketing’s illusions — the myth of data-driven integrity, the corruption of consent and the performance of ethics without substance.

Across a series of MAC meetings, the facade began to crack. We faced the truth that our strategies relied on compromised inputs: dirty data, fraudulent metrics and black-box attribution models. We had traded truth for performance theater, operating inside a system that rewards manipulation — where adtech pretends to predict and brands pretend to care. Surveillance was sold as personalization, impressions as impact, dashboards as truth.

That reckoning changed the question from “how do we do better?” to “what do we build instead?” Accountability wasn’t enough. To rebuild trust, we needed infrastructure. That idea led to the Clean Data Alliance (CDA), an effort to rebuild the digital economy on transparency, truth and human agency rather than exploitation.

Together with MAC, CDA is creating a framework where verified, permissioned data can outperform deceit — what we call data agency. It gives people real control over their information and the right to decide who uses it and how. The goal is simple: make truth more profitable than deceit.

Why can’t I change it from the inside?

Few trade associations in marketing are taking this stand. Most protect the status quo. Groups like the Association of National Advertisers (ANA), the Interactive Advertising Bureau (IAB) and parts of the American Association of Advertising Agencies (4As) continue to frame responsible data use and brand safety as reform, while lobbying to preserve the same extractive practices that eroded trust.

In 2024, the ANA joined the Privacy for America coalition, a lobbying group funded by adtech intermediaries and data brokers. Its stated mission — to “create balanced national privacy legislation” — sounds reasonable, but the fine print tells a different story. The coalition opposes stronger regulation of data brokers and works to ensure companies can keep collecting and monetizing personal data without requiring direct consumer consent. Its stated position is that “responsible data-driven marketing benefits consumers and fuels the economy.”

The ANA’s own 2023 “Programmatic Media Supply Chain Transparency Study” found roughly 23% of digital ad spend — about $22 billion a year — disappears into opaque fees, fraud and unverifiable transactions. Instead of demanding accountability, the ANA called for “greater collaboration within the existing ecosystem,” sidestepping reform or regulation.

The IAB, representing adtech, has taken a similar stance. It opposed Apple’s App Tracking Transparency framework, claiming it would “harm small businesses,” though the change simply required permission before tracking users. Its CEO even described privacy advocates as “extremists” who “threaten the open internet.” That’s how resistant the industry remains to the idea of consent.

This is how the system protects itself. The incumbents aren’t neutral. They’re gatekeepers of a trillion-dollar surveillance economy who rely on our collective ignorance to keep the engine running. Their privacy frameworks are self-policing, their audits are selective and their alliances are designed to look ethical while keeping the money and the data flowing through the same hidden pipes.

That’s why the CDA had to be built outside the system. It wasn’t created to polish a broken model or add another “best practices” checklist. Its purpose is to replace extraction with empowerment — to prove that verified, permissioned and anonymous data — clean data — can outperform deceit at every level.

A new economy built on trust

Every marketer faces a choice: keep playing the game — chasing dirty data, hollow KPIs and fleeting clicks — or help build something better: markets grounded in transparency, consent and truth.

For me, there’s no looking back. Once you understand that trust is the only growth metric that truly matters, you stop chasing numbers and start building an economy rooted in clean data, fairness and human dignity.

Dig deeper: How to build customer trust through data transparency

Contributing authors are invited to create content for MarTech and are chosen for their expertise and contribution to the martech community. Our contributors work under the oversight of the editorial staff and contributions are checked for quality and relevance to our readers. MarTech is owned by Semrush. Contributor was not asked to make any direct or indirect mentions of Semrush. The opinions they express are their own.


About the author

Jay Mandel
Contributor
Jay Mandel is a multi-faceted entrepreneur, professor, consultant, coach, and author of "Brand Strategy in Three Steps" (Kogan Page, 2023). His transformative journey from corporate America to coaching reflects his commitment to infusing meaning and authenticity into the business world. With two decades of corporate experience, including a notable role as the former social media and content lead for Mastercard's global team, Jay's brand methodology is honed through a diverse range of corporate, entrepreneurial, and academic experiences. Armed with a Master's in strategic communications from Columbia University, Jay is dedicated to guiding individuals and the companies they work for in pursuing clarity, strategy, and finding their unique market niche.