The Monopoly Report
Episode 68: How to Build a Privacy Program with Charm & Gravitas
Host: Alan Chapell
Guest: Sheila Colclasure (Red Barn Strategy; former Chief Privacy Officer at Acxiom and IPG)
Date: March 11, 2026
Episode Overview
This episode dives into the nuances of building effective privacy programs in high-scrutiny data environments, particularly within large, federated organizations. Host Alan Chapell speaks with Sheila Colclasure, a pioneer in data ethics and privacy, about her experiences at Acxiom and IPG, the true narrative around data brokers, the evolving challenges of AI and data governance, and the concept of "March Fairness," a framework for thinking about privacy that centers on transparency and agreement between data subjects and controllers.
Key Discussion Points & Insights
1. Sheila's Privacy Journey & Early Learnings
- Origin Story
- Sheila “stumbled” into privacy in 1997, moving from a policy role in Washington, D.C. to become Chief Privacy Officer for the Americas at Acxiom.
- Her charter: turn privacy principles into practical governance and create a privacy-forward culture.
- Learned from Jennifer Barrett Glasgow, credited as the world’s first CPO (appointed 1991).
- Early Practice at Acxiom
- Emphasized thorough, hands-on diligence:
"If you don't go look and see what's going on, you cannot be certain." (06:58, Sheila)
- Example: personally reviewed 42 data sources to determine their suitability—found only 7 permissible; later, a third of sources in overall datasets were "black market or impermissible" and cleansed.
- Documentation & FTC Validation
- Rigorously documented practices, proving their legitimacy when the FTC investigated.
"If you don't have it written down, it doesn't count. It didn't happen." (08:57, Sheila)
- A clean bill of health led to public FTC praise labeling Acxiom a “white hat” data provider.
"All except one—Acxiom is a white hat." (10:12, recapping Commissioner Brill’s speech)
2. Building Privacy Programs in Large, Complex Organizations
- Transition to IPG ('Agency Holdco' World)
- A striking challenge: moving from a single-company culture (Acxiom) to IPG, with its 80-90 semi-autonomous agencies.
- Needed a strategy to unite sprawling, varying agency cultures.
- Top-Down & Bottom-Up Approach
- Coined “digital responsibility” instead of “privacy”—broader, more engaging.
- Initiatives included distributed governance, evangelism, culture-building with rewards/games, and the Words Matter campaign to encourage language reflecting respect and responsibility.
"Win the hearts, and the minds will follow." (13:22, Sheila)
- The core principle: never just say “no”; always say “yes, and here’s how to do it legally and ethically.”
- On Influence:
"You have a certain gravitas, and you really also have a certain charm. And both of those things are really important if you want to move people..." (16:44, Alan)
"We're all people. ...think about the person behind the data or the person that is going to be impacted by the algorithm." (17:02, Sheila)
3. The Data Broker Narrative: Fairness, Misperceptions, and Scrutiny
- Evolving Regulatory Focus
- The term “data broker” is pejorative—doesn’t fit most actors.
"The term data broker is pejorative and I don't like it. And it's not fair." (19:13, Sheila)
- Acxiom and similar companies didn’t collect dossiers, but solved specific business problems using carefully sourced data.
- Why the Scrutiny?
- Brokers don’t have one-to-one relationships with individuals—their own clients are businesses, making them an easy policy target.
- Example: a misleading newsletter created a consumer panic by suggesting opting out of Acxiom would prevent all identity theft/fraud, when in reality Acxiom data was used for minor marketing segmentation, not sensitive activities.
"If you want to eliminate all identity theft and fraud in one motion, opt out of Acxiom." (22:15, paraphrasing newsletter claim)
- Regulatory Nuance Lacking
- In California, only ~10% of registered brokers process data sensitive enough to trigger breach notifications, yet 100% are tarred with the same brush.
"The other 90% are just routinely being tarred with that same brush." (24:26, Alan)
- Sheila: controls, documentation, and rigorous practices matter to clients, but to "lawmakers and the activist community, it didn't matter so much." (24:48, Sheila)
4. March Fairness: Elevating the Privacy Conversation
- Origin Story
- "March Fairness" was coined as a direct parallel to "March Madness," encouraging transparency and fair exchange in privacy, inspired by a childhood lesson in splitting chewing gum:
"Both people have to agree what is fair. It has to be fully transparent and both have to agree." (28:29, Sheila)
- Broader Principle
- Privacy discussions should be about balance and genuine choice, not just legal compliance or checkbox consent.
"The fairness test is: does the person to whom the data relates, and whom the use affects, agree this is a fair use of their data?" (17:02, Sheila)
5. AI, Data, and the Next Frontier of Privacy Risks
- Is AI a New Risk?
"Both." (31:18, Sheila, when asked if AI poses new or just faster/bigger risks)
- AI amplifies old risks but also introduces novel ones, especially with scaling and the coming of quantum computing (“Q Day”).
- Quantum Computing and Security
- Q Day: when quantum computers can break standard encryption, every data controller will need quantum-resistant security.
"...when Quantum comes online and there is no encryption software... Unless you have Quantum on your side, you will not be able to protect or govern [your data]." (32:25, Sheila)
- Q Day: when quantum computers can break standard encryption, every data controller will need quantum-resistant security.
- Risk for Ad Tech and Data Governance
- Many in ad tech delude themselves that AI-driven contextual ads don’t process personal data—Sheila warns otherwise, especially when inferences are drawn.
- Cites a specific case where her team wanted to market health data inferences:
"...I pulled everyone's erectile dysfunction score for the men and vaginal itch score for the women. ...I'll read them to the room. And the executive team said, 'Stop. That's okay. Your point is well made.'" (35:13, Sheila)
- Advocacy for Limiting Sensitive Data Use
- Both Alan and Sheila agree: the ad industry should eschew sensitive health, precise location, and similar data.
- Sheila: Yet acknowledges nuance for rare diseases or health searches that may benefit from thoughtful, targeted advertising.
6. U.S. Perspectives on Data Collection
- Government vs. Private Sector:
- Americans tend to scrutinize government data collection far more than private companies’, while Europeans may be the reverse.
"Government collecting data about me. Consequential. Now, AI changes all of that." (38:41, Sheila)
- AI-Driven Transformation: Underestimated Risks
- AI is moving beyond current comprehension: systems are already inventing languages to communicate with one another and making decisions humans cannot scrutinize.
7. Advice for New Privacy Professionals
- Sheila’s Tips
- Align everything with the enterprise strategy—understand business risk appetite and adapt privacy programs accordingly.
"Whatever your practice area... had to map to the enterprise strategy. Because businesses are in business to be in business and to make money." (40:58, Sheila)
- Consider a “risk maturity model” for your organization: what level of risk is your company willing to accept?
- Alan’s Additions
- Don't assume leaders always know what they’re talking about; seek out true expertise and build mentorships.
Notable Quotes & Memorable Moments
- "If you don't have it written down, it doesn't count. It didn't happen." – Sheila Colclasure (08:57)
- "The term 'data broker' is pejorative and I don't like it. And it's not fair." – Sheila Colclasure (19:13)
- "Both people have to agree what is fair. It has to be fully transparent and both have to agree." – Sheila Colclasure (28:29, on 'March Fairness')
- "AI is both a new category of risk and the same old data problems moving faster and at greater scale." – Paraphrased/recapped from Sheila (31:15)
- "We already have examples of the AI engines inventing language to talk to themselves and computing answers and outcomes that we don't understand." – Sheila Colclasure (38:41)
Timestamps for Important Segments
- [06:58] — Lessons from Acxiom: diligence and documentation
- [13:22] — Building cross-agency privacy programs ("herding cats" at IPG)
- [17:02] — Importance of gravitas, charm, and data ethics in influencing privacy culture
- [19:13] — The “data broker” label: history, myth versus reality
- [22:15] — Consumer misperception: the newsletter event and its fallout
- [24:48] — Data broker registry, sensitive data, and fairness in scrutiny
- [28:29] — Story behind March Fairness: childhood gum parable
- [31:15 - 33:55] — AI risk: new versus old, quantum threat (Q Day)
- [35:13] — Inferences as data: real-world product veto on sensitive data scores
- [40:58] — Sheila's career advice for privacy/governance professionals
Closing Information
Find Sheila:
- Personal: sheilacalclasure@gmail.com
- Professional: Red Barn Strategy
Summary Takeaways
- True privacy programs require detailed, consistent documentation and a culture built on trust, transparency, and continuous verification.
- “Data broker” is an oversimplified, often unfair label that misses the complex reality and positive competitive role many such companies play.
- AI presents both new and amplified old risks—regulatory models must adapt accordingly, including for looming threats like quantum computing (“Q Day”).
- The standard for fair data use should be mutual agreement and transparency, not just legalistic checkboxes.
- Privacy teams must be pragmatic, aligning with the enterprise’s business strategy and risk appetite to be effective.
- Ad tech must move beyond compliance minimalism toward truly ethical use of data, with clear boundaries around sensitive categories.
This summary was created to provide a comprehensive yet accessible overview of the discussion, highlighting the most critical ideas, illustrative moments, and actionable advice for privacy professionals and industry observers.
