Podcast Summary: "What the Hack?" Episode 240 – Who’s Watching the Watchers?
Host: Beau Friedlander
Guest: Vermont Attorney General Charity Clark
Date: February 24, 2026
Episode Overview
In this episode, "What the Hack?" explores the intersection of surveillance, privacy, and the expanding influence of data collection—both governmental and corporate. Joined by Vermont Attorney General Charity Clark, the discussion unpacks not just the technicalities but the core values, risks, and responsibilities that underpin privacy in 2026. The conversation highlights the normalization of privacy violations, the rapid advancement of AI, the ethics surrounding cryptocurrency, and the particular vulnerabilities of children within current educational tech systems.
Key Discussion Points & Insights
1. Why Privacy Matters—Philosophical and Practical Dimensions
[03:25]
- Philosophical: Do people have an inherent right to privacy and freedom from surveillance?
- Practical: Personal data exposure increases vulnerability to identity theft, scams, and extortion.
“Do we believe that each of us has a right to privacy?...That I think is a part of America.”
— Charity Clark ([03:28])
2. Government, Big Data, and Ethical Dilemmas
[04:34]
- Discussion of the Trump administration’s casual approach to data privacy, illustrated by the appointment of under-vetted individuals to oversee sensitive information within government initiatives such as DOGE.
- The blurred lines between legal permissibility and ethical corruption, especially regarding cryptocurrency and personal benefit for officials.
“You're gonna let this person have access to all of our highly sensitive data? It did not instill confidence...”
— Charity Clark ([05:24])
- Concerns over cryptocurrencies—most frequently used for speculation, scams, and dark web activities.
“There's a reason why cryptocurrency is most used right now … for three things. So speculators … scams … [and] buying stuff on the dark Web.”
— Charity Clark ([08:22])
3. The Swagger Problem: Tech Elites & Accountability
[06:32]
- Problematic mindset among tech and political elites: “The rules don’t apply to me because I’m so great.”
- Similar unchecked confidence surfaces in cryptocurrency and AI sectors, dismissing concerns as ignorance rather than legitimate skepticism.
“…that's not a me thing. That's a you thing… the other one is artificial intelligence.”
— Charity Clark ([06:55])
4. Rapid Tech Outpacing Legislation—AI as a Case Study
[11:58], [12:42]
- AI's rapid advancement creates risks that far outstrip the pace of legal and ethical guardrails.
- Charity Clark advocates for both proactive state legislation (e.g., anti-revenge porn laws incorporating deepfakes) and increased lawmaker education on AI applications and risks.
"It's honestly such a mess... corporations ... are willing to take risks and violate laws because they gotta get there."
— Charity Clark ([12:42])
- Europe is seen as a leader in tech regulation, while the US scrambles to catch up at both the state and federal levels.
5. Framing Data Abuse: From Damage to Values
[16:40], [17:03]
- Host suggests focusing regulation on cases where individuals are harmed.
- Clark suggests starting at the root: embracing a general right to privacy and viewing privacy as foundational, not just as a response to individual harm.
“What if ... we start at the beginning, the philosophy. We believe in privacy, believe people have a right to privacy.”
— Charity Clark ([17:03])
6. The Perils of AI—From Science Fiction to Daily Risks
[20:08], [21:35]
- Discussion of art and science fiction as tools to help the public imagine and anticipate AI dangers (e.g., romance scams, privacy intrusions).
- Economic interests—whether national or corporate—frequently override caution.
- Charity forecasts a surge in romance scams powered by AI chatbots, warning of vast, hard-to-detect exploitation.
“The romance scam using AI chatbots is a deadly combination, and it is coming.”
— Charity Clark ([21:38])
7. Everyday Privacy Violations—The “None of My Business” Principle
[24:31], [26:36], [27:18]
- Social scrutiny and viral exposure (e.g., viral Jumbotron mishaps) are contextualized as privacy invasions.
- Clark humorously applies Vermont’s motto (“mind your own business”) to modern privacy: what consenting adults do (or even their embarrassing collections) should remain private.
“That’s why, to me, it starts with, kind of, the philosophy... 'whereas mind your own business.'”
— Charity Clark ([24:31])
8. Student Data and the “Privacy Before Consent” Problem
[29:29], [32:54]
- Children’s data is now collected through edtech applications from the earliest ages, often without informed family consent.
- Parents are urged to be proactive: request privacy policies from schools, consider opting out, and know their legal rights (like refusing to provide Social Security numbers without need).
“It’s okay to ask for… the data privacy statements on various apps the school is using.”
— Charity Clark ([29:47])
“The violation of privacy has been so normalized that people think they're being like a Karen if they say, no, I don't want that information shared about my kid...”
— Charity Clark ([32:54])
9. The Normalization of Surveillance—From “Creepy” to Expected
[34:17], [35:30]
- Personal anecdotes about Gmail reading emails to target ads, once “creepy,” are now normalized and expected.
- Americans are currently negotiating where to draw the line on acceptable surveillance and manipulation.
“Now it’s just what we expect. And in fact, it’s what we sign up for…”
— Charity Clark ([34:44])
10. State Privacy Legislation—Obstacles and Progress in Vermont
[36:36], [39:09]
- Vermont’s attempts to pass comprehensive privacy laws have been stymied by tech lobbying, legislative knowledge gaps, and political resistance.
- Republican and Democratic lawmakers in Vermont share these concerns, debunking partisan stereotypes about privacy.
"...big tech has spent a lot of money trying to kill privacy bills..."
— Charity Clark ([37:56])
- Clark’s legislative wish list: comprehensive biometric data protections and meaningful AI regulations inspired by other states and Europe.
“I would love to see specific protections to protect our biometric data…In addition, I’m ready for movement on AI.”
— Charity Clark ([39:09])
11. Media, Social Media, and the Participatory Loss of Privacy
[41:20]
- Social media cannot simply be legislated away because it is a news source for many, but society must reckon with how individuals themselves facilitate privacy erosion.
“We are participants in the destruction, removal, risk of our own privacy.”
— Charity Clark ([41:47])
- Emphasis on empowering people with awareness and control over their own data.
Notable Quotes & Memorable Moments
- “We allow private companies to do things we would never tolerate our government doing.” ([01:17])
- “The violation of privacy has been so normalized that people think they're being like a Karen if they say, no, I don't want that information shared about my kid.” ([32:54])
- “The romance scam using AI chatbots is a deadly combination, and it is coming… Extremely easy to imagine.” ([21:38])
- “How can you be free if you’re being surveilled?” ([41:47])
- “If you want to reach [Vermont] community…[Facebook is] where people got the news.” ([41:47])
Important Timestamps
- 03:25 - Why everyone should care about privacy
- 05:03-06:43 - Why government trust breaks down over data access and crypto
- 11:58-12:42 - Tech laws lagging behind AI and corporate risk-taking
- 16:40-17:03 - Framing AI legislation: damages vs. philosophical rights
- 21:38 - The imminent danger of AI-driven romance scams
- 24:31-27:18 - Everyday privacy, social shaming, and Vermont’s “mind your own business” ethos
- 29:29-32:54 - Children, edtech, and parent activism
- 34:17-35:30 - Normalization of surveillance in daily life
- 36:36-39:09 - Status and obstacles of Vermont privacy legislation
- 41:47 - Participatory aspect of privacy loss in the digital age
Paranoid Takeaway – The Tinfoil Swan [43:37]
- Parents have genuine agency to protect their children’s data: request paper communications (notes sent home in backpacks) rather than participating in student data apps.
- Read privacy policies before enrolling children in apps; don’t feel pressured to comply with every digital request.
Tone & Language
This is an incisive yet approachable discussion, balancing rigorous legal insights with relatable stories and humor. The speakers urge listeners to reconsider the normalization of privacy violations, push back against digital fatalism, and advocate for both collective and individual action—especially in defending the privacy of the next generation. The episode is laced with wry observations and practical takeaways, making a complex and often bleak subject both accessible and actionable.
