A
Hey everyone. We are working on an upcoming series that dives deep into surveillance technology with some really incredible guests, including Ben Jordan and some other surprises. You're gonna love it. It's a story that looks at surveillance, privacy, and what happens when the technology sold to us to keep us safe starts changing how we live. But before we get there, we wanted to bring back a conversation with Vermont Attorney General Charity Clark that helped set the stage. Because when we talk about surveillance, privacy, data tracking, or the kinds of tools you'll hear about in the upcoming episodes, the real question isn't just what the tech can do, or how it's built or not built well. It's what we believe about privacy, about power, and about who's watching. In this so-called post-privacy, who-cares-who-has-this-or-that-information-about-us world, there are some fierce holdouts.
B
The violation of privacy has been so normalized that people think they're being like a Karen if they say, no, I don't want that information shared about my kid.
A
This week we're talking to one of them, Vermont Attorney General Charity Clark, who's taking on big data, among other things.
B
We allow private companies to do things we would never tolerate our government doing.
A
In a wide ranging conversation that goes there.
B
I could literally be living my life, making a sandwich, doing whatever I want. But my, my evil AI bots could be romancing victims by the thousands. Extremely easy to imagine that happening.
A
If you lose sleep over shady crypto deals, AI overreach, and the future of privacy in America, you might want to skip this episode. I'm Beau Friedlander, and this is What the Hack, the show that asks: in a world where your data is everywhere, how do you stay safe online? Charity Clark, welcome to the podcast.
B
Thanks for having me.
A
Charity Clark is Vermont's Attorney General and she is a friend of the pod and she's also doing amazing work on behalf of Vermont and through that work, the whole country. Welcome, Charity.
B
It's so nice to be here.
A
It's so nice to have you here. It's comforting because I have a lot of questions and I of course follow what you're doing up there in Vermont pretty closely. And if you want to, you can too, on Instagram and Twitter and other social platforms where you can stay up to date. When we talk about data privacy, it can sound a little abstract, like it's something only tech people or lawyers need to worry about. But the truth is your personal data is everywhere. Your health records, your location, your face, your habits. Someone is collecting all of it, often without your permission. And sometimes it's the government, right? Sometimes it's a scammer. Sometimes it's a giant tech company selling you shampoo based on an email you wrote to your cousin. So why does it matter? Why should the average person, someone who isn't doing anything wrong, still care about what happens with their data?
B
I think there are a lot of reasons, but I'm going to focus on two big ones. The first is philosophical: do we believe that each of us has a right to privacy? Do we believe that we should be free from surveillance, free from intrusion on our financial data, our health data, all of that kind of ethos that I think is a part of America? That's the first bucket. The second bucket is more practical. As a consumer attorney, it's easy for me to talk about that, too. And that is that our data makes us vulnerable to theft and scams. In a nutshell, protecting our data protects us from identity theft, from scams, from extortion. Data privacy matters in today's world because our data is so easily everywhere, and it is so easy, when you take someone's data, to use it for nefarious ends. So from a practical perspective, data privacy is also really important.
A
So I want to dig in here, because this year, 2025, we've seen the Trump administration access some of the most sensitive data the government has on us in ways I haven't seen before. While DOGE has been framed as a way to stop or slow government spending and get it under control, with an eye toward easing bureaucratic overreach, it has the look and feel of a big tech smash and grab. Talk about that.
B
Well, generally speaking, so far the Trump administration has not proven itself to care a lot about our privacy and our data. There is one little fact in one of the very early lawsuits that we filed, which was about DOGE installing workers in positions they were not qualified for, or installing Elon Musk as the head of DOGE without going through the appropriate process. And the one fact that literally lives rent-free in my head is that one worker was not even Googled. And then, come to find out, his very public Twitter account or something had all of these super offensive things in it that would disqualify him from holding a position like the one he had been hired for. And I just think, the average person knows you don't even go on a date with someone until you Google them. But you're gonna let this person have access to all of our highly sensitive data? It did not instill confidence in the administration, which, let us not forget, had already been in place for four years previously. They should be the experts on this stuff, and they didn't even Google. And that's just the beginning. From there, we've repeatedly seen a disregard for the privacy of Americans.
A
This idea of swagger. This idea of, yes, there are rules, but they don't apply to me, because if you understood how great I am, you would understand that I'm not going to do anything wrong.
B
Oh, the swagger. Some of the topics that I hope we touch on today feel like cousins of this same problem. One is cryptocurrency: you don't get how amazing cryptocurrency is because you're not as brilliant as I am. Instead of, I'm just not piecing it together, and that's not a me thing, that's a you thing. And the other one is artificial intelligence. When it comes to artificial intelligence and the risk of our data privacy being used against us, being used for nefarious ends, or just generally AI being used to effect scams at a massive scale and destabilize the entire economy, people don't want to hear it. They want to think, well, you just don't get it because you don't understand artificial intelligence. So, same kind of swagger in that arena as well.
A
Okay, so you're AG of Vermont, and I will therefore just ask you point blank: legal or not legal, to have your own cryptocurrency that people around the world can buy, and that directly benefits the President of the United States?
B
Without being an expert on federal ethics laws, I think anyone can see the potential for corruption in that scheme. And the cryptocurrency, the meme coins, all of these seem very problematic to me. I mean, there's a reason why cryptocurrency is most used right now, as far as I can tell, for three things. One, speculators trying to get in early so they can sell later. Number two, scams: probably over half of the top 10 scams where people really got scammed out of large quantities of money involve cryptocurrency. And the third is buying stuff on the dark web. That's where cryptocurrency comes into view. I don't know what else people are using their cryptocurrency for. So I just think, how are we even here, with the President being such a champion of cryptocurrency, and then he himself and his family benefiting from cryptocurrency? It really boggles the mind.
A
Wait, but are you talking about the same person who started Trump University? Which was... I think you are.
B
Don't forget Trump Steaks.
A
Trump Steaks. You know, there is a skill to selling the sizzle with no protein. Now, I'm sorry, was this supposed to be nonpartisan? I'm doing a wonderful job so far. The crypto thing aside, obviously what I'm hearing from you is that it's more of an ethics issue than it is an emoluments issue. In other words, I understand that all of you folks who have the title Attorney General, at least on the Democratic side of things, are friends; you know each other. So is there any talk about, you know, telling the administration that it's not okay in a legal arena?
B
Well, I'll let you in on our process. We meet very regularly. You're right, all the AGs who are Democrats, at least, are friends. I mean, I'm friends with some of the Republicans, too; some of them are great. But in addition to that, our staffs are all working together and in conversation with each other. So they are looking at all kinds of issues, whether they arise through executive orders or other problems that have come up this administration so far, which have been, unfortunately, numerous. And we're analyzing largely whether federal law has been violated or the Constitution has been violated, and whether our states have standing. Standing, as a refresher, means that you are a party who can bring a lawsuit. So if someone ran over my foot, Beau, you probably would not be able to sue on that issue. It would be me; I would be the one who would have standing. Your state has to be someone with standing. And especially when it comes to the Constitution: I literally swore an oath to uphold the Constitution of the United States of America. It's right there in the oath, and I am going to do it every time. It's taken up a lot of my time these past six months. That's what we're looking for when we're deciding whether we should join a lawsuit.
A
An unfettered marketplace of entrepreneurs doesn't seem like a great match for the ethical complexities we face when it comes to artificial intelligence, among other things. We're going to take a quick break, and when we come back, we're going to talk about what states like Vermont can do when tech moves faster than the law.
A
The process you described, the legal groundwork, the constitutional lens, all of that feels especially important right now, because some of the biggest threats we're facing aren't just political, they're structural. You mentioned earlier that AI is one of the issues that keeps you up at night. Me too. The technology is moving faster than the law. At one point this year, the budget bill included a 10-year ban on state-level AI regulation. It didn't survive, but the fact that it was even proposed says a lot. So in this legal vacuum, I think you could call it that, what should states be doing?
B
It's honestly such a mess. We have such a complicated moment when it comes to AI. We have corporations racing to be the company that owns the space, the leader, and they're willing to take risks and violate laws because they've gotta get there. And then we have Europe, always a leader when it comes to data privacy and technology because of its historical and political experience; I think a lot can be learned from what's going to be happening in Europe with the AI Act. And certain states are leaders as well and have information that is kind of unique to their arena. So I'll give you an example. Obviously in Vermont, we have an attorney general, me, who was a consumer lawyer in our consumer unit earlier in my career. And so I have a very strong interest in data privacy and will always be proposing data privacy bills and ideas to the legislature. One of those was to ensure that our revenge porn statute included AI, so that if someone made deepfake pornography with some old girlfriend's face on it, that would be covered by our revenge porn law. Stuff like that. But the challenge with AI is that it can be found in so many different arenas. A comprehensive bill is almost hard to imagine because we're still figuring out how AI can be applied. So ideally we would have a federal law, and ideally it would be comprehensive, the way I think Europe is leading the way on. And then we have laws that were passed a long time ago; I think there have been visionaries in Illinois when it comes to biometric data. I'm always very focused on biometric data and, when it comes to AI, how biometric data can be used for such meaningful and impactful and devastating ends, deepfake pornography being top of mind in that regard. But who knows what's on the horizon when it comes to things like DNA, eye scans, all of it.
It's really hard to know where to begin and where to end when it comes to AI. We just have to be open to the moment and also make sure that our legislators are educated. I think if I were, you know, if my whole job was focused on policy related to AI, my first starting point would be to make sure that legislators in state legislatures were educated about AI, about what it means, how it works, what are the applications and what are the concerns and pitfalls that have been identified, you know, in other countries and other states by thought leaders or even like in science fiction. I mean, sometimes it's easier to understand in a science fiction context what the concerns might be than in a sort of boring application in the real world,
A
You know, because it's so expansive and hard to predict all the places in which AI can be implemented, because it's quite a large swath of our reality right now. How about focusing on impact and saying that we need to start thinking about guardrails when the technology places a person in danger, when the technology creates a peril for a person, place, or thing? What do you call that in legal terms, when there are damages, when someone's run over your foot?
B
Damages. Damages. It's literally the word.
A
So I've been around, you know. I've been around; I know words. So with AI, let's not focus on what it can do, but on who it hurts. And if we do that, is that an acceptable way to start framing some guardrails?
B
Maybe I can posit another approach. What if, instead of starting at the end, the damages, we start at the beginning: the philosophy. We believe in privacy; we believe people have a right to privacy. Part of the challenge with that is we have a moment in time when we have major Supreme Court decisions that were based on a right to privacy, and they could have been based on equal protection rights, but instead they were based on a right to privacy: those involving sex, those involving gay marriage and abortion. We have this idea of privacy that's so important, but we also have, unfortunately, a conservative wing of the Republican Party that is willing to sacrifice privacy for its social ideas. And I wonder if that's connected. So we have that problem. The other problem with the philosophy is, I feel like in capitalist America, the idea of economic opportunity, the ability to make money, et cetera, trumps concerns about privacy. We can't go slow just because you want your privacy. We have to own the field, because if we don't, China will, or whatever the threat is.
A
Yeah, the move fast and break things mentality. Yes, I think that is true. There's a Venn diagram: one circle says we should do everything right, and there is a right and a wrong way to do it. And then there's the other circle that says we need to win. And the need-to-win circle is like a Death Star in front of a tiny, eensy-weensy little moon of good intentions. So we're in that situation, right, where this big Death Star of aspirational world dominance in the field of artificial intelligence is just beating out all considerations of caution. And so you have everything from people predicting a doomsday caused by an AI that decides it doesn't need us anymore, a la the Matrix, to an AI that simply treats us all as fodder for getting better at what it does. And it worries me that our data could be used to train something like that. And that is the Hollywood science fiction version of it. I'm not saying Elon Musk did anything wrong. I'm not saying Donald Trump did anything wrong. I don't know; I wasn't there. What I do know is that our data, all of it, ends up being sucked into this machine that is learning everything we know and learning how to tie it to what other people know to create a different version of what is known.
B
The other thing that's so great about Hollywood science fiction, and reading science fiction, is that one of the problems with AI is a lack of imagination among us, and artists are able to imagine these crazy scenarios. It opens up a creative portion of our own minds to say, wait, could this go wrong? Maybe this isn't such a great thing. And I think one of the things we should be looking at is: what is the purpose of AI? Is it to make our lives easier? Because we've been hearing for generations that the new technology is going to make our lives easier. But look around. Don't you think people are working longer hours and harder than they ever have? Are they making more money for that work per hour? Technology has not made our lives easier. It's made capitalism better. It's helped very rich people, mostly men, become richer. Right. So I think that's another question: what is the good? I don't mean to diminish the idea that technology, if we own the field, gives us an edge, and that has a national security dimension. I think that's a very valid perspective. If we don't occupy the field, China will, and China is, you know, not our friend. Right.
A
Yeah.
B
I don't mean to diminish that. But I think there are some niche concerns of mine that, from my experience and professional life, seem so obvious to me, and people aren't talking about them. I'm going to highlight one. I see this coming a million miles away, and I hope that five years from now we're not pulling up this podcast and saying, Charity, such a Cassandra, she warned everybody. But the romance scam using AI chatbots is a deadly combination, and it is coming. The idea that I could literally be living my life, making a sandwich, doing whatever I want, but my evil AI bots could be romancing victims by the thousands? Extremely easy to imagine that happening.
A
Yes, it's true. And it's been happening.
B
It has been happening.
A
But, you know, on the romance scam front, we actually interviewed one of the guys, Charity, whose face has been used in lots and lots of scams.
B
Like a hot man who's been used in romance scams.
A
Yeah. And he's just all over the place.
B
That's just awful. Oh, gosh, talk about a violation of privacy.
A
Exactly. Right. We live in a world now where you can go to a Coldplay concert and just be minding your own business and pop up on the Jumbotron. Not saying that you're pure as the driven snow. You may be having an affair, but did you expect to be placed on the Jumbotron and have it go public like that? Like, oh, by the way, we're having an affair, to the entire world.
B
I know.
A
And so here's the question I have: that's a great example of the question of standing. That was the CEO of a company and the director of that company's HR on a date, both of them married to other people, both of their lives torn apart because their pictures were placed on a screen. Somebody on a social media platform reposted it, and other people used facial recognition AI to identify those people, and used data that was online that said where they lived, who they were married to, where they worked. Talk to me about what we can do on the legal front to protect the right of American citizens who want to have an affair without being discovered through social media in front of the entire universe.
B
This is a great example because the stakes feel very low; this isn't the end of the world. It's just two individuals. This, to me, is filed squarely in the none-of-my-business department. This comes up a lot when I talk about data privacy, but I really believe it. The Vermont motto is Freedom and Unity, and my joke is that that's translated into the modern world as love your neighbor and mind your own business. So this is in the mind-your-own-business department. It's none of your business what these two do. Unless it's your husband or your parent or whatever, and even then, if they're an adult, it's none of your business. They get to live their lives. That's why, to me, it starts with kind of like the philosophy, the preamble, the whereas statements that begin a piece of legislation. Whereas, mind your own business. Whereas, everybody has a right to be free in America. And the thing about it that's so frustrating, and it needs to be named, is there is this dynamic where people unwittingly share of themselves online, and then a company takes that information and uses it however it wants, and somehow it's the person's fault. Well, you shared this picture online. Well, everyone's sharing their picture online. And of course, Facebook has had a very spotty history with protecting our privacy, and that's a whole other matter. But I think there's this road where we find ourselves very far down the path of unwittingly being a partner to this change in the ethos of privacy in America. And we have a moment, especially with the advent of AI, where we have to say, no, this is what it means to be free from surveillance. You know what I always think about? Did you watch 30 Rock? I'm sure you did.
A
I did, yes.
B
And of course, all the characters are so great, but Jack Donaghy is this very fancy media person in New York City. And one of the things about him that's hilarious is he has a cookie jar collection. Very surprising for this media mogul in New York City. And I think about that all the time, because that's the kind of thing that, who cares? But it might be kind of embarrassing to him. That character might feel very embarrassed about his dorky cookie jar collection, but that's his business. And if he wants to be free and weird with his cookie jar collection, okay, great. That's none of our business. He deserves to have privacy over that.
A
So the idea that your life, your quirks, your mistakes, even your cookie jars should be private, that doesn't just apply to public figures or adults. It's showing up in a much quieter, more everyday place for children, specifically with the apps that they use to turn in homework, check grades, message teachers, and all that. And if you don't have children, you might not realize how early this kind of tracking starts, but it really might matter a lot. That's after the break.
A
Up to this point, we've been talking about crypto, AI, surveillance, data brokers, and the philosophical right to be left alone. You know, mind your own business. But what happens when that debate isn't just theoretical, when it starts showing up in permission slips and classroom apps that seemingly give our children no other choice? You know, here's the app, use it. What happens when the battle over privacy begins before a child even understands what privacy means? So, Attorney General Charity Clark, I love saying that. What do parents need to understand about the choices they're making when it comes to their children's data?
B
So if you are a parent, it is a good time to check in with what feels right for you, because schools are not there to be monitoring this. They're there to teach children, and they're using edtech tools, edtech products, to make their own lives easier. That might not match up with your philosophy about privacy. It's okay to opt out of that stuff. It's okay to ask for, as I have done, the data privacy statements on the various apps the school is using. Because this next generation is going to be really interesting. We are of an interesting generation; I consider myself a Xennial, a young Gen Xer. And I think I am in a unique generation because I had the Internet and email when I was in college, and obviously had the Internet a lot when I was in law school, and yet I grew up in a free and analog world. I always reference this, and I'm doing it again, but there's this iconic, in my own mind, New Yorker spoof article about the summer before the Internet, about the things that we did. And the visual: the illustrator just drew a person lying upside down in a chair blowing a bubble. And the whole article is gold because it's like, yeah, that's what life was like. So to be in this generation, to understand what it was like to be truly free, and to see now how different it is. I teach a class, or used to teach a class, at the University of Vermont. And I remember before class, a bunch of my students were chatting about the cool party they went to over the weekend. And the reason the party was so cool is that everyone had to put their phones in a bowl in the entryway, and then they were free, and no one was taking pictures, and they were unsurveilled. And it just blew my mind, because that was my entire college experience. So my idea of privacy is going to be different.
And I think kids who are little today are going to be so much more sophisticated than millennials about privacy because they're going to have parents like me freaking out over, you know, all of the data privacy concerns that we should be having as parents today.
A
Now, if you're not familiar with these education apps, there are several, and they actually have a storied past of being very bad on the privacy front. And there are laws; it's one of the few places in this country where there are laws, like COPPA, that protect the information of children. And yet it is a porous environment right now. And I can tell you, because I was a rather active parent back in the day with my kids, who are now grown, that you can ask that the school send home in the backpack any information you need, and they will do it. They have to. They can't force you to be on those apps. The apps are now used the way that backpacks used to be used, where the kid would just be given a memo and then forget to give it to you, so you had no idea what was going on. It's no different now, because most parents aren't even looking at those apps. They're just looking to see whether their kids are doing their homework.
B
That is 100% true. Something I'll tell you: we passed a law in Vermont right before the pandemic, I think. And in the course of advocating for that from the office, I learned that one of the weak links is parents waiving their rights. The violation of privacy has been so normalized that people think they're being like a Karen if they say, no, I don't want that information shared about my kid. Or, like me, I get the form and it's like, what's your kid's Social Security number? Why do you need it? I'm not giving it to you. I'm making it up. I do it constantly. My daughter and I just signed up for something yesterday; I can't remember what it was. She wanted to watch a movie and we had to sign up, and she was just like, what should we put as our birthday? She just knows we make up a date. I'm like, why am I sharing my birthday with Paramount Plus?
A
That instinct to make something up, to not give the real birthday, to question why anyone needs your Social Security number or your children's numbers, that's not paranoia. That's just a baseline defense mechanism now. And the wild part is it's often seen as overreacting, which is crazy. We've internalized this idea that resisting a privacy overreach makes you a problem, when really it just makes you smart. But here's the thing: once that line gets crossed enough times, we stop seeing it, right? It's like when a room smells bad. After a while you stop smelling it, but that doesn't mean it doesn't stink.
B
I remember, maybe 15 or 20 years ago, the first time I was checking my Gmail and I was getting ads on the side related to stuff in my emails. I was living in New York City at the time, but of course I'm from Vermont, and I'm obviously low-key obsessed with Vermont and always have been. So even when I was in New York, I was probably talking about Vermont a lot.
A
So low key.
B
So I was getting a lot of, like, maple syrup ads and B&Bs and stuff. And then I remember a young child in my family got head lice, and we were all together, and there was a panic in the family, you know, email chains going around about head lice. And I was getting ads for, like, head lice cream or whatever, shampoo. And I remember being so creeped out about that. And now it's just what we expect. In fact, it's what we sign up for. When I go on Instagram, Instagram is selling me stuff and I buy it. And I think there is a place for that, and then I think there is a limit. And what we're doing right now, in 2025, is deciding where the right place is. Where's the limit?
A
That's a good way of putting it.
B
What I want is for people to have the awareness of how much of their data is being used and how it's being used, so they can decide for themselves where the limit is. A lot of people just don't understand what's happening, that they're being manipulated, that they're being sold things. If they want to buy, great. But there is a line, and where I start to be troubled is when places like TikTok or AI chatbots are manipulating people into making decisions, voting a certain way, buying certain things they otherwise might not buy. And I think we as a society should be having the conversation about what guardrails need to be put in place to protect people and their sense of freedom and privacy.
A
Talk to me for a second about what is going on in the state of Vermont when it comes to privacy or the ways in which privacy, or lack thereof, can affect people. What's happening on the legal front for you?
B
So I first advocated for a biometric privacy act, a biometric data privacy act, in early 2020, like, out of my own mouth to a legislative committee. That committee has taken years to thoughtfully approach the issue, educate itself, and in that process has, as the kids say, like, radicalized itself. I mean, they literally are so informed that they have very strong opinions now about how important our data privacy is and what a law should look like. And ironically, it's kind of slowed down the process, because they know so much they're not willing to compromise on certain elements. It's really impressive. I should note that that committee is chaired by a Republican. And I mean, traditionally Republicans were like this in Vermont. They still are. They value privacy. They don't believe in giant big government knowing all your business. So I'm not totally surprised that it's chaired by a Republican. But I have been really impressed with that committee. One of the challenges has been that big tech has spent a lot of money trying to kill privacy bills, or shape them in the way that they want, across the country. And Vermont has a citizen legislature: each legislator doesn't have an office, doesn't have a staff. If they're lucky, they might get an intern. And so they rely on the good faith of lobbyists to be truthful with them. What you will see, I think it was last year, a bill passed the House, passed the Senate, and was vetoed by the governor. When it went back to the legislature, the Senate failed to override the veto. And you can watch it, you can get it on YouTube today: on the Senate floor, senators who are good people saying things about the bill that literally just weren't true. Like, they were told things to be afraid of that were literally not even in the bill. And I was watching it, like, tearing my hair out, like, you're worried about that? I have great news, that's not even in the bill. Like, yay, now you can vote to override. 
And so they failed to override the governor's veto, and we still don't have a comprehensive data privacy bill in place. And meanwhile, as technology develops, as AI gets more sophisticated, more widespread, more than ever, I think we need more protection. So that's kind of the status of things in Vermont.
A
Right. So what's your wish list for the next year on that? On the privacy front for Vermont?
B
I would love to see protections, specific protections, because of course we have the Consumer Protection Act, but it's a broad act. So I would like specific protections for our biometric data, data about ourselves. We can't change our DNA, our fingerprints, our face, our eyes, stuff like that. In addition, I'm ready for movement on AI. I feel like there are a lot of other elements of a data privacy bill that we've looked at in the past, that we've been in favor of in the past, but the truth is we sort of missed our window, because now we have to spend our time on bigger things. And to me, that's AI. I think that there is a lack of imagination when it comes to some of the challenges with AI. And conveniently, other places have created some examples for us to use. And that's always a nice thing to see: oh, what do they do in this state, or in this other country, or a federalist system like the EU? You know, that's a nice place to look and say, is that working for them? How are we the same? How are we different? How are our values the same or different? So continuing the education so that we can be thinking about meaningful protections for AI.
A
Let's say that an attorney general decided, I don't want this job anymore, so I'm going to throw the next election. How about just making all these social media platforms that are wreaking havoc on our privacy banned in a state? How about just saying, you can't do business in this state if you're going to be doing what you're doing? And that's the thing: we're all complicit in it. I'm gonna just, like, paint the picture for you, Charity, and I want to hear your response, because I don't think there's an answer, and I don't think that's the right move. The Coldplay couple that got identified, they were identified by other citizens. They weren't identified by their family, like you pointed out. We're all, I think, participants in this destruction of our own privacy.
B
Yes.
A
In our own privacy. So let's say, I mean, if you want to talk about passing a law, or, like, banning social media, I guess I'll help you lose your job. But how about something probably more doable, which is getting people to understand what the stakes are when they do post their information and post things about their lives?
B
I think that's a much better approach. I mean, especially, I'll say, here in Vermont we don't have a lot of newspapers left. And so Facebook actually serves as an important media source for people. I mean, every community has its own kind of Facebook, and if you want to reach that community... I was advertising on Facebook when I ran for office in 2022, the first time, because I knew there were certain communities that didn't have a newspaper. That's where people got the news. So they actually serve a great purpose in the process. I don't agree with the choices that they have made regarding privacy, but you nailed it when you said that we are participants in the, you know, destruction, removal, risk of our own privacy. And that's what I want to stop. I just think, how can you be free if you're being surveilled? And the thing about it is, we allow private companies to do things we would never tolerate our government doing, and then they sell their products to the government to surveil us. You know, I just don't agree with that. I feel like people should just go out in the world and be free. They should be able to go to a concert and not feel like they're being surveilled, you know, and that could just be, like, getting narced on by their fellow concertgoers. Like, everybody should just mind their own business and be free.
A
Charity Clark, Vermont's Attorney General, thank you so much for joining us.
B
Thanks for having me. It's been fun.
A
And now it's time for our Tinfoil Swan, our paranoid takeaway to keep you safe on and offline. So if you are worried about your privacy and you don't know if you should be opting in to scheduling apps for your children, there are many of them; PupilPath is one. You need to get in touch with the administrators of your school. Now, you may think this is going to cause problems, and it might, because it does create an extra layer of work. But read up on these student tracking apps, the student scheduling apps, the student grade apps, they're all the same thing. Read up on their privacy policies. They're not great. Now, you have the right to not participate in the defeat of your kids' privacy. Right? You do. Call your school up, ask them to send backpack notes. That is the most important thing you can do to give your child some agency in the entire lifeline of their privacy. Because if they lose it when they're young, before they have the ability to say yes or no, before they have the ability to understand it, really, is that fair? Stay safe out there, and have a great week. What the Hack is produced by Beau Friedlander, that's me, and Andrew Stephen, who also edits the show. What the Hack is brought to you by DeleteMe. DeleteMe makes it quick, easy, and safe to remove your personal data online, and it was recently named the number one pick by the New York Times' Wirecutter for personal information removal. You can learn more about DeleteMe if you go to joindeleteme.com/WTH. That's joindeleteme.com/WTH, and if you sign up there on that landing page, you will get a 20% discount. I kid you not, a 20% discount. So yes, color me fishing. But it's worth it.
B
Curious about the future of healthcare? Tomorrow's Cure, the chart-topping and Ambie Award finalist podcast from Mayo Clinic, brings it to you today. I'm Cathy Wurzer, and in this new season I sit down with researchers, doctors, and industry experts who are leading the way in medical innovation. From cutting-edge technology to breakthrough treatments, we'll explore how new solutions are improving and even saving lives. Follow Tomorrow's Cure wherever you listen to podcasts. What is healthy spirituality, and how does it help us thrive? We explore these questions on the new season of With & For, hosted by me, Dr. Pam King. With & For bridges psychology and spiritual wisdom to help you thrive, featuring conversations with experts like self-compassion pioneer Kristin Neff and author-activist Parker Palmer. So go ahead, follow With & For, hosted by Dr. Pam King, wherever you get your podcasts.
Host: Beau Friedlander (A)
Guest: Vermont Attorney General Charity Clark (B)
Date: February 24, 2026
In this episode, "What the Hack?" explores the intersections of surveillance, privacy, and the expanding influence of data collection—both government and corporate. With Attorney General Charity Clark of Vermont, the discussion unpacks not just the technicalities, but the core values, risks, and responsibilities that underpin privacy in 2026. The team highlights the normalization of privacy violations, the rapid advancement of AI, the ethics around cryptocurrency, and the special vulnerabilities of children within current educational tech systems.
[03:25]
“Do we believe that each of us has a right to privacy?...That I think is a part of America.”
— Charity Clark ([03:28])
[04:34]
“You're gonna let this person have access to all of our highly sensitive data? It did not instill confidence...”
— Charity Clark ([05:24])
“There's a reason why cryptocurrency is most used right now … for three things. So speculators … scams … [and] buying stuff on the dark Web.”
— Charity Clark ([08:22])
[06:32]
"... that's not a me thing. That's a you thing ...the other one is artificial intelligence."
— Charity Clark ([06:55])
[11:58], [12:42]
"It's honestly such a mess... corporations ... are willing to take risks and violate laws because they gotta get there."
— Charity Clark ([12:42])
[16:40], [17:03]
“What if ... we start at the beginning, the philosophy. We believe in privacy, believe people have a right to privacy.”
— Charity Clark ([17:03])
[20:08], [21:35]
“The romance scam using AI chatbots is a deadly combination, and it is coming.”
— Charity Clark ([21:38])
[24:31], [26:36], [27:18]
“That’s why, to me, it starts with, kind of, the philosophy... 'whereas mind your own business.'”
— Charity Clark ([24:31])
[29:29], [32:54]
“It’s okay to ask for… the data privacy statements on various apps the school is using.”
— Charity Clark ([29:47])
"The violation of privacy has been so normalized that people think they're being like a Karen if they say, no, I don't want that information shared about my kid..."
— Charity Clark ([32:54])
[34:17], [35:30]
“Now it’s just what we expect. And in fact, it’s what we sign up for…”
— Charity Clark ([34:44])
[36:36], [39:09]
"...big tech has spent a lot of money trying to kill privacy bills..."
— Charity Clark ([37:56])
“I would love to see specific protections to protect our biometric data…In addition, I’m ready for movement on AI.”
— Charity Clark ([39:09])
[41:20]
“We are participants in the destruction, removal, risk of our own privacy.”
— Charity Clark ([41:47])
This is an incisive yet approachable discussion, balancing rigorous legal insights with relatable stories and humor. The speakers urge listeners to reconsider the normalization of privacy violations, push back against digital fatalism, and advocate for both collective and individual action—especially in defending the privacy of the next generation. The episode is laced with wry observations and practical takeaways, making a complex and often bleak subject both accessible and actionable.