
Beau Friedlander
Do you feel observed?
Jay Stanley
Watched?
Beau Friedlander
There's nothing new about surveillance.
Ben Jordan
What's up, Chief? Why the Magenta alert, 86? I'm deeply concerned about the conference this afternoon. We have reason to believe that CONTROL may have been infiltrated. I'll need some special equipment for that. Chief, we can talk under the cone
Beau Friedlander
of silence, but it has been in the news a lot lately.
Hiba Balfak
Chilling images from the night of Nancy Guthrie's disappearance.
Ben Jordan
Authorities say tips are a critical element for law enforcement. Secretary of Defense Pete Hegseth gave Anthropic a Friday deadline to grant the US military unrestricted access to Claude. Now, the key friction point: Anthropic does not want its technology used for autonomous weapons or the mass surveillance of Americans.
Beau Friedlander
Surveillance used to be about spycraft, the James Bond version of surveillance. Like there's a microphone somewhere in the house, there's a camera hidden in a wall, and it's maybe behind the pupil of a painting. It's all cliches. Some of it's true. Some cliches are true because they happen a lot. But now we're just all being spied on. Everyone, maybe. You keep talking about how you need a new power tool, and the next thing you know, you're getting served ads for power tools. And you haven't googled it. Nothing. They just start popping up. Digital surveillance as it exists today started before anyone even called it the surveillance economy. It started with cookies and trackers. There's no such thing as total anonymity. Philip K. Dick once said there will come a time when it isn't "they're spying on me through my phone" anymore. Eventually it'll just be "my phone is spying on me." That time is now. Researcher Yves-Alexandre de Montjoye works with huge data sets to demonstrate how easy it is to re-identify anonymized information, like where you are, based on pings sent to your mobile phone.
Ben Jordan
On average, knowing four places and times where someone was is enough to uniquely identify him and potentially re-identify him in one of these data sets of, like, you know, 1.5 million people. Like, four points is sufficient to uniquely identify someone.
Beau Friedlander
95% of the time, they only needed four points. Four pings. You were here, here, here, and here to identify you. Not maybe you. You. You've seen your data on the people search sites, right? It's the tip of the iceberg, honestly speaking.
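De Montjoye's "unicity" result is easy to demonstrate on synthetic data. What follows is a toy simulation, in no way his actual dataset or code; the sizes (1,000 users, 50 pings each, a 200-cell grid over one week) are invented for illustration, and the point is only that a handful of known (place, time) points can isolate one trace out of many.

```python
import random

# Toy "unicity" simulation: how many known (place, time) points
# does it take to single out one person's trace in an anonymized
# location dataset? All sizes here are invented for illustration.
random.seed(0)
N_USERS, N_PINGS, N_CELLS, N_HOURS = 1000, 50, 200, 24 * 7

# Each user's trace is a set of (cell, hour) pings.
traces = [
    {(random.randrange(N_CELLS), random.randrange(N_HOURS))
     for _ in range(N_PINGS)}
    for _ in range(N_USERS)
]

def is_unique(user, k):
    """Do k random points from this user's trace match only that user?"""
    points = set(random.sample(sorted(traces[user]), k))
    return sum(1 for t in traces if points <= t) == 1

frac = {}
for k in (1, 2, 4):
    sampled = random.sample(range(N_USERS), 200)
    frac[k] = sum(is_unique(u, k) for u in sampled) / 200
    print(f"{k} point(s) identify {frac[k]:.0%} of sampled users uniquely")
```

Even on this tiny toy dataset, one point usually isn't enough, while four almost always single someone out, which is the shape of the effect Beau is describing.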
Ben Jordan
That amount of credentials is easy to find. Or if you don't feel like finding them, if you've been on the dark web long enough, you could buy them.
Beau Friedlander
Now, we say make yourself as hard to hit, hard to spy on, hard to collect data on as possible. But what does that look like now that there are surveillance cameras everywhere? I mean, in 2009, Google CEO Eric Schmidt
Ben Jordan
said, if you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place.
Beau Friedlander
Wait, like British street artist Banksy, for example? Because Banksy wouldn't be Banksy if you could see Banksy doing what Banksy does. Now, in 2010, Banksy actually did make a new piece featuring a pink TV, reversing Andy Warhol's quip about 15 minutes of fame. It said, in stenciled letters, "In the future, everyone will be anonymous for 15 minutes." Yeah, that's not happening anymore. Welcome to the first installment of a two-parter on surveillance. Today we're going to talk to Jason Kebler of 404 Media, musician, composer, and YouTuber Ben Jordan, and Jay Stanley from the ACLU about the new surveillance. I'm Beau Friedlander, and this is What the Hack, the show that asks: in a world where your data is everywhere, how do you stay safe online? Now, if you don't know 404 Media, fix that right now. It's some of the best independent tech journalism out there, covering surveillance, AI, and the stories legacy outlets won't touch. You may have missed their Super Bowl ad.
Jason Kebler
Yeah, we bought a regional local ad.
Jay Stanley
Yeah.
Beau Friedlander
Jason Kebler, co-founder of 404 Media and former editor-in-chief of Vice's Motherboard. To be honest, I don't think that I would wear a hoodie from any other media outlet, but I totally like one of those 404 Media hoodies. Just saying. Thank you.
Jason Kebler
That means a lot. We're trying to become a fashion brand on the side.
Beau Friedlander
So, about that Super Bowl ad.
Ben Jordan
In a world increasingly controlled by big tech, AI slop, and social media algorithms controlling what you see, 404 Media focuses on something else. Real information you can actually use.
Jason Kebler
We did have a Super Bowl ad. It was very much a stunt. I was inspired by The Verge doing something similar, like, 10 years ago. Basically, local TV stations that air the Super Bowl are allowed to sell their own ads that run only on their channel. And so we found the smallest media market in the United States, which was in this town called Ottumwa, Iowa. And we put the ad on YouTube and wrote an article about it and the process and all that. So, I mean, a lot of people did end up seeing it and seemed to like the idea behind it. But this was not like a major Super Bowl buy. It was a stunt more than anything, I guess I'd say.
Beau Friedlander
The reason I asked you about your Super Bowl ad was there's another Super Bowl ad, for the Ring cameras, that freaked a lot of people out. It was about their Search Party feature. "Pets are family, but every year 10 million go missing. And the way we look for them hasn't changed in years."
Jason Kebler
Yeah. So Search Party is a feature that Ring launched back at the end of September, I believe. It is a feature that uses AI to link Ring cameras in a neighborhood or a town all together to look for lost dogs.
Beau Friedlander
One post of a dog's photo in the Ring app starts outdoor cameras looking for a match. Search Party from Ring uses AI to help.
Jason Kebler
And so basically, if you lose your dog, you can take a picture of the dog, you can upload it to Ring's website, and then all of these cameras will be, like, automatically networked together and use AI to look for the dog. And when this was launched in September, it was somewhat controversial, but didn't get that much attention. But then they did this huge Super Bowl ad. It had CEO Jamie Siminoff sort of talking about it. And, you know, they used it to find this dog in the ad. And I think one of the images that they used was a visual of, like, all the cameras kind of searching together, networking together, with these blue overlays, in a kind of scary-looking way. And I don't know, I feel like there was just, like, a massive backlash to this.
Beau Friedlander
I mean, when I saw those cameras, the first thought I had was, oh, God, what if that was Ahmaud Arbery? And I mean, honestly, the very first thought I had was, that's not what it's going to be used for. So there was this backlash. And I guess the reason I thought that is because I know Ring has a relationship with law enforcement.
Jason Kebler
Yeah, it's really interesting and I mean, it's a little bit of a complicated backstory, but this guy, Jamie Siminoff, the founder of Ring, he launched it on Shark Tank.
Ben Jordan
First into the Tank is a product to ensure you always know who's at your front door.
Jason Kebler
He proposed it on Shark Tank was the origin story of Ring.
Beau Friedlander
My name is Jamie Siminoff. I'm from Los Angeles, California. My product is the doorbot. I'm seeking $700,000 for a 10% stake in the company.
Jason Kebler
And famously, it didn't get funded. Okay, Jamie, it's that moment when I
Ben Jordan
say, you're dead to me because you
Jay Stanley
don't want to take my offer.
Ben Jordan
I made you a very valid offer.
Jason Kebler
I think, under the circumstances,
Beau Friedlander
respectfully, Mr. Wonderful, we're gonna decline.
Jason Kebler
But basically after that, the way that Ring became popular was they entered into these partnerships with police all over the country, and they went basically town by town, and they incentivized police to pitch these doorbell cameras to people who lived in their towns. So they would give police free cameras and say, like, give these away in sweepstakes or at public meetings. They gave them, like, discount codes. And in return, as part of this partnership, Ring basically offered police the ability to request footage from anyone in that town. And this was, like, 2017, 2018. And we tracked these partnerships at the time, back in my old job, and it went from, oh, there's a few dozen places doing this, to thousands within a year or so. Thousands of police departments, so basically thousands of towns, signed up for this and were kind of doing Ring's marketing for them. And Ring became very popular during this time. This did spark some sort of backlash, from the Electronic Frontier Foundation, the ACLU. There was tons of negative reporting about Ring and about how they were kind of building this consumerist surveillance state, more or less. And there were a lot of high-profile cases paired with Ring at the time. I mean, they still have this. It's an app called Neighbors, and it is basically a place where Ring camera owners can upload footage and then other people can comment on it. What ended up happening was, anytime a quote-unquote suspicious person would walk by a Ring camera, they would be recorded and this footage would be uploaded. And then people in the town would be like, what's this person doing here? And very often that would be a Black person or a Latino person, or a delivery driver, or someone doing their job. It became this kind of mini police state, I would say. There was a big backlash to this eventually. 
And Ring sort of took a step back from this at some point, like 2022, 2023. And they canceled the program where they would allow people to give footage to police without a warrant. I don't know how impactful it was that these partnerships were canceled. But at that same time, Jamie Siminoff, the founder, left Ring. He left the company entirely. And then he came back last year.
Beau Friedlander
Yeah, 2024, he leaves, comes back in April of 25. It's now owned by Amazon.
Jason Kebler
Yeah, he was basically like, daddy's home. We're going back to our mission. We're pro-cop, we're anti-crime, we're doing the partnerships again. And critically, he was like, we're actually going to add AI to all of this. We're going to try to automate the entire process. And that is sort of where things stand now. This is explicitly pro-police surveillance technology, and they are trying to figure out how to implement AI and network the cameras together.
Beau Friedlander
That's the shift. We're not talking about a doorbell camera anymore. We're talking about a privately owned, AI-assisted surveillance network with the potential to scale nationally. And it's run by a founder who just came back with a very particular political vision and the resources to do for law enforcement what, at least in his opinion and the opinion of people like him, law enforcement has been unable to do for itself. That sounds like hubris to me. But anyway, the real danger here is that this isn't neutral technology. It's a powerful tool in the hands of people with a very specific idea about how it should be used. Here's Jay Stanley, senior policy analyst with the ACLU's Speech, Privacy, and Technology Project. Jay Stanley, thank you so much for joining us. My first question is, why is the ACLU interested in surveillance?
Jay Stanley
At the end of the day, surveillance is a question of power and freedom. And when people surveil you, they have power over you.
Beau Friedlander
So what about Ring's Super Bowl ad?
Jay Stanley
Yeah, I mean, I think that the Super Bowl ad was a wake-up call for a lot of people about just how powerful centralized cloud video services are. This is not, like, the old-fashioned, way-back-when camera from, like, the Obama administration. Now, because of AI, because of cloud centralization, any camera that's tied into that system is much more powerful than it used to be, including Ring cameras. Because of AI, if you have a pool of 10,000 hours of video, it used to take 10,000, or maybe 5,000 or 2,000, man-hours to search it. But now, just like large text corpora can be keyword-searched, you can ask an AI, find me somebody in a red sweatshirt who's carrying a briefcase, and it will find those people for you. And that's what the dog search technology was based on. But of course, everybody intuitively knows that it's spooky, because in five minutes, and in fact Ring even said this in internal memos, they're going to use it for other things. They said, we're going to use this to end crime in America. And you know, the amount of surveillance that would be required to, quote unquote, end crime is frightening to most people, and rightly so.
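The kind of search Jay describes, a natural-language query over hours of video, is typically built on joint text-image embeddings (CLIP-style models): frames and the query are mapped into one vector space and ranked by similarity. The sketch below fakes the embedding with a simple bag of words so the retrieval mechanics are visible; the vocabulary, frame descriptions, and timestamps are all made up for illustration.

```python
from math import sqrt

# Toy illustration of "find me somebody in a red sweatshirt" video
# search: the query and every frame are mapped into the same vector
# space, then frames are ranked by cosine similarity to the query.
# Real systems use learned joint embeddings; this fakes them with a
# bag-of-words over an invented vocabulary.
VOCAB = ["red", "blue", "sweatshirt", "briefcase", "dog", "umbrella"]

def embed(text):
    """Map text to a word-count vector over VOCAB (stand-in embedding)."""
    words = text.lower().split()
    return [words.count(w) for w in VOCAB]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

frames = {  # timestamp -> machine-generated frame description (invented)
    "00:01": "blue umbrella dog",
    "00:02": "red sweatshirt briefcase",
    "00:03": "blue sweatshirt",
}

query = embed("red sweatshirt carrying a briefcase")
ranked = sorted(frames, key=lambda t: cosine(query, embed(frames[t])),
                reverse=True)
print(ranked[0])  # timestamp of the best-matching frame
```

The expensive part in practice is the embedding model; the search itself is just this ranking, which is why AI turns a pool of footage into something as queryable as text.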
Beau Friedlander
What is the danger of the private sector, you know, of a business, having access to this much information about this many people?
Jay Stanley
Yeah. I mean, this is mass surveillance, which is the biggest, most concerning form of surveillance, because it collects data on everybody all the time, without individualized suspicion of wrongdoing, which is the standard that the government normally needs to meet to invade your privacy.
Beau Friedlander
Siminoff recently said, I think it was recently, you know, you can now see a future where, he's talking about Search Party, we are able to zero out crime in neighborhoods. Zero out crime in neighborhoods. This is the private sector. This is not law enforcement. This is a company selling cameras and surveillance equipment, saying they're gonna zero out crime. Who's saying what the crime is? Because my concern is, you do go back to this thing with these viral videos, with people just deciding someone's committed a crime and the whole neighborhood participating in prosecuting it.
Jason Kebler
Yeah. I find this language to be actually very important to talk about, so I'm glad you brought it up. One, it's like, what types of things are being criminalized? By and large, why this technology was created was for package theft, which costs Amazon, I don't know, probably billions of dollars a year. But also, what does "zero out crime" mean? Because it's not going to zero out tax fraud. It's not going to zero out domestic violence. It's not going to zero out insider trading. All of these sorts of things, it's not going to zero out. The goal is to eliminate crime that homeowners believe is possibly a nuisance, or things that they don't want to see in their neighborhoods. Yeah.
Beau Friedlander
That keeps them so that they don't have to pull out their little dumb guns.
Jason Kebler
Exactly, exactly. And, like, how that would even work is very unclear, because at least where I live, the police don't really have time to be dealing with some of these lower-level crimes. It's like, I don't know, call 911 because your window's been smashed, and they'll be like, we're not coming. We're too busy. We're going to come some other time. We're dealing with, like, murders and shit.
Beau Friedlander
Yeah. Some of you listening might still be wondering what the problem is. Sure, it's not perfect, but surely stopping more crime is better than stopping less crime. I mean, we have to do something about the Bernie Madoffs of the world. But this isn't about Bernie Madoff. This is about "zeroing out," in quotation marks, a certain kind of crime. Street-level, visible, the kind that gets you on the local news. Not fraud, not corruption, not the crimes that never get caught on camera because they happen behind closed doors. Who decides what counts? People with no constitutional guardrails. And the answer shifts with whoever's in power. Right now, the prevailing winds favor installing powerful new surveillance tools.
And once they're in place, that's it. You better get used to them. Forget porch pirates and lost dogs. Nobody's going to stop and ask, should we be doing this once it's all in place?
Ben Jordan
A common argument is, like, I don't have anything to hide.
Beau Friedlander
That's Ben Jordan. He's a genius, not to put too fine a point on it.
Ben Jordan
And people have said that in person to me, and I've asked them, well, unlock your phone and give it to me. I'm gonna go in the other room for a bit. And, well, no, I'm not gonna. Okay, so you do have something to hide. What are you saying? Do you have, like, child abuse material on your phone? What are you saying? Because this is the ridiculous argument that we're having when somebody wants privacy.
Beau Friedlander
You forgot to say, I promise not to take any money.
Ben Jordan
Yeah, right. Yeah, I promise not to take any money away.
Beau Friedlander
Ben Jordan records his music as The Flashbulb, among other aliases. But you may also know him from YouTube, where he does deep dives on technology, privacy, and surveillance, and amazing experiments, like converting an image into a sound wave and then training his rescue starling, yes, a rescue bird, to memorize and sing the sound wave back. 176 kilobytes of retrievable data stored in a bird's brain.
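The image-in-audio trick is usually done by spectral painting: give each image row its own sine frequency and each column a slice of time, so a spectrogram of the resulting audio redraws the picture. The sketch below shows only that general idea, not Ben Jordan's actual pipeline; the sample rate, bitmap, and frequencies are arbitrary choices for illustration.

```python
import math

# Spectral-painting sketch: each row of a bitmap gets one sine
# frequency, each column one time slice, so a spectrogram of the
# resulting audio redraws the picture. All parameters are arbitrary.
RATE = 8000          # samples per second
SLICE = 0.05         # seconds of audio per image column
image = [            # tiny 3x4 bitmap: 1 = bright pixel
    [1, 0, 0, 1],
    [0, 1, 1, 0],
    [1, 1, 1, 1],
]
freqs = [400.0, 800.0, 1600.0]  # one frequency per image row

samples = []
for col in range(len(image[0])):
    for n in range(int(RATE * SLICE)):
        t = n / RATE
        # sum the sines of every row that is "on" in this column,
        # normalized so the signal stays within [-1, 1]
        s = sum(math.sin(2 * math.pi * freqs[row] * t)
                for row in range(len(image)) if image[row][col])
        samples.append(s / len(image))

print(len(samples))  # 4 columns * 0.05 s * 8000 Hz = 1600 samples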
Ben Jordan
But the reality is, you have a ton of things to hide. Like where your belongings are in your house; you're hiding that. You generally don't want the general public to know that. You don't want the public to know where your secret key is under the rock, or your passwords.
Beau Friedlander
Or that you were against the war in Iraq, or whatever.
Ben Jordan
Yeah, I mean, generally, "I have nothing to hide" is such a privileged statement from someone who's never had their identity stolen, somebody who's never been stalked, somebody who's never been sued, who's never had to deal with discovery. There's so many different scenarios where something that you thought was private ends up getting into the wrong hands and used against you, because that's just the reality we live in now. And so whenever somebody says that, I just want them to think long and hard about the things that they very much have to hide and that they might not even realize. Like, you might be talking about having a chronic pain issue or something like that on social media. Guess what? That ends up being put into open source intelligence, which is accessed by insurance brokers. That ends up costing you down the line.
Beau Friedlander
So Ben reached out to Jason and 404 Media to tell them what he had learned about some of these surveillance cameras. Not Ring, but Flock Safety, which makes an array of different products. Some of them are just license plate readers, and some of them go way beyond that, into AI-assisted stuff. And that is why I reached out to him. But to be honest, I also wanted to know about his farm situation. There seems to be an animal in his lap in every single YouTube video he posts.
Ben Jordan
I have, like, 24 chickens, I believe. The numbers go up and down as, you know, there's too many roosters, and then there'll be a hawk attack, and, you know, things like that.
Beau Friedlander
I haven't gotten chickens for that reason, because there's some coyotes that live just over there. And I feel like I just don't want to be in a Hanna-Barbera cartoon for the rest of my life.
Ben Jordan
It's funny how often the two worlds of hacking, or surveillance, and chicken protection blend for me.
Beau Friedlander
So where are we in this privacy nightmare? Is it over? Are we post-privacy, or are we just in the throes of pre-digital privacy and just haven't figured it out yet?
Ben Jordan
I mean, everything gets to a breaking point, right? Everything eventually gets there. And, I mean, what would happen if I used my platform to popularize a browser plugin where, for example, we had 10 people and we all swapped cookies, like data tracking cookies? So, for example, let's say a woman named Maggie was another user. I don't know who she is, but some of her details are coming over to my browser and some of mine are going there. And now she's going on Instagram, and now she's going to start getting ads for things that are targeted towards me. And then all of a sudden, ads on Instagram aren't going to work as well. And now companies are going to be mad, because their conversion rates are lowering and their return on investment is going way down, because people's data is all jumbled up. This is just off the top of my head. And I think that in a lot of cases, that's what generally seems to happen in things like this. There's always sort of a middle ground, because otherwise we would have cameras installed in our house, you know, from whoever, Google, Twitter, you know, X. It would be a lot worse.
Beau Friedlander
Well, we do already have cameras from Amazon in our homes. And microphones.
Ben Jordan
Yeah, that is true.
Beau Friedlander
So we actually talked about this with Al Franken. Witnesses are Alexa, Amazon.
Ben Jordan
Is that right?
Beau Friedlander
Just Alexa.
Jay Stanley
Siri, as I understand it, how can I help you?
Jason Kebler
Well, for now, be quiet.
Ben Jordan
Okay, Alexa, you too.
Beau Friedlander
It's just another way we were frogs slowly getting boiled. If you want to hear that whole episode, which is awesome, and a throwback to when there was a three-host roster here on the show, we're going to drop it on Thursday as a bonus episode. You know the old phrase, money doesn't know where it came from? Well, for a while, data didn't know what it was good for. Websites were like, do you mind if we gather this or that, you know, stuff about you? And people were like, yeah, yeah, sure, whatever. I just want to use the service. People posted, did their thing, and all the data that they generated didn't really mean anything yet. Well, to them. Think of a kitchen that looks like there's nothing to eat in it, right? Maybe there's some spices or whatever, and there's some oil, a few vegetables, stuff in the pantry, but nothing really good. Now, a competent chef walking into that situation is going to be like, are you kidding me? I can make all kinds of things for you. What do you want? That's what Big Data did with our information, with all the stuff scraped from social media and all the information that we plugged into sites thinking it didn't matter. Well, it did matter. Now that situation is in the physical world, and it's not just on your smartphones. Meta glasses, a face computer, basically, that can take pictures and tell you what you're looking at. It's AI-assisted. So, sure, we all had that collective cringe moment with the Super Bowl and its Search Party ad, because it wasn't about dogs. If you're like me, you're like, you know what else that could be used for? All kinds of things.
Ben Jordan
I'm so glad that, like, the general public took that away from that ad.
Beau Friedlander
Here's Ben Jordan again.
Ben Jordan
I'm really proud of society or something like that because I feel like had you shown me that ad before it ran, I would have been like, oh, man, people are just going to be like, no, not my dog going missing. That's my worst nightmare. And then just sort of, you know, appealed to it. But yeah, it seemed like generally everybody was made uncomfortable or, you know, the. The vast majority of people were made uncomfortable by that ad and started asking more questions like, what are these capable of?
Beau Friedlander
Now, a surveillance camera is useless if there's not a person on the other end looking. A mic, same thing. A tap, a wiretap, doesn't work if someone's not on the other end recording it or listening. We're now in an age where all of that can be automated. And not just automated; the understanding of what is being collected can also be automated. And it can be looked at with intelligence that is agentic, that is not human, that is powered by artificial intelligence, and therefore can be done very fast. And that is a game changer. So what we're really interested in is how surveillance works online, and how it jumps from being just a part of the marketing machine of the Internet back to good old-fashioned, or good old scary, Big Brother-style surveillance. Because it wasn't that for a while. When Meta was doing it, it wasn't about Big Brother, it was about big sales. And it's gone from, oh, we can use all this data to sell people stuff, we can use all this data to figure out what you want, to, we can use all this data to control people, which was what Cambridge Analytica was about, for example. And now, we can use all this information to control people's behavior. How? By putting cameras everywhere. Ben, you made a series of videos about Flock Safety cameras. Flock sells a range of public safety cameras.
Jay Stanley
Right.
Beau Friedlander
Their basic license plate readers aren't AI cameras, while some of their other models include AI-powered analytics; some can hear somebody, you know, screaming, gunshots, all kinds of stuff. So we're going to get more into the specifics next week. But in your research and reporting with 404 Media, in one of your videos, there's a part that just blew me away. And when you started to say that that's where you were also brought to tears, I was like, oh, come on, shut up. But yeah, it's this: we see this dude walk onto a playground, and there's a camera trained on the playground already. Super problematic, because these cameras are hackable.
Ben Jordan
Yeah. Like, why?
Beau Friedlander
And an older, a person who is not a child, walks onto the playground and gets on a swing and starts swinging. And you ask the question, would he, or most adults, feel free enough to do that if they knew there was a camera taping them?
Ben Jordan
And that kind of opens up this whole thing. Because, I mean, first of all, that's something that I do. Everybody likes going on a swing set; I love a good swing. And an adult man walking onto a swing set at a playground by himself, when other people are around, is creepy and socially unacceptable. So you don't do it when you're
Beau Friedlander
being watched. Unless you happen to be there with someone small.
Ben Jordan
You're related to kids, yeah. So, you know, I have the option of doing it where it's socially acceptable, or I have the option of doing it when nobody's around and nobody's looking. Or I guess I could, like, get a fake child, a little doll, and just bring it, dress up one of my dolls.
Beau Friedlander
Come here, Bob.
Ben Jordan
Yeah. And so, you know, it's reasonable to assume that had this person known that that camera was watching him, and that I could watch him swing 30 days later, 31 days later, they would have said, no, I'm not going to do this. I'm not going to be seen swinging in a playground. But this is also, like, we do a lot of stupid things when we're not being watched. We sing, we practice accents, we try cartwheels. But a lot of not-stupid things, too. We learn to play the guitar. We solder for the first time. We code for the first time. There's so many things that we do that we need privacy for. We need privacy to explore ourselves and to really find the weird, quirky parts of our personality and refine them before showing them to other people. And, I mean, humans are supposedly a social species, so we need to feel accepted by other people. And so we need to try these things out with privacy. And cameras rob us of that. And I think that's something that's been studied in the corporate world, where it's like, okay, so if you have a Little Caesars, is it better to put five surveillance cameras on the area where they're putting the pasta sauce on the dough? Or is it better to just leave them alone and let them do it? And it turns out, in a lot of research, and I mean, I don't have direct citations for this, but there's plenty of it, that if you just let them do it themselves, they'll figure out their own way of doing it, and they'll actually provide, not to get too Marxist here, organic labor, labor that is something they have worked on themselves to refine, that works for them, and that can possibly even be learned from. And then there's, you know, the concrete labor, where you're just telling somebody instructions like they're a computer or something, and they just do it and they hate their life. 
And so, I mean, when you have a surveillance camera on somebody at work, they are generally going to try to appear to work hard all the time rather than trying new things and improving. Honestly, if you were to secretly record me working on a music session, and then watch my stream every Thursday where I check out music software and make music, I swear, on that stream, I've been doing it for years, I have hundreds of streams at this point built up, and I don't think I've made one piece of music that I'm happy with. Ever. Not even starting a project. It's always been garbage. And it's because I'm being watched. It's because every single click that I make, you know, could be scrutinized. And that's even though I have, like, the kindest community, and my streaming channel is almost hidden; it's hard to find on purpose.
Beau Friedlander
Ben, can you just tell me, like, pretend I don't know anything: what's the Hawthorne effect?
Ben Jordan
I mean, the Hawthorne effect, simply stated, is that people behave differently when they're being observed, or when they're being surveilled, I guess. The initial Hawthorne effect experiments were pretty dodgy; that's not really great research, but that's sort of what intrigued people initially. But it is a real thing. If I have you order a piece of furniture from Amazon and assemble it by yourself, you're going to do it differently than if I put a camera over you and you know that I'm watching you assemble it, even if I'm not grading you, even if I'm not going to hire you to build more things for me. Just knowing that you're being watched takes a certain amount of mental energy. I think the crazier research papers, the more recent ones that I've read, like the one that absolutely blew my mind, found that in areas with high levels of surveillance, people were less likely to recognize faces. Which is like, there's gotta be a weird, you know. But when you think about it, it's like, yeah, because when you have a bunch of surveillance cameras everywhere, you feel like you're in a hostile environment. And when you're in a hostile environment, you're not gonna be as likely to remember faces, because they're not friends, they're foes. You're in a place where you feel like you're gonna have more foes than friends. Which goes back to our visceral psyche, where we have, like, friend, foe, you know, things like that, and to social identity theory and stuff. So it actually makes a lot of sense. Yeah, the brain's not going to waste energy remembering faces if it feels like it's not with its kin or something.
Beau Friedlander
We're living in a world where almost everything you do can be observed, recorded, uploaded to the cloud, analyzed by AI and routed wherever someone decides it should go for reasons good or bad, and usually reasons you don't know anything about. Are you still comfortable walking into your local library and checking out a book? Or do you read it in the stacks instead because you don't want a record of what you're reading sent to who knows where? Are you sure there aren't cameras in the library trained on the stacks like they can zoom in? Are you even thinking about it?
Jay Stanley
We shouldn't have to live that way. It has the potential to inhibit us in ways that we don't even know.
Beau Friedlander
Here's the ACLU's Jay Stanley again.
Jay Stanley
Showing up to protest, voting, perhaps assembling with other people for various reasons. We think that people should have the right to exercise their First Amendment rights to self-expression and assembly without feeling chilled, without feeling they're being watched all the time. You know, you may be the most sterling, pure-as-snow person in the world, but if you're driving along the highway and a police cruiser pulls in behind you, most people get kind of nervous. They don't like the feeling of having police right behind them. And I think that same feeling can be created by drones overhead, by license plate scanners, by constant video surveillance. We shouldn't have to live that way. Nobody wants to be watched.
Beau Friedlander
So how do these things slide into our lives and become permanent? Because we're going to opt in. We're not even going to notice we're doing it. We'll just hit a button because we're not paying attention. Something pops up, you know, maybe you've got a camera because some coyote is slinking around your barn, if you have a barn. Maybe somebody keeps stealing packages from your front porch. Whatever it is, you're not going to be paying attention, because all this stuff happens at the speed of life, and they're working at the speed of what's legal to get an adoption on the other side of that equation. And that's where I think we're going to get into trouble.
Ben Jordan
So a lot of people have gotten this. I have, because I left Ring installed on one of my tablets, a tablet I use for telescopes. I opened it up and had a Ring notification from, like, five days ago with a picture of a dog that looked strikingly like one of my dogs, saying, we found that this lost dog was found on your property, can you enable this feature to see it? To see the full picture, I would have needed to agree to terms and conditions and update and blah, blah, blah. But just looking at the thumbnail, I realized it wasn't my property, and it wasn't any property around me. It didn't look like it was even in the region where I'm at. And it definitely wasn't my dog. And yeah, nothing illegal is going on, but they're trying to trick you into agreeing to terms.
Beau Friedlander
Was it a Ring email or was it a hacker?
Ben Jordan
No, it was Ring. It was definitely.
Beau Friedlander
It was.
Ben Jordan
Yeah, a lot of people have gotten it too. And then people click it, and then it says, oh, you've got to agree to these new terms. And they say, yeah, yeah, whatever, okay. I've got to see if my dog's missing. I've got to see if this dog's on my property, what's going on here. And it's complete bullshit. It's just made up. It's just an ad to open the app and agree to new terms of service, and then agree to allow the dog tracking or whatever it is that they're trying to do. But that's a great example of: are they your friend or are they your foe? Because they're literally trying to trick you into agreeing to something that they know you're not going to read, so they can advance the platform to get more of your data, which makes them more money.
Beau Friedlander
So while we slide into becoming used to this technology being everywhere, there's another problem too. We're sliding away from the rule of law as determined by our peers and elected representatives. Private companies are beginning to be the arbiters of what's what in the uber-weird landscape of for-profit law enforcement.
Jason Kebler
A lot of these companies, Ring included, are hiding behind the idea that we live in a democracy and that we have laws. And so what they are saying is like, well, we're just following the law.
Beau Friedlander
Here's Jason Kebler again.
Jason Kebler
We're providing this technology to police. Or, you know, in Ring's case, they're selling the technology to consumers, but then it's going to police. And they're saying, well, it's up to the police, basically, to use this stuff lawfully. But we respect the police and we want to help them. And that's all fair and good, kind of. But they have built this incredibly powerful technology that police buy access to, and it's unclear whether, if the government were to build systems like this itself, they would be lawful. Because police are buying access to something that someone else is doing, it sort of sidesteps the law.
Jay Stanley
Yeah, I mean, we're seeing private companies being put in the middle of law enforcement in a way that's never happened before in history. A police department in 1970, you know, probably bought its flashlights from a private company, and this and that. But now we have companies like Flock and Axon saying that they want to provide the quote-unquote operating system of police departments. We have companies like Palantir working very, very closely with federal agencies, as well as other companies like Axon. And there is a danger that it's an end run around the Constitution, because not only the Constitution but other checks and balances, like FOIA and open records laws and the Privacy Act, don't apply to private companies. You can't FOIA Flock or Axon to find out what their internal memos and policies are. And there's a lot of very personal information about you that law enforcement could never get with a warrant, because they don't have a suspicion of you. Since they can't get it with a warrant, they just go buy it from data brokers. And the data brokers are basically doing what the East German Stasi used to do, which is compile dossiers with as much information about as many people as possible. They're doing it for commercial, not political reasons, but they're sitting there, and the government goes to them and does a complete end run around the Constitution. There is a proposed law called the Fourth Amendment Is Not for Sale Act that would ban law enforcement agencies from doing that. It actually passed the House of Representatives with bipartisan support, but it died in the Senate. We and others are still pushing for it, and that's part of a solution here.
Jason Kebler
And then the other thing is that our privacy laws are so outdated. They're just really not set up to grapple with what is happening. And to be specific about that: you can take pictures of anyone you want if you're in public, and you're allowed to put cameras on your property and face them out into public and take pictures. I'm not disputing that. I think it's a core part of the First Amendment, and journalists use this all the time; otherwise a lot of what we do would be criminalized. But what's happened now is that all of these cameras have been networked together and added to a database. So it's not just that they're taking a picture or a video of a moment in time. They're constantly filming, the footage is going into a database and being analyzed, and very complex maps of your movements and your activity are being built. That was unimaginable when these privacy laws were written. And we haven't really grappled, I feel, with whether they're still adequate these days.
Beau Friedlander
Well, and whether people are aware of the fact, because most people aren't reading the privacy policies or user agreements for the products they're using, that they may be involved in the hunting down of a person. Because another name for a search party is a posse. And it feels that way sometimes, like people are being unwittingly dragged into these posses of surveillance. And we don't have privacy laws that would protect us from it. GDPR would protect against this somewhat, wouldn't it?
Jason Kebler
Yeah, I think that it would. I think GDPR is a start. There have been some state laws that have really limited what types of crimes can be investigated using some of these technologies. Illinois, for example, has a law on the books that says license plate readers can't be used to enforce immigration. There's a bill being considered in Colorado right now that would prevent them from being used for abortion enforcement. And I think these are good starts. But the other thing you mentioned is that consumers don't read the terms of service. That is objectively true; there have been studies showing consumers don't read them. But why would you, first of all? And second of all, you can't alter these terms of service. You either agree to them or you don't get to use or own the product. And it's not just consumers who aren't reading these things. Our reporting has shown that police and towns are buying this technology and then violating the laws of their own state. In the case of Illinois, the one I just mentioned, license plate readers are, or were, being used to do immigration enforcement, and the departments don't even know their cameras are being used for this purpose, because they're just not sophisticated about how this technology is being used, what the terms actually say, what the law actually is. We exposed dozens of police departments in Illinois violating the state law, and what happened? A slap on the wrist. Nothing really happened because of it.
Beau Friedlander
Look, I don't know what the constellations of this new world of surveillance for profit are going to look like, but we know it's going to evolve. It's just starting, and it's a really important thing to pay attention to and learn about. If you listened to this episode and you learned something, pass it on, because it's super important. And come back next week, because part two is going to focus on Flock Safety in particular. We're doing a deep dive. Now it's time for the Tinfoil Swan, your paranoid takeaway to keep you safe on and offline. Your home assistant is listening right now. That's not a bug, that's a feature. It has to listen locally, on the device, to catch its wake word: Siri, or Alexa, or whatever. That part stays on the device. The problem is false triggers, and they happen all the time. Your TV says something that sounds like Alexa. Your kid says something that sounds like hey Google. Or they say, "seriously." And suddenly audio you never meant to send anywhere is going to the cloud. And once it's there, by default, you've agreed somewhere in the fine print, at least with Google, that a sample of those recordings can be reviewed by contractors to improve the product. Amazon quietly removed Alexa's opt-out for that in 2025. Draw your own conclusions. Here's what you can do. Stop them from saving your voice recordings. Opt out of product improvement programs where they exist, and that includes LLMs if you're using them. And audit what third-party apps have access to on your assistant. The full step-by-step Alexa, Siri, and Google guide to opting out of these things is on our website at joindeleteme.com/podcast. Find this episode; it's all there, thanks to Sarah Heward. Twenty minutes, and your home assistant goes from surveillance device to a dumb speaker that sets timers for you and stuff. This episode of What the Hack was produced by me and Andrew Steven, who also did the editing. Our theme music is by Andrew Steven. If you think you heard Ben Jordan's music in the mix, you're right.
There's some other stuff, but there's some Ben Jordan too. Check him out on Bandcamp or wherever you get your stuff. What the Hack is a production of DeleteMe, which was picked by the New York Times Wirecutter as the number one personal information removal service. You should be using it already. If you're not and you want to, well, you can. Here's what to do. Go to joindeleteme.com/WTH. That's joindeleteme.com/WTH, and get 20% off. I kid you not: 20% off. That's joindeleteme.com/WTH. Now, stay safe out there. See you around, and come back next week. We're going to be talking about Flock Safety.
Hiba Balfak
Not all darkness is dangerous. Sometimes it's the doorway to becoming whole. On the brand-new podcast The Shadow Sessions, hosted by me, Hiba Balfak, a psychologist and trauma expert, we shed light on the hidden corners of the human experience through raw, unfiltered conversations from the edge of healing. The Shadow Sessions invites you to do the deeper work that leads to real change. Follow The Shadow Sessions wherever you're listening now.
Episode 241: Surveillance In America, Pt 1: Somebody's Watching You
Date: March 3, 2026
Host: Beau Friedlander (DeleteMe)
Guests: Jason Kebler (404 Media), Ben Jordan (musician & YouTuber), Jay Stanley (ACLU)
In the first part of a two-part series, "What the Hack?" investigates the explosive growth of surveillance technologies in America—moving from classic spycraft to advanced, AI-powered systems embedded in everyday life. The discussion dives into the implications of “mass surveillance for profit,” the convergence of private tech companies with law enforcement, what citizens lose when constant surveillance becomes the norm, and what we can do to reclaim some measure of privacy.
Modern surveillance has shifted from secret microphones and hidden cameras to widespread digital surveillance and AI analysis of personal data.
Beau Friedlander notes how digital surveillance predated even the phrase "surveillance economy," beginning with cookies and trackers online:
"There's no such thing as total anonymity... Philip K. Dick once said there will come a time when it isn't they're spying on me through my phone anymore. Eventually, it'll just be my phone is spying on me. That time is now." (01:00)
Even so-called "anonymized" data can be easily re-identified:
Ben Jordan: "On average, knowing four places and times where someone was is enough to uniquely identify him... Four points is sufficient." (02:03)
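The "four points" claim can be illustrated with a toy simulation. This is not the underlying mobility study's methodology or data; every number here (population size, number of antenna cells, pings per person) is made up purely to show why uniqueness climbs so fast as you learn more (place, time) points about someone:

```python
# Toy re-identification sketch: how few space-time points single out
# one "anonymized" trace among many. All parameters are invented.
import random

random.seed(0)

N_USERS = 1000        # people in the dataset
N_CELLS = 50          # possible antenna cells (locations)
N_HOURS = 24 * 7      # one week of hourly buckets
PINGS = 40            # pings recorded per person

# Each user's trace is a set of (cell, hour) observations.
traces = [
    {(random.randrange(N_CELLS), random.randrange(N_HOURS)) for _ in range(PINGS)}
    for _ in range(N_USERS)
]

def fraction_unique(k: int) -> float:
    """Fraction of users whose trace is the ONLY one containing
    k points drawn at random from that same trace."""
    unique = 0
    for trace in traces:
        sample = random.sample(sorted(trace), k)
        matches = sum(1 for other in traces if all(p in other for p in sample))
        unique += (matches == 1)  # only the user's own trace matched
    return unique / N_USERS

for k in (1, 2, 4):
    print(f"{k} known point(s): {fraction_unique(k):.0%} of users uniquely identified")
```

With these toy numbers, one known point rarely pins anyone down, but by four points essentially every simulated trace is unique, which is the shape of the result the quote describes.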
Jason Kebler recounts Ring’s rise, police partnerships, and the slippery evolution into AI-driven, police-friendly surveillance:
"They incentivized police to pitch these doorbell cameras... giving them free cameras, discount codes... and police could request footage from anyone in that town." (08:15)
After a period of public backlash and program changes, Ring’s founder left and then returned—doubling down on “pro-cop” messaging and AI integration.
The Super Bowl ad for Ring’s “Search Party” feature, which networks cameras to find lost pets, caused mass public unease:
Beau Friedlander: "The first thought I had was, oh God, what if that was Ahmaud Arbery?... That's not what it's going to be used for." (07:08)
AI supercharges surveillance by allowing instant, mass-scale data analysis—changing what’s possible:
Jay Stanley (ACLU): "AI, if you have a pool of 10,000 hours of video... now you can ask an AI, 'find me somebody in a red sweatshirt who's carrying a briefcase,' and it will find those people for you." (12:29)
The narrative that these tools can "zero out crime" is misleading and dangerous:
Jason Kebler: "What does 'zero out crime' mean? It’s not going to zero out tax fraud, domestic violence, insider trading... But it eliminates crime that homeowners believe is a nuisance." (14:53)
The “I have nothing to hide” argument is dismantled:
Ben Jordan: "If you have nothing to hide, unlock your phone and give it to me... Well, no, I'm not gonna. Okay, so you do have something to hide." (17:32)
Surveillance chills creativity, self-exploration, and even the mundane joys of daily life, like swinging in the park:
Ben Jordan: "We need privacy to explore ourselves... Cameras rob us of that." (28:07)
The Hawthorne Effect:
"People behave differently when they're being observed or when they're being surveilled... Even if I'm not grading you, just knowing you’re being watched takes mental energy." (30:57)
As private companies become de facto arbiters of surveillance, constitutional protections are eroded:
Jay Stanley: "Private companies are being put in the middle of law enforcement in a way that's never happened before in history... It’s an end run around the Constitution because the Privacy Act and FOIA don't apply." (37:55)
Outdated privacy laws can't handle the new realities of interconnected, constant, AI-driven data collection.
Even law enforcement agencies often deploy these technologies ignorantly, sometimes breaking their own state laws with little consequence.
Consumer awareness is low; terms and privacy policies are unread, unmodifiable, and enforced only via take-it-or-leave-it agreements.
Users are often manipulated into opting in—sometimes via misleading notifications designed to encourage consent for expanded data collection.
Ben Jordan: "They're trying to trick you to agree to something that they know you're not going to read so they can advance the platform to get more of your data, which makes them more money." (35:50)
There’s some movement towards legal remedies (e.g., GDPR in the EU; “Fourth Amendment is Not for Sale Act” in US Congress).
Beau Friedlander:
"Surveillance used to be about spycraft... Now we're just all being spied on. Everyone maybe." (00:48)
Jay Stanley (ACLU):
"At the end of the day, surveillance is a question of power and freedom. When people surveil you, they have power over you." (12:18)
"Nobody wants to be watched." (34:19)
Ben Jordan:
"I have nothing to hide is like, such a privileged statement from someone who's never had their identity stolen... There's so many different scenarios where something that you thought was private ends up getting into the wrong hands." (18:48)
Jason Kebler:
"All of these cameras have been networked together... being added to a database... mapping your movements. That was unimaginable when these privacy laws were written." (39:37)
The episode makes clear that the expansion of surveillance—by private companies, often acting as proxies for law enforcement—poses profound threats to privacy, free expression, and the basic freedoms Americans take for granted. The technology, enabled by AI and cloud networking, is evolving faster than our legal and social norms. And often, the most meaningful “opt out” tools are hidden in lengthy terms and conditions people never read—or are actively circumvented via manipulative prompts.
Next week’s episode will dive deeper into another key player in this system: Flock Safety.
For detailed steps and more resources from this episode, visit joindeleteme.com/podcast.