The 404 Media Podcast — Episode Summary
Episode Title: How Benn Jordan Discovered Flock's Cameras Were Left Streaming to the Internet
Published: January 12, 2026
Host: 404 Media
Guest: Benn Jordan (musician and YouTuber known for deep-dive tech investigations)
Overview
This episode dives into the alarming discovery by YouTuber Benn Jordan, who found that Flock's "Condor" surveillance cameras, made by the company best known for its automated license plate readers, were live-streaming video, unencrypted and without password protection, directly to the public internet. The conversation explores how the vulnerability was found, what it reveals about Flock's handling of security, and wider issues of mass surveillance, privacy, and accountability.
Key Discussion Points and Insights
1. Background of the Flock Camera Incident
- Initial Discovery: Benn Jordan, known for his deep-dive tech and science YouTube content, stumbled upon live, unsecured feeds from Flock’s Condor cameras while using Shodan, a search engine for internet-connected devices.
- Nature of Cameras: Flock’s “Condor” cameras are advanced PTZ (pan-tilt-zoom) devices, designed not just for license plate reading, but specifically for tracking people (00:04–02:30).
- Locations: Exposed cameras were found across the U.S. at malls, playgrounds, skate parks, and bike paths, often in everyday settings where people do not expect to be watched (02:30–03:45).
“We found between 40 and 60 of these live streaming throughout the United States. We've seen them stationed at playgrounds, we've seen them stationed at malls, we've seen them stationed on bike paths.” — 404 Media Host (03:10)
2. Flock’s Response
- Flock characterized the exposure as an "isolated configuration issue"—a “troubleshooting only debug interface" temporarily left public, with content "comparable to what can be observed from a public roadway."
- Both the 404 Media journalists and Jordan push back: the data actually exposed was far more sensitive than that (03:50–04:40).
3. Benn Jordan’s Motivation & Perspective
- Background: Jordan’s channel pivoted from sound synthesis to broader tech and surveillance issues after seeing how pervasive Flock cameras are in Atlanta (05:10–06:01).
- On Privacy: Jordan describes himself as a "left libertarian":
“I truly do believe that privacy is a human right ... it is there at the very fiber of our ability to live out our own destiny.” — Benn Jordan (06:20)
4. Viral Awareness and Misrepresentation
- Jordan’s and 404 Media's work on Flock has gone viral on social media, sometimes in the form of inaccurate summaries.
- There is shared concern about details being misconstrued and about the company's aggressive legal tactics toward critics (08:13–10:47).
"The bigger concern is that this company is extremely litigious and kind of malicious, in my opinion." — Benn Jordan (08:48)
5. How the Vulnerability Was Found
- Late-night Shodan searches led to the discovery of streams and admin panels for Condor cameras, accessible with no login and controllable by anyone (11:02–12:45); a brief sketch of this kind of Shodan search appears after this section.
- Jordan shared findings with researcher John Gainsack, who then found even more exposed cameras.
“We were just seeing everything from playgrounds to parking lots ... Honestly, we probably saw more stuff that didn't have cars in it than stuff that did.” — Benn Jordan (12:47)
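For readers unfamiliar with Shodan, the sketch below shows what a basic search against its API looks like in Python using the official shodan client. It is illustrative only: the placeholder API key, the query string, and the fields printed are assumptions for demonstration, not the actual search Jordan and Gainsack ran.

```python
# Minimal sketch of a Shodan API search (illustrative only; not Jordan's actual query).
import shodan

api = shodan.Shodan("YOUR_API_KEY")  # hypothetical placeholder; a real API key is required

try:
    # Example query for internet-exposed camera interfaces; the filter terms are assumptions.
    results = api.search('title:"webcam" port:8080')
    print(f"Total matches: {results['total']}")
    for match in results["matches"][:10]:
        # Each match includes the device's IP address, port, and hosting organization.
        print(match["ip_str"], match["port"], match.get("org", "unknown"))
except shodan.APIError as err:
    print(f"Shodan API error: {err}")
```

Shodan only indexes banners from devices that are already reachable on the public internet, which is why both Jordan and the hosts stress later in the episode that anything visible this way is equally visible to malicious actors.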
6. Human Impact and the Banality of Surveillance
- The hosts and Jordan reflect on the unsettling experience of being able to surveil ordinary people in everyday settings, such as a man enjoying a swing at a playground, unaware he was being watched (13:38–16:47).
- Jordan recounts visiting the bike path and playground in person, feeling "immediately scared" by the dystopian reality of AI-driven tracking (14:20–15:50).
“It created this rift with me… Like, if I lived nearby, I wouldn’t go on the bike path, period. This is creepy.” — Benn Jordan (15:07)
“It showed what a person does when they have an expectation of privacy. … that person, had he known that anybody was actively watching him, it's doubtful that he would have enjoyed himself in that moment.” — Benn Jordan (16:21)
7. AI-Driven Surveillance vs. Traditional Policing
- The group discusses how constant, AI-organized surveillance is a leap beyond traditional policing. AI enables tracking everyone by default, which could lead to discriminatory outcomes and over-policing in monitored areas (17:48–20:04).
“Having everybody's activity in a giant ... data storage thing that's being organized by AI versus having the police have a suspect and getting a warrant ... it's easy to see how quickly this gets out of control.” — Benn Jordan (18:00)
8. Security Failings and Worsening Issues
- Access included not only live streams, but also admin controls, logs, stored footage, and the ability to delete or manipulate sensitive data.
- Jordan: “I don't know how it could really be any worse, to be honest” (25:25–26:45).
- The cameras run a full Android OS, which Jordan argues is overpowered for the job and less secure than it should be; he says they should be replaced with custom, minimal systems (27:00–28:07).
- Jordan suspects Flock is aware of these issues but is “just pushing on until they can have an IPO,” and will deal with the fallout afterward.
"I honestly can't think of any way where it could be worse ... most cameras, in my opinion, need to be replaced ... with something that's not running Android." — Benn Jordan (26:55)
9. Creating Accountability: What Would It Take?
- The scope of the vulnerability is national, affecting everyone from mall-goers to children at playgrounds.
- Both agree: if these flaws are visible through legal research tools like Shodan, it’s almost certain that malicious actors have found them too (33:51–36:07).
- Laws restricting what researchers can legally access prevent reporting on even deeper vulnerabilities, meaning the public only learns the “tip of the iceberg.”
“If you see a security vulnerability on my video or if you hear about it generally in the press, anything like that, chances are that things are a lot worse, because I can only tell you that about things that I can legally access.” — Benn Jordan (34:52)
Notable Quotes & Memorable Moments
- “This Flock camera leak is like Netflix for stalkers.” — Benn Jordan (paraphrased from YouTube, referenced at 03:55)
- “It was kind of funny because I was recording ... I was reading Flock’s statements contradicting exactly what you would be seeing with your own eyes watching me read it.” — Benn Jordan (31:41)
- “What’s it going to take for somebody to stand up to this? ... The answer I don’t want to hear is it’s going to take a child getting abducted or a woman being assaulted. … We could see the risk right here.” — Benn Jordan (33:51)
Timestamps for Key Segments
- 00:04–02:30 — Introduction to the Flock Condor camera live stream vulnerability
- 03:10 — Scope and examples of exposed cameras across the U.S.
- 05:10–07:03 — Benn Jordan’s motivation and personal story
- 08:13–10:47 — Impact of viral coverage, concerns with misrepresentation, and Flock’s legal threats
- 11:02–12:45 — Jordan describes how he found the open cameras and what he saw
- 13:38–17:48 — Human impact: personal stories, fear, and the reality of being watched
- 17:48–20:04 — AI surveillance vs. human policing: dangers and social effects
- 25:25–28:07 — Technical vulnerabilities & Flock’s inadequate response
- 31:41–33:51 — Closing the loop: Experiencing the creepiness firsthand, and the need for stronger response
- 34:52–36:07 — Why public disclosure is just the tip of the iceberg
Tone and Closing Thoughts
The tone throughout is deeply concerned, at times incredulous and exasperated, but always rooted in a drive for public understanding and accountability. Both the host and Benn Jordan stress the urgency of better privacy protections, regulatory oversight, and transparency in the face of rapidly expanding, AI-driven surveillance systems.
Further Resources:
- Benn Jordan on YouTube
- 404 Media Flock reporting: 404media.co
End of summary.
