Front Burner Presents: Deepfake Porn Empire
Episode 1: The Dawn of Fake Porn
Date: April 6, 2026
Host: Sam Cole (Reporter, 404 Media, Guest Host of Understood)
Featured Voices: Cutie Cinderella (Streamer), Walter Scheirer (Engineering Professor, Notre Dame), Ian Goodfellow (AI Researcher), TMFU (Anonymous Photoshop Porn Creator), Lux Luker / Carrie Pearson (Early Fake Porn Community Founder)
Main Theme
The episode traces the origins and explosive growth of non-consensual deepfake porn—a billion-click industry built on AI-generated images and stolen identities, chiefly of women. Host Sam Cole chronicles the technical, cultural, and social history of "fake porn," from 19th-century photographic trickery to today's AI-powered deepfakes, focusing on the story of streamer Cutie Cinderella as emblematic of the human toll—and the systemic indifference—surrounding this evolving online abuse.
Key Discussion Points & Insights
1. Personal Fallout: The Cutie Cinderella Incident
[01:26–05:06]
- Cutie Cinderella wakes up to find herself trending on Twitter after a fellow streamer, Atrioc, accidentally exposes tabs of deepfake porn—including fakes of women he knows—on his live stream.
- The images go viral across Discord, Reddit, and Twitter, flooding Cutie's inbox before she even grasps what’s happening.
- Cutie describes the shock:
- "It is so convincingly my body, but not my body. And holy shit, it hits you like a truck. You feel so violated." – Cutie [04:54]
- The episode frames Cutie’s story as a turning point in public awareness of deepfake porn’s victim impact.
2. A Brief History of Fake Photos and Digital Manipulation
[06:55–10:33]
- Walter Scheirer contextualizes fakery, dating it back to the earliest days of photography:
- "Basically, as soon as the camera is invented in the 19th century, people are faking their photos." – Scheirer [06:55]
- Examples:
- Hippolyte Bayard fakes his own death for publicity in 1840.
- The Cottingley Fairies (1917) deceive even Sherlock Holmes’ creator, Arthur Conan Doyle.
- The digital era begins with Photoshop (1990), democratizing powerful image manipulation—immediately co-opted for pornography.
3. Predecessors to AI: The Photoshop Porn Subculture
[10:33–14:41]
- The 1990s-2000s: Usenet groups and early websites cultivate communities sharing celebrity nude “fakes,” led by figures like Lux Luker (Carrie Pearson).
- "We all love seeing the heads of some of our favorite stars and other prominent people placed on top of a nude model. The sheer godlike power of exposing them to the world for our own fantasies..." – Lux Luker [12:35]
- These forums established rules against minors—but also kept chilling lists of girls’ birthdays so members would know when they became “eligible.”
4. Hollywood, CGI, and the Blurring of Reality
[15:04–16:39]
- Public trust in images further erodes with advances in CGI: films like Jurassic Park (1993) and Furious 7 (2015, Paul Walker facial replacement) unsettle ideas about what’s real vs. digital.
5. The Technical Breakthrough: Generative Adversarial Networks (GANs)
[18:31–22:03]
- In 2014, Montreal grad student Ian Goodfellow invents GANs, allowing AI to generate photorealistic images:
- "The generator makes images that aren't real at all, and the discriminator doesn't know what's real or fake... Eventually, the generator learns to make very realistic images.” – Goodfellow [20:47]
- Goodfellow hopes GANs will solve scientific problems—not realizing their transformative (and sinister) potential for porn.
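The adversarial setup Goodfellow describes—a generator and discriminator trained against each other—can be illustrated with a deliberately tiny sketch. This toy example (not from the episode; all parameters and the 1-D Gaussian "real" data are illustrative assumptions) trains a linear generator against a logistic discriminator until the fakes statistically resemble the real samples:

```python
import numpy as np

# Toy GAN sketch: the "real" data is a 1-D Gaussian centred at 4.
# The generator (a * z + b) learns to mimic it; the discriminator
# (sigmoid(w * x + c)) learns to tell real from fake. All values
# here are illustrative assumptions, not anything from the episode.

rng = np.random.default_rng(0)

TARGET_MEAN = 4.0          # mean of the "real" distribution N(4, 1)
a, b = 1.0, 0.0            # generator params: fake = a * z + b, z ~ N(0, 1)
w, c = 0.0, 0.0            # discriminator params: D(x) = sigmoid(w * x + c)
lr, batch, steps = 0.02, 64, 3000

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

for _ in range(steps):
    real = rng.normal(TARGET_MEAN, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) + np.mean(-d_fake * fake))
    c += lr * (np.mean(1 - d_real) + np.mean(-d_fake))

    # Generator step: push D(fake) toward 1 (non-saturating loss),
    # i.e. learn to fool the current discriminator.
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 10_000) + b))
print(f"generator mean after training: {fake_mean:.2f} (target {TARGET_MEAN})")
```

The generator starts producing samples centred at 0 and, guided only by the discriminator's feedback, drifts toward the real distribution—the same dynamic that, scaled up to millions of parameters and image data, yields photorealistic fakes.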
6. Deepfakes Emerge: The Pornification of GANs
[22:16–25:38]
- Open-source releases of GAN code enable rapid, uncontrolled diffusion—Reddit user “deepfakes” posts the first AI-generated celebrity porn videos (Gal Gadot, Scarlett Johansson, etc.) in late 2017.
- Sam Cole recalls first breaking this story. The tech world shrugs:
- "Within the computer vision community, Deepfakes comes out. No one really wants to comment on that ... it's not our problem." – Schreier [25:29]
7. Public Panic: The Wrong Deepfake Problem
[25:52–28:25]
- Authorities and tech media focus on political deepfakes (“fake Obama”/Jordan Peele, Zelenskyy surrendering, Trump nuclear threat)—while deepfake porn is ignored, despite being the overwhelming majority of usage.
- "People were paying attention to deep fakes, just not porn." – Sam Cole [25:52]
- "Some people are genuinely fooled. Right. Which is. Which is a problem. Right. But for the most part... they're almost always released on, like, an anonymous account..." – Schreier [28:04]
8. Deepfake Porn: The Human Cost
[28:25–34:51]
- For victims, what counts is the violation, not the technical unreality.
- "This is what pain looks like. This is what it looks like. The Internet, the constant exploitation and objectification of women. It's exhausting. It's exhausting." – Cutie Cinderella [30:12]
- Cutie faces relentless harassment, resurfaced trauma, family shame, and persistent online abuse. She explains the long-term impact and the futility of control:
- "Something that's really important to me, is it to be known that like I am not opposed to sex work. I just don't want to be a sex worker." – Cutie [32:24]
- She also describes explaining the incident to her family, and the emotional fallout that followed: [33:02–34:51]
9. Aftermath and Moral Reckoning
[34:51–36:57]
- The Atrioc scandal marks a turn: someone caught paying to view these images is forced to apologize, yet the deeper chain of creation remains hidden.
- "The fallout came quickly. Atrioc streamed a tearful apology, his wife in the background, also tearful. Later, he donated $60,000 to a law firm to help with takedowns..." – Sam Cole [35:53]
- Despite the apologies, most of the content remains online; Cutie and Atrioc continue to work together, a professional necessity that persists amid unresolved harm.
10. The Scope Today: Anyone Can Be Targeted
[36:57–38:24]
- The episode notes the shifting targets of deepfake porn—from celebrities to ordinary people. The makers remain anonymous, and the episode hints at a global kingpin, “Mr. Deepfakes.”
- "Anyone can make a deepfake of anyone, which means the person uploading it could be your neighbor, your coworker, your best friend. And the person in the video could be you." – Sam Cole [36:57]
Notable Quotes and Moments
- On personal violation:
- "It is so convincingly my body, but not my body. And holy shit, it hits you like a truck."
— Cutie Cinderella [04:54]
- "It is so convincingly my body, but not my body. And holy shit, it hits you like a truck."
- On early technology’s moral sidestepping:
- "Within the computer vision community, Deepfakes comes out. No one really wants to comment on that ... it's not our problem."
— Walter Scheirer [25:29]
- "Within the computer vision community, Deepfakes comes out. No one really wants to comment on that ... it's not our problem."
- On emotional trauma:
- "It was the same feeling, like where you feel guilty, you feel dirty, you feel what just happened... it makes that resurface."
— Cutie Cinderella [33:02]
- "It was the same feeling, like where you feel guilty, you feel dirty, you feel what just happened... it makes that resurface."
- On ubiquity of threat:
- "There is no woman in the world who is safe from this technology."
— Jayme Poisson (Front Burner Host) [37:37]
- "There is no woman in the world who is safe from this technology."
Important Timestamps
| Timestamp | Segment/Topic |
|:---------:|:--------------|
| 01:26–05:06 | Cutie Cinderella’s discovery and initial reaction |
| 06:55–10:33 | History of fake photography and the Photoshop era |
| 12:35 | Lux Luker on the early fake porn community |
| 15:04–16:39 | Hollywood CGI and the blurring of reality |
| 18:31–22:03 | Ian Goodfellow invents GANs |
| 22:16–25:38 | Reddit’s “deepfakes” and the first AI porn videos |
| 25:52–28:25 | Political deepfakes and public panic |
| 28:25–34:51 | Victim experience, family impact, and online abuse |
| 34:51–36:57 | Atrioc’s apology and the limits of accountability |
| 36:57–38:24 | Escalating anonymity, ubiquity, and the coming investigation |
Tone and Approach
The episode is investigative yet personal, blending deep reporting with the raw, unfiltered voices of both victims and creators. The narrative refuses to sensationalize, instead foregrounding the lived pain, professional complicity, and systemic failures that allow this abuse to proliferate. Technical history is woven with culture and emotion, pushing listeners to engage with the uncomfortable reality: anyone can be a victim, and the chasm between technological optimism and harm remains vast.
Looking Ahead
The closing moments lay out the season’s quest: tracing the global investigation to the Canadian kingpin behind the world’s biggest deepfake porn hub, “Mr. Deepfakes,” and the efforts to hold such actors accountable.
For listeners seeking to understand deepfakes—not only as a technological marvel, but as a tool of violation and exploitation—this episode offers a clear, urgent, and empathetic primer on how we got here, and what it means for all of us.
