The 404 Media Podcast
Episode: Exposing the People Behind Deepfake Porn Sites with Bellingcat Investigator Kolina Koltai
Date: January 26, 2026
Host: Sam (404 Media)
Guest: Kolina Koltai (Senior Researcher, Bellingcat)
Episode Overview
In this episode, Sam from 404 Media interviews Kolina Koltai, a senior researcher at the investigative outlet Bellingcat, about her recent research exposing the operators behind deepfake porn websites. Koltai discusses her investigative techniques, the nature and impact of these sites, and the broader implications of non-consensual intimate imagery (NCII), especially as it relates to minors. The conversation explores the challenges of accountability in the online deepfake ecosystem, the absence of meaningful regulation, and the crucial role of journalism in tracking, exposing, and shutting down malicious actors.
Guest Background: Kolina Koltai (01:00–07:00)
Koltai’s Path to Deepfake Investigations
- Started her career at NASA Ames researching trust in automated technology, studying how humans interact with automated systems.
- Transitioned into studying vaccine misinformation for her PhD at the University of Washington, focusing on how social media and digital platforms influence trust and information spread.
- Noted the mismatch between the speed of academic research and the rapidly evolving landscape of online misinformation and disinformation:
"I felt like it wasn't enough. It wasn't at the speed that misinformation was being produced and spreading online and we needed more." (05:12, Koltai)
- Briefly worked on Community Notes at Twitter (before it became X), then joined Bellingcat around 2022 as generative AI image technology was rising.
- Entered deepfake investigation initially through a concern for AI's role in misinformation but quickly realized "the bigger harm... is actually non-consensual deepfake pornography." (06:00, Koltai)
The Rise of Deepfake Porn & Ongoing Harm (07:01–09:08)
- Despite years of attention, the issue is growing worse:
"Surely in six years... we would have come to some kind of technological or social consensus on where we stand as a society... and it just has gotten so much worse since then." (07:09, Sam)
- Most recent, high-profile example: X's (formerly Twitter’s) AI platform Grok generating non-consensual deepfake porn, including of minors—a problem anticipated by experts long before mainstream awareness.
Exposing the Operators: The Investigation (09:09–20:00)
Focus on Two Deepfake Sites
- Koltai describes her investigation targeting two notable sites: Reface Porn and Deepfake Porn, suspected to be owned (alongside several smaller sites) by a Hungarian man named Mark Reson.
- Chose these sites based on popularity in AI-porn “top 100” lists and how straightforward they were to investigate.
Step-by-Step: Unmasking the Owner
- Domain Registration/Historical WHOIS (12:16)
- Searched historical domain records for early, unredacted ownership data; quickly found Reson's business, Facitic, listed as the registrant.
- Google Analytics Tag Matching
- Compared the Google Analytics tag IDs embedded in the deepfake porn sites with those on Reson's other, legitimate ventures to establish common ownership.
- Business Records
- Searched UK Companies House and other public registries for corporate overlaps.
- Leaked Data Sets/Email Cross-referencing
- Mapped reused email addresses and passwords across sites and found overlapping user data in leaked credential dumps.
"...he used the same password like many people do across multiple sites, which I've done as well." (17:45, Koltai)
- Test Payments/Rogue Wallets
- Attempted to purchase credits on the sites, tracing the transactions through payment processors such as PurWallet.
- Stumbled upon full dashboards in PurWallet showing sales, associated emails, and business ties, all lining up with Reson and his businesses.
"...on Pur Wallet's email to me...they have a link to a dashboard so you can...see what website domains are associated with Pure Wallet, which included...AI Remaker Me, which...listed [Reson's] businesses publicly." (18:47, Koltai)
- Financial Windfall
- Discovered at least $300,000 USD earned through just one payment processor, likely a small part of total revenue.
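The Google Analytics tag-matching step described above is a common open-source technique: if two otherwise unrelated domains embed the same analytics tag ID, they are likely administered by the same operator. A minimal sketch of the idea, using entirely hypothetical domains and HTML snippets (none of these are the sites from the investigation):

```python
import re
from collections import defaultdict

# Hypothetical page sources for illustration; in practice these
# would be fetched over HTTP or pulled from web archives.
PAGES = {
    "siteA.example": '<script src="https://www.googletagmanager.com/gtag/js?id=G-ABC123XYZ"></script>',
    "siteB.example": "<script>gtag('config', 'G-ABC123XYZ');</script>",
    "siteC.example": "<script>ga('create', 'UA-99999999-1', 'auto');</script>",
}

# Matches both modern GA4 (G-...) and legacy Universal Analytics (UA-...) IDs.
TAG_RE = re.compile(r"\b(G-[A-Z0-9]{6,12}|UA-\d{4,10}-\d{1,4})\b")

def extract_tags(html: str) -> set[str]:
    """Return the set of Google Analytics tag IDs found in a page's HTML."""
    return set(TAG_RE.findall(html))

def group_sites_by_tag(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each tag ID to the set of domains that embed it."""
    groups: defaultdict[str, set[str]] = defaultdict(set)
    for domain, html in pages.items():
        for tag in extract_tags(html):
            groups[tag].add(domain)
    return dict(groups)

# Domains sharing a tag ID are candidates for common ownership.
linked = {tag: sites for tag, sites in group_sites_by_tag(PAGES).items() if len(sites) > 1}
print(linked)
```

A shared tag is a lead, not proof: it still has to be corroborated against registration records, business filings, and payment data, as the investigation did.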
The Nature and Impact of Deepfake Pornography (20:00–24:00)
- The sites blatantly advertised themselves (“the very first thing you’re presented with is a woman giving a blowjob...her face switching to different women’s faces”—20:53, Koltai).
- No safeguards against CSAM (child sexual abuse material); sites accepted AI-generated images of “teenage girls and of young girls...like 5, 6, 7, it would be my guess.”
"This site had like no protections on it." (22:16, Koltai)
Reaching Out to the Operator & Aftermath (28:59–34:42)
Attempting Contact
- Sent multiple emails requesting comment to all of Reson's linked addresses; received no reply.
- Sites and supporting infrastructure began vanishing after the outreach, suggesting the owner received and reacted to it:
"...the owner of the site requested that those [Wayback Machine] sites...were removed." (30:16, Koltai) "...he very clearly was, I think ashamed and tried to hide a little bit that he was identified." (31:00, Koltai)
Emotional Impact and Site Takedown
- Satisfaction and sense of impact when a site vanishes as a result of an investigation:
"Ultimately that's one of the things I want to be the end result of these investigations. I want the sites to come down. I don't want people to be able to use that site for any additional services. I don't want them to make money on it." (32:48, Koltai)
- Contrasts this with frustration over "zombie" sites: sites whose operators disappear while the harmful content remains online, unmaintained.
Parallel Case: The “Mr. Deepfakes” Investigation (34:42–38:00)
- Bellingcat, CBC, and partners unmasked the man behind Mr. Deepfakes—a Canadian pharmacist, David Doe. He, too, scrubbed his digital presence and took the site offline after contact.
- The investigation was years in the making, including in-person attempts to get comment; CBC filmed Doe at home, refusing to talk.
“It took until him being cornered by CBC...for the site to come down...We don't threaten to say you should turn off the site...We just say, 'hey, this is what we're publishing. Is there anything you want to say for yourself?' And they decide to run and hide." (38:40, Koltai)
Broader Harm
- The site’s forum enabled requests for custom deepfakes (“bespoke materials”) of non-celebrities—schoolmates, friends, journalists.
- Victims’ images linger and circulate indefinitely online; impact is impossible to fully reverse, but shutting down production and central storage is major progress.
Why This Reporting Matters (37:25–43:41)
- Accountability only comes through sustained, high-effort journalism, not authorities or laws (which often do not cover NCII or deepfake porn).
- Koltai argues that “anyone is capable of doing these types of investigations...because it is open source.”
- For Koltai, personal empowerment and agency comes from creating measurable impact through investigative work:
“This work gives me some power, some agency that something can be done by identifying these owners. They get so ashamed, so scared that they turn off the site. You know, they run and hide. Those sites die. And that is an impact that gets me up every single day." (42:19, Koltai)
The Legal and Regulatory Vacuum (45:25–49:50)
- Current legislative focus is on criminalizing creation and distribution of NCII—not on site operators or profiteers:
"Owning a website like this is not illegal. There is no consequences for that. And to me, that is insane..." (47:06, Koltai)
- Need to “add friction” and penalties for those who profit, not just those who produce content.
- Stresses the disconnect: “We give these guys more rights and protections than they have afforded the women that are victims of their sites...” (46:53, Koltai)
Notable Quotes & Memorable Moments
- On the endless, compounding harm:
"...as much as a whack-a-mole game it could feel between these big sites and the small sites, we need more people doing this work. I'm just one woman, we're just one organization." (41:52, Koltai)
- On open source investigation & collective action:
"Anyone is capable of doing these types of investigations because it is open source. You know, we're using things that are available on the Internet out there to be able to investigate the ownership of these sites." (42:58, Koltai)
- On legal inaction:
"...the major gap in legislation and consequences is that these websites are not illegal. This kind of technology has zero regulation, zero laws. Owning a website like this is not illegal. There is no consequences for that. And to me, that is insane..." (47:06, Koltai)
- On the bar for public outrage:
"...if AI CSAM isn't the line, isn't the bar, then where is it? I mean, that is usually like, you know, non consensual pornography of minors is usually where everyone's on saying that, hey, that's not good." (45:51, Koltai)
Closing Thoughts (49:51–End)
- There is an urgent need for both legislative action and collective public pressure; shame and exposure remain the only reliably effective deterrent—at least for now.
- Koltai urges more journalists and technologists to join in open-source investigations, emphasizing shared responsibility and the real, measurable difference such work can make.
[End of Summary]
For more reporting, subscribe to 404 Media and follow Bellingcat for continued investigations into online harm and technological exploitation.
