Hard Fork: “We Asked Roblox’s C.E.O. About Child Safety. It Got Tense.”
Episode Date: November 21, 2025
Hosts: Kevin Roose, Casey Newton
Guest: Dave Baszucki, CEO and Co-founder of Roblox
Overview
In this episode, hosts Kevin Roose and Casey Newton conduct a direct and probing interview with Roblox CEO Dave Baszucki, focusing on child safety issues on the massively popular gaming platform. With headlines swirling about lawsuits and allegations related to child exploitation and platform safety, the conversation explores what Roblox is doing to address these challenges, the technological and policy shifts underway, and the broader implications for online spaces for young people. The dialogue is at times tense, with the hosts pressing Baszucki on past failures and current improvements.
Key Discussion Points & Insights
Roblox’s Scale and Demographic Realities
- Platform Growth: Roblox has over 150 million daily active users, with concurrent users sometimes exceeding 47 million (03:15).
- Age Range: Almost 40% of users are under 13, making Roblox one of the largest online gathering spaces for children—unlike most other major platforms (04:07).
- Baszucki on Demographics: He emphasizes that the fastest-growing user segments are now 21 and above, though acknowledges much of this data is “self-reported” (12:59, 13:36).
Child Safety Challenges & Lawsuits
- Recent Incidents: Stories of predators using Roblox to contact children and lure them to other platforms for exploitation (04:36).
- Legal Pressure: Multiple U.S. states have filed lawsuits alleging that Roblox enabled child sexual exploitation; more than 20 federal lawsuits were filed in 2025 alone (05:13).
- Policy Response: Roblox is rolling out new safety features, including age estimation via face scans to use chat, and tightening communication between age groups (05:41, 11:44).
The New Age-Gating and Facial Age Estimation
- Face Scan for Chat: Users must now undergo a facial age estimation (using AI) to access chat features, complementing other signals like behavioral patterns, photo ID uploads, and self-reported age (11:44).
- On Privacy: Baszucki says the scanned “image isn’t stored” and that the process is “ephemeral” (11:57).
- Combining Signals: “Mathematically, the more signals that are convoluted, the better the accuracy and the better we can get.” (11:50, Baszucki)
- Continuous Checks: Age checks may not be one-time; they can be ongoing based on suspicious behavior or signals (14:03).
Platform Vulnerabilities to Predators
- Historical Laxity: For years, adults could contact any minor on the platform, a design decision the hosts challenge (15:09).
- Baszucki’s Defense: Argues the company prioritized heavy text filtering, sometimes at the expense of user experience: “There’s a lot of Roblox memes and themes…everyone’s just saying hashtag, hashtag, hashtag.” (15:42)
- On Filter Evasion: Admits some sophisticated predators evade controls, but insists constant innovation and AI have led to continued improvements (16:37).
- Movement Off-Platform: Many predators try to move kids to platforms with less moderation; Baszucki claims Roblox blocks explicit attempts to exchange usernames, though admits “the techniques are much more sophisticated” now (18:46, 19:23).
Open Chat: Why Not Turn It Off?
- Hosts Push Back: “Why have a sort of open-ended text communication feature at all on a game that is very popular with young people?” (19:42, Newton)
- Baszucki’s Rationale: Believes communication is central to Roblox’s value, citing stories where isolation has been mitigated by platform friendships. Also claims Roblox’s safety and moderation surpass most in the industry (20:14).
Moderation, AI, and the Economics of Safety
- Manual vs Automated: Responding to Hindenburg Research’s allegation that Roblox’s spending on trust and safety had dropped, Baszucki says they’ve shifted from manual moderation to more effective automated, AI-driven systems: “Would you ask the same situation of someone who converted from hyper manual labor... to an assembly line?” (26:46, Baszucki)
- Skepticism: The hosts draw comparisons to Facebook’s failed content-moderation promises; Baszucki insists Roblox is different and that its scale is “mind-boggling” (29:53, 30:41).
- Transparency Limits: “We see all of those metrics internally, all the time. They’re not publicly shared.” (33:13)
Evaluating Success & Parental Responsibility
- Baszucki’s Measure: Roblox internally tracks filter rates and error rates, but at the end of the day Baszucki sees parents as “the ultimate arbiter of responsibility” (33:13, 34:19).
- Hosts' Concerns: Express frustration that tech platforms chase scale then scramble to solve safety issues after the fact, rarely being proactive enough (23:59).
Screen Time, Prediction Markets, and Broader Reflections
- Screen Time: Baszucki admits his own parenting started with strict screen limits, loosened over time, and believes there’s nuance between types, timing, and social context of screen use (36:33).
- Prediction Markets: Baszucki lightheartedly discusses his fandom for Polymarket and muses on how prediction games could have educational value on Roblox, though legal issues make that tricky (38:04–41:07).
- Memorable Light Moment: Newton jokes, “Start ‘em young. That’s what I always say when it comes to gambling.” (40:47)
Notable Quotes and Memorable Moments
On Age Verification & Safety
“We’re complementing [existing user signals] with a facial age estimation signal that literally, in an ephemeral way ... uses your phone, use your PC, scan an image, do a good AI estimation, image isn’t stored, and start using that to complement all the other things we do.”
— Dave Baszucki (11:44)
On Content Moderation at Scale
“The scale we’re at is absolutely mind-boggling right now. Like, absolutely mind-boggling. And every system, image review, text review, all of that—constantly getting better and better.”
— Dave Baszucki (30:41)
Host Critique on Platform Growth and Safety
“What a lot of parents want is some sense that people have a plan for what is gonna happen when the 25 million people show up... and [not to] wait till 20 years in existence to say maybe we should like sort these people into age bands...”
— Kevin Roose (24:07)
On Manual vs AI Safety Spend
“Would you ask the same situation of someone who converted from maybe hyper manual labor, making cars by hand, to an assembly line?... The assembly line might be infinitely better and at the same time not have so many people.”
— Dave Baszucki (26:46)
Baszucki on Parental Choice
“If a parent, out there for any reason, is not comfortable— their kid going to the park, not comfortable their kid playing with a certain toy, not comfortable being on Roblox— who am I to say, right?”
— Dave Baszucki (34:19)
Timestamps for Key Segments
- Platform Growth and Demographics – 03:15–04:36
- Safety Concerns & Lawsuits Overview – 04:36–05:41
- Age Estimation and Policy Shift – 10:24–14:50
- Historical Lax Moderation / Open Communication – 15:09–20:14
- Why Not Disable Chat? & Community Value – 19:42–21:38
- Moderation, Transparency, and Hindenburg Allegations – 25:52–28:31
- Content Moderation at Scale and Tech Optimism – 29:53–32:55
- On Metrics and Parental Responsibility – 33:13–34:19
- Screen Time Debate – 36:33–37:55
- Prediction Markets & Light Discussion – 38:04–41:23
- Closing Challenge: Will Roblox be safe for Newton’s son in a year? – 41:29–42:13
Episode Tone and Dynamics
- The tone is direct, at times adversarial, but grounded and respectful. Baszucki is on the defensive, but often leans into policy explanations and tech optimism.
- Several moments of levity and inside tech jokes, especially toward the end with prediction markets and childhood gambling references.
For Listeners Who Haven’t Tuned In
This episode offers a rare, direct interrogation of one of the tech CEOs most consequential to parents and policymakers today. It covers not just the granular details of platform safety changes but also larger questions about tech company responsibility, the realities of moderation at immense scale, and how to balance innovation with the protection of millions of children. The interview is both enlightening and, at moments, frustrating in its back-and-forth, making it a valuable listen for anyone concerned with kids’ safety online or the future of digital platforms catering to young people.
