Mobile Dev Memo Podcast
Season 7, Episode 12: Considering the Future of Section 230 (with Ben Sperry)
Release Date: April 7, 2026
Host: Eric Seufert
Guest: Ben Sperry, Senior Scholar of Innovation Policy at the International Center for Law and Economics
Episode Overview
This episode explores the recent landmark jury decisions against Meta and Google regarding social media platform liability, the bypassing of Section 230 protections, and the far-reaching implications of these cases for online speech, algorithmic design, and the broader consumer internet. Eric Seufert and Ben Sperry examine the legal theories deployed, the potential chilling effects on free speech, the operational impact on tech companies, and the future of platform immunity.
Key Discussion Points & Insights
1. Background and Introduction
- Ben Sperry introduces himself as a Senior Scholar of Innovation Policy at ICLE, specializing in civil liberties, online speech, the First Amendment, Section 230, and products liability as applied to online platforms. He authored an amicus brief in Massachusetts v. Meta that made arguments similar to those in the cases discussed.
- [01:36–02:46]
"Applying products liability to online speech platforms is a difficult fit. It could likely result in a lot of collateral censorship...when those platforms seek to avoid liability for user generated speech by restricting the speech itself."
— Ben Sperry
2. Overview of the California and New Mexico Cases
- California (KGM vs. Meta):
- Private products liability lawsuit for injuries suffered as a minor, alleging addictive design features.
- Considered a bellwether due to 1,600 similar consolidated plaintiffs in California and 10,000+ nationwide.
- Meta and Google ordered to pay $3 million in compensatory damages and $3 million in punitive damages; Meta responsible for 70%.
- New Mexico vs. Meta:
- Brought by the state under consumer protection law (Unfair Practices Act).
- Focused on Meta failing to protect children from predators and misleading about platform safety.
- Jury awarded $375 million in damages.
- [03:24–05:11]
"While these are actually not that big of amounts for Meta or Google ... tens of thousands of similar cases ... could result in substantially higher damage amounts if the same outcome results."
— Ben Sperry
3. Legal Theories and Bypassing Section 230
- Design vs. Content: Both cases focused on platform design features (not user-generated content).
- California: Products liability tied to addictive features (autoplay, infinite scroll, etc.).
- New Mexico: Consumer protection claims, especially regarding facilitation of predator activity and harm to minors via design.
- Bypassing Section 230:
- Courts accepted framing that the cases targeted the platform's conduct (design) rather than third-party content.
- [07:56–10:31]
"Both courts accepted the framing that these cases were not about the underlying content, but about the conduct of the social media companies and how they designed their platforms."
— Ben Sperry
- Contrast with Federal Precedent:
- Many federal courts previously applied Section 230 to claims ultimately about third-party content.
- Example: Netflix's algorithmic recommendation case for “13 Reasons Why” was dismissed.
4. Potential for Widespread Litigation
- Ripple Effects:
- Features like algorithmic recommendations, infinite scroll, etc., are common across many consumer-facing tech companies.
- Large platforms like Meta and Google can likely absorb compliance costs or damages, but these pose an existential threat to smaller platforms and new entrants.
- Section 230 was intended to guard against "ruinous litigation."
- [11:22–14:10]
“If the courts continue to follow this line of reasoning...there's going to be a lot of litigation. And the question will then be how do products liability type suits interact with the First Amendment?”
— Ben Sperry
5. Practical Compliance and Product Changes
- Immediate Impact:
- Companies plan to appeal; likely to request stays on product changes.
- Already considering age-gating, separate products for minors, and more stringent age verification.
- The magnitude of future damages may drive practical product changes (analogous to McDonald's lowering coffee temperature after a famous lawsuit).
- [14:28–17:38]
"It's probably the threat that if other cases go the same way and you start to multiply the plaintiffs many, many times over, that might actually make them change what they do."
— Ben Sperry
6. Implications for Free Speech
- Platforms Redefined:
- Courts treated social platforms as potentially dangerous engineered products, not as forums for speech.
- Likely result: Overbroad removal of lawful, protected speech (including content aimed at minors).
- [18:32–21:30]
"Speech platforms will probably remove a lot of speech that is protected by the First Amendment law if it could be seen as potentially harmful to minors."
— Ben Sperry
- Age Gating Concerns:
- Stronger age verification could exclude minors from platforms for fear of liability, despite legal precedent protecting their free speech.
"It seems odd that you could get the same effect by alleging that it's negligent, or somehow a product's liability ... seems a little incongruous. But that's the result that could end up happening..."
— Ben Sperry
7. Why Start with General-Purpose Social Media, Not Child-Focused Platforms?
- Questions raised about why litigation focuses on general platforms like Facebook rather than child-centric ones like Roblox, despite shared mechanisms.
- Sperry argues harm prevention (e.g., predator access) should be regulators’ core concern, rather than algorithmic design per se.
- [21:30–26:26]
“Maybe they should really have been focusing on...are these companies doing enough to prevent unwanted actions by adults to go after kids and use their platforms to do so?”
— Ben Sperry
- Motives of Litigators:
- Tendency to pursue deep-pocketed, high-profile targets rather than the worst actors.
- [28:20]
"When you have a hammer, everything looks like a nail... to go after the deep pockets and quite frankly, entities that don't have the best PR right now..."
— Ben Sperry
8. Risk of Over-Censorship and Chilling Effects
- Collateral Censorship:
- Platforms, fearing liability, might over-censor speech (including satire, inside jokes, heated debate).
- Algorithms and even human moderators can't reliably make nuanced legal distinctions in real time.
- [30:00–34:11]
"If the underlying liability for, like, bullying, starts to extend to them, both the protected and unprotected speech will likely just be removed."
— Ben Sperry
- Loss of Section 230:
- Without Section 230, platforms may err on the side of removing any contentious or risky speech.
"That's what Section 230 is supposed to prevent, literally. But that's the world we could end up being in if that's not the case anymore."
— Ben Sperry
9. Broader Impact on the Consumer Internet
- If Appeals Fail:
- End of platform immunity would embolden regulators and encourage covert government influence (“backdoor censorship”).
- Threat of litigation could reshape all user-generated content platforms (search engines, e-commerce reviews, streaming, podcast apps) to limit risky features or content.
- [36:26–38:59]
"If appeals courts don't reverse or limit these rulings, I think free expression online is in trouble... Backdoor censorship could become the norm if social media platforms increasingly feel the need to change how they do business to stay on the good side of regulators."
— Ben Sperry
"It goes way beyond social media if it becomes about algorithmic recommendations...It could be disastrous not only for just speech, but just for how we do things online that we've come to accept."
— Ben Sperry
10. E-commerce, Chatbots, and Generative AI
- Ripple Effects for Other Industries:
- E-commerce platforms like Amazon could face liability based on recommendations or reviews.
- Generative AI and chatbots (LLMs) pose new questions about who is responsible for the speech they output, with Section 230 protections possibly not applying.
- [40:12–42:05]
"LLM chat bots are going to be really interesting ... I don't 100% think that chatbots would receive Section 230 immunity because they seem to be...more analogous to...integrates all that information into new speech, it seems...it might be their own speech to some degree."
— Ben Sperry
Notable Quotes & Memorable Moments
- On Section 230 and Product Design:
"The fact that a design feature like infinite scroll impelled a user to continue to consume content that proved harmful does not mean that there can be no liability for harm arising from the design feature itself."
— Ben Sperry (paraphrasing the California judge) [08:25]
- On Over-Censorship:
"Online platforms are unlikely to make these fine distinctions. If the underlying liability...starts to extend to them, both the protected and unprotected speech will likely just be removed."
— Ben Sperry [31:20]
- On the Downstream Consequences:
"Backdoor censorship could become the norm if social media platforms increasingly feel the need to change how they do business to stay on the good side of regulators. Even without actual legislation or public administrative action or an actual lawsuit, just a raised eyebrow could suffice."
— Ben Sperry [36:26]
Timestamps for Important Segments
- Ben Sperry’s Background & Research: 01:36–02:48
- Case Overview (California & New Mexico): 03:24–05:11
- Legal Arguments and Section 230 Maneuvering: 07:56–10:31
- Implications for Other Tech Companies: 11:22–14:10
- Compliance, Product Changes & “Hot Coffee” Parable: 14:28–17:38
- Free Speech Ramifications: 18:32–21:30
- Algorithmic Curation & Over-Censorship: 30:00–34:11
- Impact If Appeals Fail & Future of Platform Immunity: 36:26–38:59
- Generative AI, Chatbots, and Novel Section 230 Questions: 40:12–42:05
Conclusion
Sperry urges careful judicial review and intervention by appellate courts, highlighting the massively disruptive potential of these verdicts for both free speech and market competition online. The fate of Section 230—and the openness of the internet as we know it—may hinge on what happens next in these cases.
Further Reading:
- Ben Sperry’s article: "Treating Speech as a Bug, Not a Feature" on Truth on the Market (linked in episode show notes)
- ICLE publications and resources on innovation, regulation, and speech
