The Monopoly Report – Episode 60
Title: Did the CJEU just break the Internet?
Host: Alan Chapell
Guest: Professor Daphne Keller (Stanford Law School, Platform Regulation)
Date: December 24, 2025
Episode Overview
This episode dives into the December 2025 ruling by the Court of Justice of the European Union (CJEU) in the Russmedia case, a decision that could reshape how online platforms approach liability for user-posted content, particularly at the intersection of the GDPR and the longstanding EU safe harbor for intermediaries. Alan Chapell invites Professor Daphne Keller to explain what the decision means for digital media, ad tech, and user rights, to dissect how the court's move upends familiar structures of platform immunity, and to assess how it could affect businesses large and small.
Key Discussion Points & Insights
1. Setting the Stage: Why Platform Liability Shields Matter
[03:32 - 06:56]
- Alan sets up the issue: The foundation of the internet has been a legal shield (akin to the US’s Section 230) that protects platforms from being liable for user-generated content—a necessity for ad-supported sites and social platforms.
- Daphne outlines the goals intermediary liability laws aim to balance:
- Protect people from online harms
- Protect online speech and access to information
- Promote innovation and competition
- She notes: “Data subjects sit on every side of this equation. Sometimes they want more enforcement by platforms and sometimes they want less.” [06:26, Professor Keller]
2. History of Intermediary Liability in the EU
[06:56 - 10:59]
- Traditional EU law required notice-and-takedown: once a platform knew about illegal content, it had to remove it, but it was not obliged to proactively monitor for such content.
- The Digital Services Act has built more user-protective process into this, but “general monitoring” has always been forbidden due to risks of surveillance and over-removal.
- However, the GDPR and right-to-be-forgotten cases have introduced another layer, requiring platforms (notably search engines) to remove certain personal information under privacy law, not just platform rules.
- The Russmedia case is where these two legal tracks collided.
3. What Happened in the Russmedia Case?
[11:46 - 14:46]
- Case facts: A classified ad on a Craigslist-like site contained false and harmful content about a woman, including her personal data, posted without her consent. The platform (Russmedia) removed the ad quickly, but copies had already spread to third-party sites.
- The Romanian court of first instance sided with the woman, awarding damages for a GDPR violation; the appeals court leaned on platform immunity; the highest Romanian court then referred questions to the CJEU, asking whether GDPR obligations override the liability shield.
4. The Advocate General’s Opinion vs. the Court’s Judgment
[15:04 - 17:42]
- AG Szpunar’s opinion (praised by Daphne Keller): Platforms should be considered “processors,” not “controllers,” meaning lighter GDPR obligations and room for notice-and-takedown to remain.
- Both the AG and the European Commission preferred processor status, to align with established approaches.
- The CJEU instead ruled: The platform is a “joint controller,” holding it fully responsible from the moment content is uploaded—raising the bar far above notice-and-takedown.
“The AG as supported by the Commission was sort of able to sidestep the iceberg that the Court of Justice went straight into by saying ... it's a processor type of a role. And where the court of justice said no, it's a joint controllership role. And that's really the crux of it, isn't it?”
— Alan Chapell [17:19]
5. Why Did the CJEU Reject the ‘Processor’ Argument? Confusions of Joint Controllership
[17:42 - 21:56]
- Unlike previous “right to be forgotten” cases treating search engines as controllers but still allowing notice-and-takedown, the CJEU now reclassifies hosts (like ad marketplaces) as joint controllers, holding them responsible at upload, not just after notice.
- This creates an apparent inconsistency: search engines remain less strictly liable than hosts.
“... The court is saying if you are a host ... and you're a controller, you don't get that sweet deal that Google got. You have to examine everything from the minute it's uploaded …”
— Professor Keller [18:22]
6. Practical Effects: Are Platforms Being Asked to Do the Impossible?
[21:56 - 24:32]
- As joint controllers, platforms must now proactively identify and block illegal/sensitive data—including “special categories”—at scale, likely before publication.
- Daphne’s warning: This is not technically or economically feasible for most platforms. Automated tools are limited; human review is costly and error-prone.
- Platforms risk either mass over-removal (stifling speech) or non-compliance and legal jeopardy.
7. General Monitoring Ban: Eroded?
[24:32 - 26:17]
- The CJEU asserts that “this is not a general monitoring requirement” but offers no clear reasoning for the distinction.
- Previous EU law forbade general monitoring because of privacy/speech risks. The new decision seems to sidestep or muddy this safeguard.
8. What Are Platforms Now Obliged to Do?
[26:17 - 29:01]
- It is unclear which platforms are covered (only ad marketplaces, or all commercial hosts of user-generated content?)
- Requirements might include:
- Verify advertiser identity (collecting IDs for ad posters)
- Vet ad content before publication for GDPR breaches
- Take steps to prevent third-party rehosting or scraping
- All are resource-intensive and may hit smaller companies hard.
“So to protect data protection rights, we're going to gather ID information from every single person who posts an ad on Craigslist. Good, that sounds very balanced.”
— Professor Keller [27:17]
9. Knock-On Effects and Unintended Consequences
[29:01 - 33:26]
- The lack of clarity on where obligations end, and how far platforms must go, could chill innovation and push small players out of the market.
- National courts may impose their own, inconsistent limits, creating a patchwork instead of an EU-wide standard.
- Litigation is likely to rise, especially against well-resourced platforms.
“I think there's going to be a lot of hunkering down and hoping the other guy gets sued.”
— Professor Keller [30:20]
“… you can't necessarily expect that the interests of big Tech and Little Tech will always be aligned. … [clarification] may inadvertently … create landmines for … smaller ad supported websites.”
— Alan Chapell [32:54]
10. How Long Until This Gets Sorted Out?
[33:48 - 36:43]
- It could take years for the CJEU to clarify or correct this; lower courts may avoid sending up test cases if they dislike the current direction.
- Litigants’ choices (e.g., Meta possibly settling critical cases) and national courts’ preference for applying their own law may slow progress.
“... right now I don't know that anybody has any confidence what answer … the CJEU is going to come back with.”
— Alan Chapell [36:25]
Notable Quotes
Professor Daphne Keller:
- “Intermediary liability laws are trying to balance three goals. One is to protect people from online harms … The second is to protect online speech … And the third is to promote innovation and competition.” [05:26]
- “I have no idea [how this is not a general monitoring requirement]. ... They have also said that it is an obligation to check things to make sure they're not illegal. So what is it?” [24:45]
- “There is no mention of or acknowledgement of all of these other nuances about even hosting, much less scraping and data collection.” [29:21]
Alan Chapell:
- “[The] underlying rule set that gets created here ... may inadvertently, whether it's intentional or not, create landmines for ... smaller ad supported websites.” [32:54]
- “Applying a one size fits all approach to joint controllership ignores how most of the digital media ecosystem functions in practice.” [36:54]
Important Timestamps
| Timestamp | Segment/Topic |
|-------------|------------------------------------------------------------------------------|
| 03:32–06:56 | Why intermediary liability shields matter: privacy, innovation, speech |
| 06:56–10:59 | Historical approach: notice-and-takedown, no “general monitoring” in the EU |
| 11:46–14:46 | Russmedia case facts and procedural background |
| 15:04–17:42 | Advocate General vs. CJEU on “processor/host” vs. “controller” |
| 17:42–21:56 | Consequences of being a “joint controller” for hosts |
| 21:56–24:32 | Feasibility and risks of proactive pre-publication checking |
| 24:32–26:17 | General monitoring ban questioned |
| 26:17–29:01 | What platforms must do post-decision: identity screens, ad vetting, etc. |
| 29:01–33:26 | Impacts on litigation, small vs. big tech, likely business responses |
| 33:48–36:43 | How long for clarity? Why lower courts might delay seeking answers |
| 36:54–END | Alan’s closing assessment and concerns |
Memorable Moments
- [11:33]: Daphne notes she predicted this collision in a 2018 law journal article: “I published an article … in 2018 saying this was coming and advising how I thought it should turn out, which is not the way it turned out.”
- [24:45]: An exasperated Keller: “I have no idea. ... I just, I don't even know what they mean … what is it?” referencing the court's ambiguous language around “general monitoring.”
- [30:20]: Keller jokes on platform strategy post-decision: “hunkering down and hoping the other guy gets sued.”
- [36:25]: Chapell: “You don't ask a question unless you know the answer. And right now I don't know that anybody has any confidence what answer that the Magic 8 ball that is the CJEU is going to ... come back with.”
Conclusion
This episode offers a deep, candid look at how a single CJEU decision disrupts the balance between privacy law and platform liability in the European digital ecosystem, exposing unpredictable risks for any site or service hosting user content or ads. Chapell and Keller emphasize the uncertainty now facing operators, regulators, and users; that uncertainty could persist for years, especially if national courts slow-walk clarification.
Recommended for: Lawyers, tech company executives, compliance officers, and anyone interested in the future of online speech, privacy, and platform regulation in Europe and beyond.
