Audiohook Announcer
This podcast is brought to you by Audiohook, the leading independent audio DSP. Audiohook has direct publisher integrations into all major podcast and streaming radio platforms, providing 40% more inventory than what could be accessed in omnichannel DSPs. What's more, Audiohook has full transcripts on more than 90% of all podcast inventory, enabling advanced contextual targeting and brand suitability. Audiohook is so confident that, in addition to CPM buys, they offer the industry's only pay-for-performance option, where brands can scale audio and podcasting with peace of mind, knowing they are only paying for outcomes. Visit audiohook.com to learn more. That's audiohook.com.
Announcer
You're about to make a trade. Who do you listen to? Is it "Get in on those options," or "Let's do a little research"? Learn more at finra.org/TradeSmart.
Alan Chapell
Welcome to the Monopoly Report. The Monopoly Report is dedicated to chronicling and analyzing the impact of antitrust and other regulations on the global advertising economy. I'm Alan Chapell. I'm a privacy and regulatory attorney and have worked with hundreds of digital media and ad tech companies over the years. I also publish a monthly regulatory outlook for digital media worldwide called the Chapell Report. You can find a link to a sample copy of the Chapell Report in the Show Notes. This week my guest is Professor Daphne Keller. Daphne is a prominent legal scholar at Stanford Law School, focusing on Internet law, platform regulation, and user rights, directing the Program on Platform Regulation at the Cyber Policy Center. She previously led intermediary liability research at the Stanford Center for Internet and Society and served at Google as Associate General Counsel handling search products. Keller teaches Internet law, testifies globally on platform content moderation, and writes on free expression, digital governance, and user protections. She holds degrees from Yale Law School and Brown University. I'll place Daphne's complete bio in the Show Notes. Here's one of the cool things about having a podcast: when I want to better understand how something works, or get a clearer sense of an area of law with which I'm unfamiliar, I can just invite someone on to help educate me and my audience. And that is particularly true here today. Let me set the stage a bit. In December of 2025, the EU Court of Justice rendered a decision in a case called Russmedia, and the case touched on both the GDPR, which I know something about, and the platform liability provisions of the EU's E-Commerce Directive, which is an area that I know less about. So Professor Daphne Keller was kind enough to come on and help me understand this decision and its implications for the digital media marketplace. So let's get to it. Hi Daphne, thanks for coming on the pod. How are you?
Professor Daphne Keller
Great. How are you doing?
Alan Chapell
I'm doing okay. Do you have any big plans for the holiday season?
Professor Daphne Keller
I have some travel with family, but first I have to get my syllabus written.
Alan Chapell
Ah, that's the important thing: try to get everything done before you head out of town. So we're heading down to Nicaragua, where my wife has family, and we're merging the two families together on the beach in Nicaragua.
Professor Daphne Keller
Amazing.
Alan Chapell
So it should be one part complete tranquility, and then, you know what happens when families get together, maybe some chaos, I'm sure. The cool thing about the chaos down there is it's usually of the fun kind. So today we are talking about a recent EU Court of Justice decision called Russmedia, where the Court of Justice has sort of pulled the plug on a foundational concept of the Internet, where the law provides a liability shield for Internet platforms such as online marketplaces, so they aren't responsible for the content that gets placed on their platforms. For example, you don't want an online newspaper responsible for what's said in the comments section. Some of my audience may be familiar with Section 230 of the Communications Decency Act, which functions similarly with respect to content and ads placed on Internet platforms. This EU Court of Justice ruling seems to shift the burden from a reactive notice-and-takedown model to a more proactive prevent-before-publication approach, especially for sensitive data. Daphne, have I framed this correctly?
Professor Daphne Keller
That all sounds about right. Maybe I would point out that the analogy to Section 230 is kind of loose, because 230 is a complete immunity for platforms for some kinds of claims, like defamation claims, and the EU system has always been a notice-and-takedown system, a system where once the platform gains knowledge of specific content, then they have to take it down to avoid liability.
Alan Chapell
Ah, right. That's a really helpful and important distinction, so thanks for clarifying. We probably need to take a step back and talk about how intermediary liability works in Europe. But before we do, I want to address the key question: why is having an intermediary liability shield important from the point of view of an ad-supported site, from the point of view of a social platform, and from the point of view of a data subject?
Professor Daphne Keller
So what I usually tell my students is that intermediary liability laws are trying to balance three goals. One is to protect people from online harms, from dangerous content or illegal content on the Internet. The second is to protect online speech and access to information. And the third is to promote innovation and competition. And those are in tension with each other. So the details of the laws usually are kind of like dials and knobs to calibrate how policymakers want to balance those things. A data subject, which is a person in their capacity as someone with privacy rights or data protection rights in their personal data, might find themselves on the side of this where they're being harmed by online speech about them and they want it taken down. But they might also find themselves on the side of this where they don't want platforms constantly surveilling their speech to figure out if they're saying something illegal. You know, the EU Court of Justice has recognized before that one reason not to want platforms out there monitoring constantly for illegality is that it impinges on the data protection rights of users. It creates more surveillance and more of, sort of, almost like a privatized police state, where there's a company checking to see if you're complying with the law all the time. So I think data subjects sit on every side of this equation. Sometimes they want more enforcement by platforms and sometimes they want less.
Alan Chapell
That makes sense. And can we take maybe another half step back? I would love it if you would share your view of how the concept of platform liability operated within the EU prior to this Russmedia case.
Professor Daphne Keller
Yeah. So the way I see it, there were kind of these two parallel tracks that weren't acknowledging each other. The main one, the famous one, is what is now the Digital Services Act in the EU. The Digital Services Act is this sort of once-in-a-generation overhaul of the laws for what platforms have to do with online speech, when they have to take down illegal stuff, how they moderate content, et cetera. And that builds on laws that the EU has had going back 25 years now, which, as I said, basically say that once the platform knows about something illegal, then they have to take it down. But the law has also spelled out that whole time that they can't be compelled to do so-called general monitoring, which is going out and proactively reviewing everybody's speech in pursuit of finding illegal stuff. What the DSA did there, well, not the only thing, but a big thing it did, was to add a lot of choreography around notice and takedown to make sure that users get notice when their content gets taken down, that they have an opportunity to appeal, that there's extra transparency. Like, a lot of process that's intended to protect users from having their speech silenced by false accusations. Because there's a stack of evidence that in notice-and-takedown systems like the DMCA for copyright in the US, you know, it brings on the trolls. All kinds of people come along with false accusations trying to silence speech they don't like online. And platforms often fall for it and protect themselves by taking down too much. So that's the main intermediary liability system that any EU legal expert would tell you about. And then this sort of stealth other thing running in parallel is the GDPR, and the Russmedia case that we're going to talk about is those two things colliding. But, you know, this goes back to a case I actually worked on when I was still at Google, the right to be forgotten case from the CJEU in 2014.
That was the CJEU saying data protection rights, under the GDPR's predecessor law, can also be a reason to take content off the Internet. And we all know the GDPR is a reason platforms have to honor erasure obligations for logs and profiles and the back-end data held in databases; that's pretty normal. But what this ruling made clear is: oh, and also online speech, like the content of a newspaper website indexed by Google, which is what that case was about, can be subject to GDPR removal obligations. And so there was a series of these cases about search engines and the right to be forgotten, and when and how they have to delist particular search results. And over the course of four cases, the CJEU actually kind of moved closer to what the DSA did, like having an idea that maybe the person demanding takedown should have the burden of proof. That was in a case called TU and RE a couple of years ago. And the court said that even for sensitive or special category data, that's health information, sexual orientation, religion, political affiliation, stuff where the GDPR has quite stringent rules about when and how it can be processed, even for that stuff, search engines just have these notice-and-delisting obligations. So, you know, that seemed not too far apart from the regular notice-and-takedown system. And then along comes Russmedia, the case we're talking about today, and it completely overhauls that and gives us some new rules based on the GDPR.
Alan Chapell
Yeah, and just one observation: I think you really described well that there are these two sort of related concepts that have been hiding under the bush for a while. As I've looked through a lot of the commentary on the Russmedia case, there was sort of one camp saying, oh my goodness, this is a huge surprise. And then there was another camp saying, well, no, actually, I pointed to this. I thought that this might happen. Maybe I didn't think it was going to blow up exactly like this, but I think there are people that saw this one coming, which is kind of interesting.
Professor Daphne Keller
Yeah, well, I mean, I'll claim that one because I published an article in the Berkeley Tech Law Journal in 2018 saying this was coming and advising how I thought it should turn out, which is not the way it turned out.
Alan Chapell
Wow. Very prescient of you, Daphne. Well done. Let's give some background on the case. A Romanian company called Russmedia Digital operated an online marketplace roughly similar to Craigslist, where advertisements may be published either free of charge or for a fee. Back in 2018, an unidentified user posted an ad on the marketplace. The ad falsely represented a woman as offering sexual services, and it included her photographs and her telephone number. Obviously, all of this was posted without the woman's consent. The woman informed Russmedia of the false ad, and to their credit, Russmedia removed the ad within an hour. Unfortunately, the ad had also been copied and/or shared so that it appeared on a number of third-party websites. So the woman sued. Now, what were her claims, and what happened as the case made its way through the Romanian court system?
Professor Daphne Keller
Yeah. So, I mean, I think you and I are old enough that we remember this kind of thing being written on bathroom walls, right? This kind of false statement about, you know, the sexual availability of women is something high schoolers have done forever. Yeah. And it comes up in platform liability quite a bit. There's more than one major Section 230 case in the US about exactly that. So it's one of those cases. And this poor woman, she went to court in Romania in the first instance, and her claims were data protection, which is the GDPR claim we're talking about, and then also some separate rights under Romanian law about personal portrayal and honor and the right to privacy, which kind of get eclipsed in the GDPR ruling. So the first court treated this through the GDPR track, and they upheld her claims and said, yeah, this marketplace or ad site should not have had this at all. Even though they took it down an hour after notice, they had already violated the plaintiff's rights, and they awarded her €7,000. And then an appeals court said, no, wait a minute, this is immunized activity under the hosting safe harbor that's in the DSA now and was already in the previous law, the E-Commerce Directive, at the time the case came up. So that appeals court took the opposite tack. And then, as far as I can tell, there's yet a third layer of courts, and that's who refers the question up to the CJEU. And the question it's asking the CJEU to address is about the intersection of the GDPR and the safe harbor provisions, the intermediary liability provisions. And then it gets into some detail. I don't know if we'll get into them.
If people want to dig into the case: how much does it matter what the Terms of Service said, whether the platform specifically needs to pre-check the identity of advertisers, whether they have to pre-check the content of the ad, and also whether they have to have some safeguards to prevent third parties from copying and distributing the ad further on.
Alan Chapell
Got it. Okay. And now the case made its way to the EU Court of Justice. One of the courts referred a couple of different questions to the CJEU. Now, prior to the CJEU rendering a decision, the Advocate General renders an opinion. What did the AG opinion say?
Professor Daphne Keller
So this is AG Szpunar, and he's a good one. Also, he cited me, so obviously it was a really well-considered opinion. And his bottom line was that there is a way to reconcile the GDPR rules and the intermediary liability rules and make them read in harmony, these two major European laws that, you know, potentially could be in conflict here. But the way he gets there is kind of a fork in the road the court doesn't take. He says that under data protection law, the platform should not be considered the controller of personal data, which is the more heavily regulated category. The controller is the one who decides the purpose of the processing and how the data is going to be processed, and therefore has a bunch of obligations under the GDPR. He says that's not what the platform is. The advertiser is the controller. The advertiser is the one who decided to talk about this woman and provide her photos and, I think, her phone number, her contact information. The platform is just a processor, just following the instructions of the advertiser. And that, under the GDPR, is the more lightly regulated category. As long as you're just doing what your controller told you to do, like you're a vendor or some kind of outsourcer for a company, you're likely to be a processor and have lighter obligations. So by saying that the platform is just a processor, the AG's opinion is able to go through some more steps and say they can have an obligation just to take down when notified, and to be immunized under the intermediary liability laws, and sort of exist in this world of notice and takedown that is pretty familiar. And I know that the European Commission weighed in, you know, showed up during the oral arguments and presented their position, and from what I can understand from the AG's opinion, the European Commission agreed with this.
They also said the platform should be treated as a processor and therefore have lighter obligations and be able to just run a notice-and-takedown system.
Alan Chapell
So if I'm understanding correctly, the AG, as supported by the Commission, was sort of able to sidestep the iceberg that the Court of Justice went straight into, by saying, no, no, no, it's a processor type of role. Whereas the Court of Justice said, no, it's a joint controllership role. And that's really the crux of it, isn't it?
Professor Daphne Keller
Yeah, exactly. The AG, and I think the Commission, were saying the platform's just a processor; that's a passive role, that's a neutral role, they're just following instructions. And therefore they can qualify both as an immunized host for intermediary liability purposes and as a processor for GDPR purposes. But what's a little weird about that is, if you go back to these right to be forgotten cases about Google, the court said Google is a controller, and at the end of the day they just have to do notice and delisting. And that involves a lot of hand-waving and maybe, like, fudging of the language of the GDPR. But they did get there for search engines. And so one thing that's weird about this case is that now the court is saying: if you are a host, like this Craigslist-like defendant in this case, and you're a controller, you don't get that sweet deal that Google got. You have to examine everything from the minute it's uploaded; you take responsibility for it from the minute it's uploaded. And meanwhile, Google and other web search engines get to just do notice and takedown. So that's pretty hard to reconcile, and my guess is the court just didn't think about it very much. But it does create this weird split where a social media company, or whatever category of hosts we think are affected by this decision, has far greater responsibilities than a search engine does.
Alan Chapell
And that, for me, raises a question. My understanding is that one can be a joint controller, and that does not mean that the multiple entities who are joint controllers all have identical processing responsibilities. But it seems like the Court of Justice said, oh, in fact, you are jointly and severally liable for everything going on under these processing decisions. And it feels like they could have sidestepped this entire thing by just saying, no, we understand joint controllership; you know, you might only be subject to 10% responsibility for these processing activities. And that's not a leap. I don't even think that requires jazz hands; it's written into the GDPR. And I'm just curious if you have any thoughts as to why the Court of Justice didn't go down that route.
Professor Daphne Keller
Well, I mean, there are these provisions in, I want to say, Article 82 of the GDPR about the responsibility of any of the controllers in the picture that I think make it hard to divvy up the damages that way. But I can't pretend to be an expert in that step of the GDPR analysis. So there are, you know, limits to how much any one of two joint controllers can avoid liability. But, kind of to your point, each of them is also supposed to do the appropriate thing in light of the kind of processing that they are doing. So the particular steps or measures that you would expect each of them to take are going to be different based on the kind of activity that they are engaging in.
Alan Chapell
I guess that's just one of those things that hopefully, at a later date, the Court of Justice clarifies their thinking on, because I'm starting to see that in an ads context, by the way, where certain ad tech companies, I think in the Netherlands, and even the CNIL in France, have held a website publisher and an ad tech company as joint controllers, you know, jointly and severally liable or responsible for the entirety of the processing activity, despite the fact that oftentimes the ads company has no ability to influence the consent obtained or the notice provided. And so to me, and again, I'm very much an ads guy here, that seems like maybe not the right or most productive road to be going down, but it does seem to be happening in multiple places. So I thought it worth noting.
Professor Daphne Keller
Yeah, that is interesting. And with a website operator and an ads company, the information they have is so different, and the role that they play is so different, that it's hard to really merge those two into one set of legal responsibilities.
Alan Chapell
Okay, so as a joint controller, Russmedia had an obligation to identify ads that contained special category data under GDPR Article 9(1). So I have two questions. How does this work in practice? And then, well, let's just start with that. Any guesses?
Professor Daphne Keller
How does this work in practice? I have no idea. I mean, part of what I think drives this decision is that the post at issue is so harmful and so offensive, and they think it should have been so easy to recognize that it was a violation of this woman's rights, that the court is kind of assuming, oh, you know, you could glance at it and know in a second that this violates the law; this is manifestly unlawful. But realistically, given the sheer scale of content that users are posting, even on a relatively small site like this one, reviewing it all is not very feasible. Using automated tools to review it all, which is quite standard now, is fine if you are doing something technically crude, like finding duplicates of an image that you already took down once; there are relatively straightforward tools for that. But what if you go down the road of using text detection to try to pick up which ads might have personal data violations? You're just going to have constant false positives and constant uncertainty about what might or might not violate the law. So I don't think there is a way to meaningfully review for and spot this kind of legal violation at scale in a way that a small company like this can afford to even try. Nor is there a way to do it that avoids, like I said, a lot of false positives, a lot of errors, and impact on the speech rights and the data protection rights of the platform's users, which is exactly the reason why, or some of the reasons why, the court has rejected obligations like this in the past. There's a whole string of cases saying that the statutory language from the intermediary liability laws, the E-Commerce Directive before and the DSA now, precludes proactive monitoring obligations. But it's also because of considerations about the rights of users that that kind of obligation is really fraught and often a very bad idea.
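Editor's note: the gap Keller describes, between crude duplicate matching and open-ended "find the personal data violations" screening, can be sketched in a few lines of Python. This is a hedged toy illustration, not anything from the case: the hash approach, the keyword list, and the sample ad texts are all hypothetical, and real systems use perceptual hashes (e.g. PDQ or PhotoDNA) rather than exact hashes.

```python
import hashlib

# Re-upload detection: exact hashes of images the platform already
# removed once. Perceptual hashes survive re-encoding and cropping;
# this exact-hash version is the crudest possible stand-in.
removed_hashes: set[str] = set()

def remove_image(data: bytes) -> None:
    """Record the hash of an image the platform has taken down."""
    removed_hashes.add(hashlib.sha256(data).hexdigest())

def is_known_removed(data: bytes) -> bool:
    """Catch byte-identical re-uploads of a removed image."""
    return hashlib.sha256(data).hexdigest() in removed_hashes

# Naive text screen for "ads that might contain personal data":
# a hypothetical keyword list, which is exactly where false
# positives explode at scale.
KEYWORDS = {"phone", "photo", "address"}

def flag_ad(text: str) -> bool:
    words = {w.strip(".,!?") for w in text.lower().split()}
    return bool(words & KEYWORDS)

remove_image(b"offending-image-bytes")
print(is_known_removed(b"offending-image-bytes"))        # True: exact re-upload caught
print(is_known_removed(b"same image, re-encoded"))       # False: exact hash misses it
print(flag_ad("call this phone number for an appointment"))  # True
print(flag_ad("selling a used phone, great condition"))      # True: a false positive
```

The last line is the point: an innocent classified ad trips the same filter as the abusive one, which is the false-positive problem Keller is describing.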
Alan Chapell
Okay, now I'm going to ask my hopefully less hard second question, which is: how is this not a general monitoring requirement? Because the AG touched on that too. It doesn't make sense to me.
Professor Daphne Keller
I have no idea. And there is a sentence where the court says this is not a general monitoring requirement. That may not be verbatim, but it's pretty close. And I just don't even know what they mean, because they have also said that it is an obligation to check things to make sure they're not illegal. So what is it? I mean, even just grammatically trying to parse what that sentence means, they could mean: we are redefining what counts as a general monitoring obligation. They've already done that once before. It used to be that the case law strongly implied or stated that it's general monitoring if the platform has to look at every post from every user. And then a few years ago, in a case called Glawischnig-Piesczek, the court implicitly backed off from that and seemed to be saying, oh no, it's general monitoring if they're having to look for any one of a number of potentially violating posts. But it's okay, they said in that case, if a court has ruled that a specific post violates the law, then having a platform go look for duplicate or equivalent posts, that's okay, that's not general monitoring. So are they now saying, oh, by the way, this new thing that Russmedia has to do also doesn't count as general monitoring, and we're redefining what those words mean? I have no idea.
Alan Chapell
Well, I'm going to ask another one then. What specific pre-publication checks are now required for platforms? Is identity verification required? Content screening? What do you need to be thinking about if you're operating a platform in the EU in the wake of this decision?
Professor Daphne Keller
So let me start by saying a lot of smart people disagree about quite what this ruling means, in kind of surprising ways. There are maybe three or four points where I'm seeing disagreement, but one of the important ones is: which platforms does this apply to? Some people are saying that because this is an advertising platform, this ruling only applies to ads. I don't think that's right, because there's a passage in there about any host that's operating for commercial purposes. Some people are saying it's only for marketplaces. I don't think there's necessarily something in the decision that lets you restrict it that way. But I do think national courts are going to be trying to walk this back and trying to bring this in line with the rest of EU law on intermediary liability. So it may be that lower courts will find ways to narrow which platforms it applies to. But if you are in the category it applies to, whatever that turns out to be, or certainly for advertising, they want the platform to verify the identity of the advertiser. So, to protect data protection rights, we're going to gather ID information from every single person who posts an ad on Craigslist. Good, that sounds very balanced. And then also they are supposed to vet the content of the individual ads. And there's a third question in there about whether it's Russmedia's fault that third-party websites might have come along and copied the ads and shown them. It's unclear if Russmedia licensed the ads out to the third-party sites or whether the third-party sites scraped them. But for that, they say Russmedia also has an obligation to try to prevent that from happening. Which, as I'm sure you know, is quite challenging, particularly since websites often want to be scraped for purposes such as appearing in web search results, or, if you're an advertiser, often you're quite happy to have your ad appear for free in some other places.
So, you know, there are lots of situations where the advertiser and the website, or the content creator and the website, both want the content to replicate out more widely onto the Internet. But the court is saying no, you also need to do policing of that.
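Editor's note: the tension Keller describes shows up concretely in robots.txt, the main tool a site has for signaling who may copy its pages. A site that wants to stay in search results must allow search crawlers, and compliance by anyone else is purely voluntary. A minimal sketch using Python's standard-library parser, with a hypothetical robots.txt and hypothetical crawler names:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for a classifieds site: welcome the search
# crawler, forbid everyone else from the ads section. Nothing here
# technically blocks a scraper that ignores the file, which is part
# of why "prevent copying" and "be indexed by search" are in tension.
ROBOTS_TXT = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /ads/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/ads/123"))      # search crawler: allowed
print(rp.can_fetch("RandomScraper", "https://example.com/ads/123"))  # everyone else: disallowed
```

Even a perfectly written policy like this only governs polite crawlers; enforcing it against an impolite one is a technical arms race, not a configuration setting.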
Alan Chapell
And is the court drawing distinctions between search indexing, AI indexing, you know, all those things? The legitimacy of all of these is, I guess, under some debate at this point. But there's a whole bunch of different reasons why content gets copied, as you just pointed out. Is the court distinguishing between reasons A, B, C, and D?
Professor Daphne Keller
Oh, no, no. There is no mention of, or acknowledgment of, all of these other nuances about even hosting, much less scraping and data collection. In fact, one thing that's funny about this ruling, at least in English, is that the court keeps referring to the website as a marketplace, which to me suggests Amazon or Etsy or a place where transactions are completed on the platform. And as far as I can tell, this really is more of a Craigslist situation: it is an advertisement listing for transactions that take place off platform. I don't know for sure, I don't speak Romanian, but there's even that level of ambiguity about what kind of website we're talking about.
Alan Chapell
What are the options for a company who's looking at this? Maybe you're an ad-supported website; maybe you operate a marketplace. What are your options right now in terms of trying to comply with what the EU Court of Justice has just sort of thrown down?
Professor Daphne Keller
I think there's going to be a lot of hunkering down and hoping the other guy gets sued. You know, eventually there will be Member State rulings clarifying some of this chaos, or at least clarifying it at a national level, and again creating a patchwork of rules, which is exactly what these EU-level laws are supposed to avoid. But in the meantime, it seems likely that the biggest platforms are the likeliest to get sued, because even in the EU there's money to be made in that. I was just talking to an Irish lawyer who said he thought that the damages in cases like this in Ireland would be considerably higher than the €7,000 that this plaintiff won in Romania. And so his prediction is: expect litigation in Ireland. And in a way, you know, for the public interest, maybe it's better if it's big, well-lawyered platforms who get sued, so that we hopefully have good information presented to courts to try to sort this out. I think some platforms might try to commit to one of the narrowing interpretations, like, as I mentioned, arguing this is only for ads or only for marketplaces. You could try to adopt a narrowing interpretation that said, well, this was personals ads, so it was foreseeable that there would be data protection issues, and so maybe the monitoring obligation is really just about aspects of your service where it's particularly foreseeable that this kind of abuse might happen. Either in terms of adopting a narrowing interpretation of the case, or in terms of channeling your resources toward the places where you can control further risk, I imagine there are platforms looking at what parts of their business they can reallocate resources to, to try to prioritize and catch things like this. But every time that happens, it's taking resources away from somewhere else, right? Every time there's more you have to do for copyright, platforms are literally doing less for child abuse.
And so, you know, hopefully in this case there isn't some really meaningful loss of other priorities like that. But the reality is, for a small company that suddenly finds itself on the hook for this, they can only do so much.
Alan Chapell
So there is definitely a silver lining for small tech, for little tech, in that it's usually big tech who ends up footing the bill to help sort this out. But here, Russmedia...
Professor Daphne Keller
The defendant here seems pretty small tech.
Alan Chapell
Yeah, fair enough, fair enough. But I think now that the Court of Justice has sort of thrown this thing into the marketplace, it's going to be up to big tech to solve it. The challenge with that is that you can't necessarily expect that the interests of big tech and little tech will always be aligned. And so the underlying rule set that gets created here, as they try to create clarity, may, whether intentionally or not, create landmines for the folks within little tech and for, you know, just smaller ad-supported websites.
Professor Daphne Keller
Absolutely. And one way that happens in litigation and in the way that legislation gets made is that courts and legislators look at what's possible for Meta to do or what's possible for YouTube to do, and they just project that on the entire ecosystem and design the obligations for everyone based on what these couple of giant companies are able to do.
Alan Chappelle
Well, this has been a fantastic discussion, Daphne. I really appreciate you coming on. I'm going to end with probably the hardest question I've thrown at you, and I've thrown a whole bunch of them at you. How long does this take to work its way through the EU regulatory and court environment?
Professor Daphne Keller
Oh my goodness. Well, I mean, my article predicting this came out in 2018. But in this case, you know, I was predicting a problem that was a little hard to spot. This case has made it extremely easy to spot. And one kind of smart-money way of looking at what the Court of Justice is going to do is that they are very invested in the European project of having a unified EU-wide set of laws that works. And so if they're getting blowback saying this is making the EU-wide unified set of laws not work, that might make them particularly interested in fixing it. They can only do that if a lower court decides to refer up a question, though. It's not like the US, where the parties can seek Supreme Court review. There's a case in Germany right now against Meta, brought by a politician who was finding false claims about her made in memes on Meta. That raises some very similar questions, and the court there stayed the case pending the outcome of RUS Media. And so, you know, conceivably something could get referred up to the CJEU really fast coming out of that, or some other case that was already ongoing. But again, it depends on what the parties do. I saw one person predicting Meta would just settle that case now. And it depends on whether the lower courts want to get an answer from the CJEU, which sometimes they don't really want.
Alan Chappelle
Is there a world in which the lower court wouldn't want some more clarity? I can see why Meta could do the math and say, well, you know what, we're better off just settling. But I would think that the courts would want to get some clarity, particularly given what's been thrown at them.
Professor Daphne Keller
I wish I believed that, and some of them do. But I definitely think sometimes either courts are just very focused on the person in front of them who's been harmed, and they're outcome-oriented about getting the resolution that they think is morally appropriate for the person who's been harmed. Or they like their national law and don't want some EU court telling them that their national law is wrong. And not only do they like it, it's easy for them and they already know it.
Alan Chappelle
I hadn't thought about that. There's some risk to getting the wrong answer. And I remember something from law school: you don't ask a question unless you know the answer. And right now I don't know that anybody has any confidence what answer the Magic 8 Ball that is the CJEU is going to come back with.
Professor Daphne Keller
Yeah, I really don't think anybody does.
Alan Chappelle
Professor Daphne Keller, thank you so much for coming on the podcast. This was an absolute pleasure.
Professor Daphne Keller
Thank you for having me.
Alan Chappelle
That was a great discussion. I feel like I now have a better understanding of the underlying tensions as between the GDPR and the safe harbor provisions of the EU's E-Commerce Directive. The big thing that jumps out at me in the RUS Media case is the concept of joint controllership under the GDPR. We first started running into trouble with that concept back in 2018, when the Court of Justice held that the administrator of a fan page on Facebook is jointly responsible with Facebook for the processing of data with respect to visitors to that page. I'm pretty sure that was the point where Daphne and others started to suspect that there would be an issue. And I can remember at the time thinking that that 2018 decision didn't really reflect the data flows. In my view, different entities that are each involved in a distinct aspect of a particular data flow should have distinct and different obligations under data protection law, and applying a one-size-fits-all approach to joint controllership ignores how most of the digital media ecosystem functions in practice. I suspect that the EU Court of Justice will eventually come back with a clarification of what they mean by general monitoring of websites, and perhaps in that clarification they'll clean up some of the requirements they've placed on websites that just don't make sense. But I'm still concerned that it will take a couple of years to get to that point, and in the interim, the marketplace is going to suffer. With all due respect, that's just not a great way for the EU institutions to operate. This probably won't be the last time we discuss these issues here. We've got a bunch of other fantastic guests coming up on the Monopoly Report podcast over the next few weeks. Please subscribe to the show at monopolyreportpod.com or on Spotify, Apple, YouTube, or wherever you listen to your podcasts. And thanks for listening.
Audiohook Announcer
Thank you for listening to the Marketecture podcast. New episodes come out every Friday, and an insightful vendor interview is published each Monday. You can subscribe to our library of hundreds of executive interviews at Marketecture.tv. You can also sign up for free for our weekly newsletter with my original strategic insights on the week's news at News Market tv. And if you're feeling social, we operate a vibrant Slack community that you can apply to join at adtechgod.com.
Title: Did the CJEU just break the Internet?
Host: Alan Chapell
Guest: Professor Daphne Keller (Stanford Law School, Platform Regulation)
Date: December 24, 2025
This episode dives into the December 2025 ruling by the European Court of Justice (CJEU) in the RUS Media case—a decision that could reshape how online platforms approach liability for user-posted content, particularly in the context of the GDPR and the longstanding EU safe harbor for intermediaries. Alan Chapell invites Professor Daphne Keller to explain what the decision means for digital media, ad tech, and user rights, dissecting how the court's move upends familiar structures of platform immunity, and how this could impact businesses both large and small.
[03:32 - 06:56]
[06:56 - 10:59]
[11:46 - 14:46]
[15:04 - 17:42]
“The AG as supported by the Commission was sort of able to sidestep the iceberg that the Court of Justice went straight into by saying ... it's a processor type of a role. And where the court of justice said no, it's a joint controllership role. And that's really the crux of it, isn't it?”
— Alan Chapell [17:19]
[17:42 - 21:56]
“... The court is saying if you are a host ... and you're a controller, you don't get that sweet deal that Google got. You have to examine everything from the minute it's uploaded …”
— Professor Keller [18:22]
[21:56 - 24:32]
[24:32 - 26:17]
[26:17 - 29:01]
“So to protect data protection rights, we're going to gather ID information from every single person who posts an ad on Craigslist. Good, that sounds very balanced.”
— Professor Keller [27:17]
[29:01 - 33:26]
“I think there's going to be a lot of hunkering down and hoping the other guy gets sued.”
— Professor Keller [30:20]
“… you can't necessarily expect that the interests of big Tech and Little Tech will always be aligned. … [clarification] may inadvertently … create landmines for … smaller ad supported websites.”
— Alan Chapell [32:54]
[33:48 - 36:43]
“... right now I don't know that anybody has any confidence what answer … the CJEU is going to come back with.”
— Alan Chapell [36:25]
| Timestamp | Segment/Topic |
|-------------|------------------------------------------------------------------------------|
| 03:32–06:56 | Why intermediary liability shields matter: privacy, innovation, speech |
| 06:56–10:59 | Historical approach: notice-and-takedown, no “general monitoring” in the EU |
| 11:46–14:46 | RUS Media case facts and procedural background |
| 15:04–17:42 | Advocate General vs. CJEU on “processor/host” vs. “controller” |
| 18:22–21:56 | Consequences of being a “joint controller” for hosts |
| 21:56–24:32 | Feasibility and risks of proactive pre-publication checking |
| 24:32–26:17 | General monitoring ban questioned |
| 26:34–29:01 | What platforms must do post-decision: identity screens, ad vetting, etc. |
| 29:01–33:26 | Impacts on litigation, small vs. big tech, likely business responses |
| 33:48–36:43 | How long for clarity? Why lower courts might delay seeking answers |
| 36:54–END | Alan’s closing assessment and concerns |
This episode offers a deep, candid look at how a single CJEU decision disrupts the balance between privacy law and platform liability in the European digital ecosystem, exposing unpredictable risks for any site or service hosting user content or ads. Host and guest emphasize the uncertainty now facing operators, regulators, and users—an uncertainty that could persist for years, especially if national courts slow-walk clarification.
Recommended for: Lawyers, tech company executives, compliance officers, and anyone interested in the future of online speech, privacy, and platform regulation in Europe and beyond.