A
One of the most significant conversations in the last decade about the creator economy just happened in a courtroom in California.
B
All right, we have breaking news for you. A jury in California has just found tech giants Meta and YouTube liable in a landmark social media addiction trial. The jury found both companies liable for
A
negligence and failure to warn, saying that
B
these companies did design their apps to be addictive for young kids.
A
This is a real bellwether case which was brought by a now 20-year-old woman. She's only identified as KGM, and she had accused the social media companies of creating products as addictive as cigarettes or digital casinos. What this says is that social media sites or apps can cause personal injuries. So look out for similar cases in the future. This is enormous news. So for all of us who have built a career in the creator economy or uploading to these platforms, this can have significant impacts on what happens next. So we're going to talk about what happened in this case. We're also going to talk about what this could mean for the future of the creator economy. Welcome to this episode of the Colin and Samir Show. All right, so let's just bring everyone up to speed. If you missed this in the news this week, let's get up to speed with exactly what happened and what we know. So there was a plaintiff who goes by the name KGM, we don't know her name, but she began using social media at age 6 and came forward with some alleged harms of anxiety, depression, body dysmorphia and self-harm ideation based on the platforms Instagram, Facebook, YouTube, TikTok and Snap, basically suggesting she had been caused harm by these platforms. In the trial, the platform features that were called out were infinite scroll, algorithmic recommendations, beauty filters and engagement loops. And the jury findings were that the platforms were negligent in their design and operation, they had a failure to warn users, and the platforms were a direct causation of harm. Importantly to note that TikTok and Snap actually settled before the trial. They settled this before it actually went to trial. So it was just YouTube and Meta
B
and I think avoided a lot of the press.
A
I think they did, yeah. We're talking about them, but not a lot of people are talking about them. The headlines are YouTube and Meta, and the settlement shows that they knew this would be a headline that they were a part of and they didn't want to be a part of it. So the damages that are paid are $3 million to the plaintiff. 70% of that is paid from Meta and 30% of it is paid through YouTube. What the plaintiff's lawyer said after the jury decided is that they failed to operate their platform in a reasonably prudent fashion, and that was a direct cause of harm. So this is a really big deal. This is a big deal because for the first time ever, the platforms are actually being held accountable for their product design, not just the content that's on the platform. And the suggestion here is that they
B
are inherently addictive and that they are responsible by way of their design for personal injury. That's pretty intense. That's, you know, akin to, I know we'll get into this, but, you know, cigarette companies being taken to trial and having it come out that, oh, they knew that they were harmful and addictive and that they were causing personal harm.
A
So, yeah, and to be clear, where we are right now is that this is what the jury found, but it still needs to be upheld. If this is upheld, that will come with a whole other series of consequences, meaning what the ripple effects will be. The platforms will likely appeal this, and there will be a lot of back and forth.
B
But this is very significant because, and I learned a new term here, this was a bellwether.
A
Yeah. What does that mean? Everyone's saying bellwether.
B
Everyone's saying bellwether. So this means that this is basically a test case. There are thousands of cases like this that are essentially pending, and this one now sets the stage potentially for how the rest of these cases will go and how they can potentially also win over the platforms.
A
Yeah, I mean, basically, in law, you're just looking for precedent, right? You just want to be able to refer to a case, and this is a pretty big one to refer to. And the $3 million, everyone listening to this is probably like, okay, Meta and YouTube have to split a $3 million payment. They're fine. But it sets a precedent that basically everyone can say, hey, look, your platform caused me harm. Right?
B
Yeah. There was a slightly similar case the day before in New Mexico against Meta, and the payout there was $375 million. And, you know, the reality is, too, that this could result in major product changes, major changes in the way that these tech companies have to regulate what's uploaded to their platforms, which could largely affect their bottom line. Like, sure, $3 million for them, whatever. But it could be the beginning of a major, major, major shift.
A
So for years, because this has been a conversation for a long time, if everybody's read any of author Jonathan Haidt's books or his most recent book, he talks about this. A lot of people have been anti social media platforms. What was that movie during the pandemic? The Social. Social.
B
Social Dilemma.
A
Social Dilemma. Right. So that was a movie that basically talked about this exact same thing. It's like, hey, look, all these platforms are built for addiction. Like, these are not good for kids. But the platforms have protected themselves from the liability of that whole conversation with something called Section 230. And this case shifts the whole legal dynamic away from being able to use Section 230 as a protection. So I think you should explain what Section 230 is so we can talk about that.
B
Okay. Section 230 of the Communications Decency Act of 1996 basically said that platforms are not responsible for what users post. So if the videos are harmful, it's not on the platform. Traditionally, publishers are responsible for everything on their site. The New York Times looks at every article before it is published, and so they are responsible for what they post. And tech platforms were sort of granted immunity through Section 230. There is a part of it called the Good Samaritan provision, which gives these platforms flexibility to remove content that they feel doesn't fit their guidelines. But also, in doing so, it doesn't mean that just because they are filtering some content, they are now responsible for whatever they don't catch. It says, like, you go do what you think is best, but we're not really going to monitor that. Now, if this case goes through and this becomes precedent, then all of these platforms are responsible for all of the UGC that's uploaded. And we've said it on some of our past episodes, the numbers of content, the numbers of video and the hours that are uploaded every day are truly absurd. If they are responsible for that, they could end up having to make some major changes: mainly heavily censor some of the UGC content, which would totally change what they do, or move to a place where they don't even allow UGC at all. Let's paint a picture where YouTube essentially says they're a streamer. That was part of their response here.
A
Yeah. YouTube said, we are just TV.
B
They said, yeah, we are not a social media site. They called it: we are a responsible streaming service. And we can get into whether we agree with that or not. But they could move to a place where they get really, really strict about the creators that are allowed to upload to their platform.
A
Yeah. So that's insanely significant because we are UGC. You and I are UGC. We're not like a legacy media publisher. We're just users like any other user on the app, uploading content. So when we talk about 20 million people uploading videos every day, the majority of those are just like us, just independent creators uploading videos to the platform. The minority is the Fox studios and Disneys and, you know, Warner Brothers and all those. Or the NBA. Like, the corporate media publishers uploading to the platform, that's the minority of it. So I would argue back here that YouTube is not. I mean, obviously they didn't win this argument. YouTube is not really TV, right? Like, television never had 20 million people from anywhere in the world uploading videos every single day. That's just not what it is. Netflix is more like TV.
B
It's a curated, it's editorial.
A
Curated, editorial. Netflix is what Netflix thinks should be on their platform. YouTube, we've always said this, has no idea what's on the platform tomorrow. So it's not TV.
B
It looks and feels like TV.
A
Yeah, but it's not TV.
B
You can watch some of what's on TV on YouTube, but it is, I believe for the better, not TV. It is a place that is democratic, where all of the independent creators like us and the people that listen to this show and watch this show have made careers for themselves.
A
I don't think YouTube is like the others. I don't think YouTube is like Instagram. I don't think YouTube is like TikTok. I don't think it's like Facebook. It doesn't feel the same. We were chatting about this with the Press Publish team in the Slack channel, and one of our writers said, when I open Instagram, it feels like I'm hitting a vape. And I was like, whoa, yeah, I know what you're talking about. Like, I know what the feeling is. And maybe TikTok is more like drinking a Four Loko or something. But there's a feeling to it, and we're older. Like, we existed without it, so we are aware of it. But if you only existed within this, it is pretty crazy.
B
Part of the reason why YouTube is not like the others is because it is a click and watch platform.
A
Yeah.
B
For the most part, sure, there is a Shorts feed, but it does not open directly to the Shorts feed. And historically, YouTube has been horizontal videos with thumbnails that you have to intentionally click on. Now, they do have Autoplay, which was a part of this case, but I do agree that it is not like the others.
A
Yeah. I think YouTube is something fundamentally different than the others, but I don't know how to explain to, like, a judge why it's so different. But could I argue that,
B
like, there's no way someone could result in personal harm or injury from watching it? No, I couldn't argue that, because I think
A
the issue is the conversation is about how the platforms engineer attention, not, like, just the content, but how good they are at going, I bet you you'll like this. I bet you you'll like this. And when that happens, you can just go down rabbit holes and then based on what the content is, you can end up in a bad place because YouTube is not, you know, they do a good job of, I think, curating, or not curating, but having guidelines of what you can and can't upload. But you're going to end up in some weird rabbit holes if that's the direction you're going.
B
And I think the problem is clearly the delivery device, because a TV in a public setting, where you have to click and watch something, is very different from how encapsulating a phone is, how it's always in our pocket, always at our disposal. We were also talking in Slack, and our writer Sid was saying that it's really hard for her, if she wakes up in the middle of the night, not to open up Instagram immediately.
A
Wow. So interesting that, like, Apple is not part of this.
B
Yeah. I mean, look what Tim Cook said. Did you see what he said on Good Morning America?
A
No, what did he say?
B
Very recently, I think it was last week, he was asked about, you know, the impact of the iPhone on society. And Tim Cook said, well, I don't want people using them too much. I don't want people looking at the smartphone more than they're looking in someone's eyes, as if they're just scrolling endlessly. This is not the way you want to spend your day. Go out and spend it in nature. And that's the guy who sells the device. Yeah.
A
I mean, it feels like the world of technology adopted a word for addiction or an addictive thing, which is sticky. Like, to make a sticky technology.
B
Right.
A
And it's a softer word for, like, it's really addictive. And it feels like the goal has been a relative addiction, right? That has been the marker of a good product in technology: do people come back every day? And addiction is used as a very negative word there. But meaning, like, the gym can be addictive. You go every day. It's habitual. Habitual, addictive. Like, there's a bit of, a bit of
B
The thing is the blurry line with these technologies. Of course the gym is good for you, but the gym only does so many things. The phone now does everything.
A
Yeah.
B
And is constantly available. Like I.
A
So here's the question. I think the biggest question is: should the platforms be responsible for how long you stay on the platforms and how long you're allowed to stay on the platforms? Because I know YouTube has this new, like, time-limit thing. Have you noticed that? Where it's like, hey, you've been on for a long time. Or maybe you haven't been on for as long as I have.
B
I haven't been on it for as
A
long as I have. It gets me to like, hey, dude, you've been on for too long.
B
I don't know. My phone just tells me when it's bedtime, and I listen to that. So here's what is happening, though. There are warning labels similar to cigarettes that have been instituted in four states, California being one of them, where if a minor spends over three hours on the app, a label comes on screen for 30 seconds that covers 75% of the screen and talks about the mental health effects.
A
Really?
B
Yeah.
A
Where is that?
B
Which I didn't realize.
A
I didn't know that. Where is that?
B
That's in California and New York, and I believe in two other states.
A
Wow. So there's an obvious parallel there to the tobacco industry, right? And if we look back at the 1998 tobacco settlement, there was a $206 billion payout from tobacco companies, and they then were hit with marketing restrictions, especially to minors, which is what you're talking about, and an industry-wide shift in how they operated. I remember this because we were just about 10 years old. Do you remember this whole thing around smoking in 1998? Like, I remember when this happened, when smoking had to get less cool. And all of a sudden we were hit with these ads, paid for through the tobacco industry, of people having holes in their throat and, yeah, the Truth campaign, and smoking out of their necks.
B
And it started disappearing from all of our TV shows.
A
It was terrifying. Those ads freaked me out. And a $206 billion payout from the tobacco industry. This was really, really significant. And basically the case in 1998 reframed smoking from a choice to an engineered addiction. And that's kind of what's happening here, suggesting that social media is not a choice. Yeah, it is engineered addiction.
B
Because also this was the first time in history that a jury heard testimony from executives, as well as saw internal documents about discussions around the engineering. So that is also very similar to what happened in tobacco, where it was, like, internal documents and hearings from executives that turned the jury.
A
Yeah. So I think realistically, we may be in really early innings of a long legal and regulatory cycle that's about to happen, where it may end at a point where, like, Instagram and all these platforms have to band together in a social media coalition and put money into something that runs ads about how it's dangerous to be on social. Yeah.
B
So let's talk about, like, where this could net out. And there are so many parallels to what happened in cigarettes. One, we've already talked about warning labels, whether that's for minors or just anyone who's opening the app or spending time on the app, which I actually think would be effective if every time you opened up one of your social media apps, it told you that this was harmful to your mental health. Age gating, which is something that's already taking off, where you have age verification and you have to be above a certain age, similar to cigarettes, to use the product. A polluter tax, which is kind of what you're talking about, where they have to put money into solving the problem. Right. And then another thing was no marketing to minors, which is not a big problem that social media faces. Minors are all pretty much bought in.
A
Yeah, they're all bought into it, for sure. Just from a morality perspective, there's probably something that does need to happen here. Even for me as, like, a new father, I think about this a lot. I watched the Inside the Manosphere documentary on Netflix, and I was like, oh, man, this is not. This is not good.
B
Like, what's not good?
A
What's happening on the Internet is not good. And we're talking about this from the consumer level right now, of, like, consuming this content. The incentive structure of the creator space has produced some really amazing stuff. It's also produced some pretty crazy and bad stuff, because the incentive structure is like, if I can capture attention and ride the wave of these platforms to capture as much attention as possible, I can then sell people a dream or sell people something. And if you haven't watched the Inside the Manosphere Netflix doc, Louis Theroux is fantastic, but it's eerie and it left
B
me feeling very conflicted. If you haven't seen it, go watch it. But it covers these different creators who speak predominantly to young men. They're very misogynistic. A lot of them are promoting gambling websites, scams. Some of them are running, like, OnlyFans rings, almost as if they're, like, modern day pimps. Like, they're just really bad influences on society. And one of the weird things, watching it: all of them had their YouTube Play Buttons up on their walls, and I felt so strange that they valued something in the same way that I do, yet we share none of the same values. Or maybe we do share more values than I thought: trying to get viewership, trying to get this Play Button, trying to build a business. Attention, validation.
A
Yeah, I mean, this continues to pull on the thread of YouTube as an editorial platform in the future. Right. Because the reality is, as we've been having these conversations around Netflix and where Netflix fits into the creator economy, Netflix is probably looking like a star right now in the eyes of anyone concerned about social media.
B
Right.
A
Like, I was thinking about the concept of, like, even the Dude Perfect app, great example, where you want a trusted space now for your kids, for your friends, for whatever. Like, a more trusted space on the Internet. And Instagram, TikTok, these are becoming harder to trust for letting someone you love into those environments. And so Netflix is actually a very trusted environment in this context. Right. It's very editorial. And I would argue YouTube is trusted. But also, my YouTube feed is really chill. Yeah, it's like some sports, some conversations around media, and then, like, Gawx is, like, cool creative art projects. Like, it's cool and I love it. I think it's a great feed. My Instagram is way crazier and my TikTok's way crazier.
B
But I feel like YouTube in the future will just be a slightly stricter place for ugc. Like it won't be fully editorial.
A
Yeah.
B
But we won't see a world where a creator like Sneako, who was highlighted in the Manosphere doc, is banned from YouTube and then unbanned from YouTube.
A
Right.
B
Like, I think we will see a world where people, and this gets into free speech, but where people are, from their first video, no longer allowed to continue uploading to the platform. Because YouTube has just said, yeah, this is our land. Right. We decide whatever
A
Yeah.
B
We want goes on it.
A
Right.
B
And so there will be certain communities potentially and topics that just don't exist on the platform. And people will be uncomfortable about that.
A
Yeah, totally. But then there will be other platforms that emerge that offer that. Right. Everything's going to have its own tone. But also the question is, what would force the platforms to change? I don't think it's lawsuits. I don't think it's even money. They can pay millions of dollars in legal fees. It'd be regulation. If there's actual regulation that takes place, where the government's like, okay, this is enough, you now have to actually do this to exist.
B
Well, money will make them change in terms of the bottom line of the business.
A
Yeah, but it would have to be so significant to impact that.
B
But I'm saying more so from product changes.
A
Yeah.
B
You know, if they truly had to reduce the amount of content and stop serving a lot of people and.
A
Yeah. But then all of a sudden you're putting into question, now that we're used to this, do people want a worse version of the feed? Right. Like, does somebody want the infinite scroll to stop on Instagram? That's probably the more responsible thing to do, that if you scroll for too long, it stops and you get to the end of a feed, and it's like, okay, there's no more content. Come back in an hour.
B
I'll tell you what, I want it, but I'm not going to be the one to turn the lights off.
A
Right. Yeah.
B
You know what I mean?
A
I don't know. Yeah, I don't know.
B
Like, I would prefer a version of the world with way less infinite scroll, but there's no world where, if I'm given the opportunity to make that change for myself, I actually would. Maybe that's because I'm addicted.
A
Yeah, of course we're all addicted.
B
I think it would be hard.
A
We're all addicted. Of course we are.
B
You know, it's funny, when someone in a social setting says, oh, I don't have Instagram, or I quit Instagram, the logical response is, good for you.
A
Yeah, but I think that's probably. I mean, I'm not a smoker, but among smokers, I bet you it's the same thing.
B
Yeah, but we're talking about, like, tobacco, which can give you, like, lung cancer.
A
Yeah.
B
And the platform that we all upload all of our videos to, all of these platforms, Instagram, it's very different.
A
Yeah.
B
And people say the same thing. Good for you.
A
Yeah. Yeah. It is crazy. I think the big shift now from platforms and creators is going to go from capture attention at all costs to capture attention, comma, responsibly. That's the new tagline of the creator economy. It's not as sexy. It might not be as big of an industry. But I guess that was also, relatively, what Hollywood was. Capture attention responsibly. Like, a movie has to go through so many filters to, like, finally make it to theaters and.
B
Yeah.
A
I don't know. I do think, and this is on the positive end of the spectrum, but in my upbringing, I was extremely influenced by media. I was aware of how influenced I was by media, which is why I wanted to get into it, because I was like, whoa, if you want to shape culture, if you want to have an impact in the world, it's through media. Definitively, media is how people make decisions in their life. It's how they're impacted. So what this case is about makes sense to me, that you can be influenced in a lot of different directions by media. But it is interesting that it is the technology in which the media is distributed that is under question here and under penalty. Compared to, like, there are also TV networks that put out crazy programming too, or movies that are kind of crazy, where it's like, whoa, that spreads a crazy ideology. But that never gets put into question.
B
Yeah. Because I.
A
Well, I guess maybe it does. Maybe studios gets.
B
I think it gets put into question, but potentially the engineering of it doesn't get put into question. You know, the impact isn't as vast as social media and the phone, the smartphone.
A
I'm curious to see what happens here. I'd be curious for you guys' thoughts if this does go any further. I mean, it's possible we could see changes to YouTube. We could have to submit a video a day early for, like, a quality check.
B
Yeah.
A
Before it gets published. Right. It could sit in a processing waiting room.
B
You could have to verify your age before you open up or upload to one of these platforms.
A
Yeah.
B
There could be limits to how much time you could spend. There could be warnings that tell you about the impacts.
A
Somebody pitched this on Twitter, and I think it's such a good idea: that there should be business hours for social media, that it should just shut down.
B
Oh, yeah, like it's nine to five.
A
Yeah, it's nine to five. Just shut down. And you can't use it when you go home. Yeah, that'd be so good.
B
It'd be great.
A
That'd be so amazing that you wake up, it's still not on until nine.
B
Yeah, but sometimes I like social media at home. Like, I'll ask my wife, like, you got any good TikTok, you know?
A
See, that's, that's the thing. It is fun.
B
It is fun. It's a good time.
A
It's a good time.
B
Yeah. Sometimes I want to go to the bar and get drunk, you know.
A
Sure. But let us know what you guys think. Put it in the comments. If you're listening on Spotify, if you're listening on Apple, you can DM us or tweet at us, Colin and Samir. And last thing, if you're still here, tail end of the pod: first of all, thank you for being here. Second of all, we'd love to see you on May 28th at Press Publish LA. We have applications open right now. We have a lot of applicants, and about 30% of our seats are approved. So people are going to be in those seats on May 28th. If you want to be one of those people, go to Press Publish LA. We're going to be announcing the venue in about 10 days' time. Super excited about that. So we'd love to see you there, the Hollywood Creator Summit. If you want to get off your phone, come hang out with us. Apply. All right, we'll see you next week.
Episode: YouTube, Meta and the Case of the Infinite Scroll
Date: March 26, 2026
Hosts: Colin and Samir
Colin and Samir break down the landmark California court verdict that found YouTube and Meta liable for negligence and failure to warn users—specifically minors—about the addictive nature of their platforms. The episode unpacks what this means for the creator economy, the potential future ripple effects for social media platforms, and broader cultural implications. The hosts draw parallels to the tobacco industry, discuss Section 230's shifting role, and consider if product design is about to fundamentally change for creators and audiences alike.
On the Bellwether Impact:
“This is a real bellwether case… So for all of us who have built a career in the creator economy… this can have significant impacts on what happens next.” (A, [00:26])
On Tech Accountability vs. Liability:
"For the first time ever, the platforms are actually being held accountable for their product design, not just the content that's on the platform." (A, [03:05])
On Section 230:
“Section 230… basically said that platforms are not responsible for what users post. …If this case goes through and this becomes precedent…they could end up having to make some major changes.” (B, [06:24–07:15])
On YouTube’s Uniqueness:
“YouTube is not really TV, right? …Television never had 20 million people from anywhere in the world uploading videos every single day.” (A, [09:11])
On The Issue of Addictive Design:
“The issue is the conversation is about how the platforms engineer attention, not like just the content, but how good they are at going, I bet you you'll like this… you go down rabbit holes…” (A, [11:42])
On Apple’s Stance:
“Tim Cook said… I don’t want people looking at the smartphone more than they’re looking in someone’s eyes, as if they’re just scrolling endlessly. This is not the way you want to spend your day.” (B, [12:46])
On Creator Responsibility:
“I think the big shift now from platforms and creators is going to go from ‘capture attention at all costs’ to ‘capture attention, responsibly.’ That's the new tagline of the creator economy.” (A, [24:04])
On Addiction & Social Attitudes:
“We’re all addicted. Of course we are.” (A, [23:26])
“When someone in a social setting says, ‘Oh, I don’t have Instagram, or I quit Instagram,’ the logical response is, ‘Good for you.’” (B, [23:29])
This episode captures a pivotal moment for the creator economy, as liability for addictive product design enters the legal mainstream. Colin and Samir’s discussion is both personal and analytical, blending industry expertise with candid admissions about the universal challenge of digital addiction. The stakes for creators, platforms, and audiences are enormous, promising a future of stricter regulation, evolving norms, and likely, less “infinite” scrolling.
For more nuanced takes and continuing coverage, tune into future episodes or join their ongoing discussions on social media and at live events like Press Publish LA.