
Loading summary
A
Spring starts at The Home Depot, and we are bringing the heat to your backyard this season. Fire up the flavor with our wide variety of grills for under $300, like the Nexgrill 4-burner gas grill that's perfect for hosting your spring cookout. Then set the scene and turn your outdoor space into the go-to spot with patio sets for every budget. Bring it this season with grills that deliver flavor and patios that set the vibe, from The Home Depot. Start your spring with low prices guaranteed at The Home Depot. Exclusions apply. See homedepot.com/pricematch for details. Hello everybody. Welcome to The Weekly Show podcast. My name is Jon Stewart and I will be your audio guide to this week's podcast. It is Tuesday. It is March 17. It is St. Patrick's Day, and I am... oh, that's... it's all blue, blue and gray. I'm in blue and gray. Does that signify anything? Honestly, today it probably doesn't. We are still, I guess, entering week three of our war with Iran. But I want to talk about a different threat to the country. The Senate this week, I believe, is going to be trying to figure out bureaucratic loopholes to try and get their Save America Act through, even though they don't really have the votes for it, because they could never pass the thresholds of filibuster and certainly not passage. But they want to get it done because they want to introduce the safeguards to the American electorate, because so many undocumented people, so many non-Americans, skew our elections so brutally, even though they don't, even though there's no evidence of it, other than that this is an incredibly crucial piece of legislation that must be passed. So today, what I would like this episode to be focused on is the real threat, ironically, to American democracy and our elections. And that is the algorithms and social media platforms that push this bullshit and get it out into the electorate so that it becomes canon, even though it lacks the evidence for that. 
And so to get into that topic, we bring in experts in this field. So let's get to them now. Very excited, as we talk about the real threats to the American electoral system, to the American democratic system. We're delighted with our two guests today. We've got Renee DiResta. She's an associate research professor at the Georgetown McCourt School of Public Policy. You always know how good something is by how long its name is, and that's the longest name. She is also the author of Invisible Rulers: The People Who Turn Lies into Reality. And Casey Newton, who's the editor of Platformer, and of course you all know him as co-host of Hard Fork, which is the New York Times podcast about technology and the future. Hello, hello.
B
Hey, Jon.
A
What is happening with the both of you? Listen, the reason why we're doing this now: so I don't know if you're familiar with the SAVE Act. It's the Save America Act. And as you know, we all... that's where we live, America. And they're going to save it by, by voting on it. And what it's going to do is, it's going to protect our electoral system and our democracy from the scourge. Scourge. Scourge, yeah. Of non-citizen voting, which, as you know, is in the... I think billions. Are either of you... And, and, and let's just start by sort of defining what this SAVE Act is seeking to accomplish. Basically the idea is, it's an idea that most people can, can get behind: voting is for American citizens.
C
Yes.
A
Let me ask you both now generally, are our elections decided by American citizens?
C
Absolutely.
A
Yeah. So we have an agreement. We have, you and I, we are all in agreement. So why is it... and this gets maybe to the heart of the issue. 70% of Americans support the SAVE Act because it's going to make sure that citizens vote. But generally, citizens vote. But 50%, I think somewhere along that line, believe that undocumented voting is an enormous problem and is done by the thousands and millions. Why is that? And I'll start with Renee.
C
Yeah. So there have been rumors about that for ages. And you can look back to, I mean, we can, we can go back to the 2016 election, you can go prior to that. You basically see these stories; we call them tropes in election rumor research. And I use the word rumor on purpose, right? A rumor is something where it sounds like it could be true. It resonates with people. They think, oh, this might be something that is happening. They heard it from a friend who heard it from a friend who saw it on the Internet, right? There's a sort of trace back to, to a claim that some guy said somewhere. And with this rumor of non-citizens voting, what you hear is this theory that your vote is going to be stolen from you. Your candidate might lose because somebody who is not supposed to be doing a thing is doing that thing. So there's a sense that you might be wronged, and that's why it lands so hard emotionally. And if you look back, you see the same types of stories, the same types of tropes, landing over and over and over again. You have the bused-in voter, you have the person voting with her maiden name, you have the person who is here illegally voting. These are stories that recur over and over and over again. And the reason they recur is because they seem plausible, they seem believable. And most people are not persuaded by statistics. They don't go and say, oh, you know, Cato says that this is a small problem. Heritage says this is a small problem. The Brennan Center...
A
Cato's libertarian, Heritage is conservative.
C
I mentioned those on purpose. Exactly. Exactly, yes. So when you go and you look at even the libertarian and the right wing studies of voter fraud, you find repeatedly that when, when you are looking at honest statistics, when you're looking at the actual studies of the problem, it is infinitesimally small. But when you go and you look at social media and you hear people who are sharing these stories that they relate to, that they, that feel true, that's why these, that's why these rumors continue to propagate.
A
When we're talking about social media and those kinds of things, and they, and they propagate along that way, is that happenstance? Is it because of the rumor mill, or how does it propagate? Why does it propagate? How does a video of an election worker in Georgia reaching under the table to pull out a bucket of votes, that is not in any way nefarious, become the centerpiece of these larger conspiracies, et cetera?
C
Sure.
B
Well, you know, as Renee just pointed out, there's something really emotional about seeing something like that on social media, right? Somebody's just pulled out a bucket of votes, and it seems like something nefarious is happening here. And depending on, you know, what caption the sort of, you know, aggrieved user might put underneath it, all of a sudden it's going to start getting that engagement, right? The algorithm is going to say, hey, this seems like it's pretty interesting; we're going to show this to a lot more people. And over time, the elites of, sort of, the Republican Party, whoever can sort of use this to their advantage, are going to say, aha, this is something that I can use to sort of make my case. And so that's, that kind of, you know, the algorithms and the elites are kind of working hand in hand to spread whatever kind of emotional rumor might serve their cause.
A
And these causes... I mean, ultimately the aim, genuine or disingenuous, of protecting the electoral system, you would consider to be, you know, an honorable one. You don't want... and there are times that it does. I think there was a study done: since 1982, there were almost 1,500 people, right? Total. Now, you could say, like, well, in small elections even one vote, two votes, three votes can make a difference, and, and, which is true. But it's very clear that the irony of this is that the larger threat to our electoral system and our American democracy is the manner in which social media can spread these tropes and these inaccuracies to a really much wider group of people and light these fires. And is that the type of thing... are we utterly ignoring the actual threat to our democracy? Would that be accurate, Renee?
C
Well, yes. The challenge is that there's not a lot you can do about that, because of the way that, as Casey... this is the problem, right?
A
I thought we were fixing this today. Son of a.
C
So the, you know, I worked on a project called the Election Integrity Partnership, where we just traced rumor after rumor after rumor, and we wrote in real time what was happening, how it started, where it started, what, to the best of our knowledge, what the truth was.
A
And you're doing this. This was at Stanford University.
C
This was when I was at Stanford. This was in 2020. We did this in 2020, which was, of course, the year of Stop the Steal, right? Remember? And so as we would, we would trace these stories. And what Casey's describing is true: you have the influencers and the algorithms. But the third piece of that is the crowd, right? The online community that surrounds the influencer, that believes it, that amplifies it, and that moves it from platform to platform, right? People are the glue between the online platforms. Just because one platform maybe has a policy that says, we're going to moderate this content, doesn't mean that all platforms have that, first of all. And second of all, again, like I said, you can't fact-check your way out of this stuff. When you try to do that, people just feel that their voices are being suppressed. If you try to silence the rumor, right, if you, if you kind of nuke it and stop it from trending, as happened occasionally in 2020, then they believe that, you know, they are trying to prevent you from knowing the truth. And then that becomes kind of a second-order... you know, we call it the Streisand effect, right? This idea that you are actually amplifying the theory by trying to suppress it. So one of the things that you have to try to do then is counter-speak against it. But the problem is, oftentimes election officials, they are not influencers, right? They are... they do not have very... I mean, let's be honest, they are
A
sometimes just mail people and nurses and
B
things and shame on them.
A
They have other jobs.
C
They have, they have jobs. They have, they have elections to run.
A
They're, you know, they haven't been weaponized by dark money that goes into the,
C
the system, but they've got small followings. In all seriousness, they have small followings, right? They're out there, again, they're, they're trying to put out facts. Facts do not land against an emotional story. And the way that that rumor mill works: one influencer says it, another one boosts it. "Big if true." "Have you heard?" You know, it's viral. By the time the guy with 200 followers is like, actually, let me tell you about how those ballots actually work, let me tell you why this rumor isn't true, that guy's not going to get amplification. So unless the platform is actively trying to uprank and surface good information, which is something they were trying to do in 2020 and no longer are, and we can talk about why that is... unless they are trying to actively uprank it, the good information is not making it out there. And then the other piece of that is that the deeply distrustful crowds that have been taught that the election is going to be stolen, right, they have heard this over and over and over again, are not inclined to believe the fact-check or the information that the election worker is going to put out.
A
Right? Well, we see that, Casey. And, you know, look, 2020 was, was Stop the Steal, and there was all this fraud and the Democrats had rigged the election, and we're going to have, we're going to have to get the Cyber Ninjas in there to figure out exactly what went on. And it routed through Venezuela, and China sent in elections... and then suddenly in 2024, nope, that one, that one actually was pretty good.
C
That, that one, everybody forgot to do that.
A
That one worked pretty well. And so it really does seem to be an argument of convenience.
B
Yes, absolutely. It's an argument of convenience. I also think that another thing that's important to highlight here is just the demand for these narratives, right? In a moment in 2020, when Donald Trump lost, there was a huge demand on the right for that not to be true. Once he won in 2024, that demand sort of went away, right? And so that energy was able to go elsewhere. But I think it's really important to talk about the demand side, because the algorithms, the elites, they're super important. But as Renee said, the people are what's gluing that together. And they just sort of want certain things to be true. And in the media environment that we have now, they can just kind of go out and pick their own reality based on what they want.
A
Talk about that for a moment. So, yeah, if we want to think about this in sort of economic theory, you're saying that there is supply-side, I'll say, misinformation. People get it wrong; that just happens sometimes out of good faith. But the more nefarious one that's been weaponized is disinformation. So what are the elements of supply-side disinformation? And then we'll talk about demand side. But supply-side disinformation, what would be considered the elements of that?
C
That's where you start to see people who, again, are incentivized to seed content, to try to put out plausible theories to keep hope alive, but really, more importantly, to cast doubt on the integrity of the election. And that's what you saw a lot of in 2020: the idea that you could just offer yet one more justification, one more reason, one more variant on a theory, one more... The reason we use rumor, actually, is because you don't even have to know what the intent is. It's just a story that is passed from person to person and resonates emotionally. With disinformation, we talk a lot about foreign actors who are in the mix too, right? You have these agitators, these people who are in there because they see an opportunity to advance their own cause. Donald Trump is talking about Iran. That blew me away. Because, you know, of course, disinformation was a word that we couldn't say for a period of about, you know, three, four years there. But now, now we're talking about it again, because it was a thing that did in fact happen. It was not particularly major or significant in 2020 or in 2024, far less than what we saw in 2016. But, you know, you do have foreign actors in the mix.
A
And these are bots. And so they're, they're seeding the narrative with paid bots. Or is this the kind of thing where, they talked about, you know, there's a 16-year-old in Uzbekistan and he's being given money to invent stories that sound plausible and then seeding the turf. You know, Maria Ressa talks about this often, the idea that a lie spreads seven times faster than the truth. So these are people, absolutely, with intention and purpose, sowing the seeds of confusion and misinformation. That would be the supply-side actor, in league with the paid influencers whose profiles are boosted by algorithms. Would that be accurate?
C
That's pretty accurate. So again, you have, you have a difference: the accounts that are content creators and the accounts that are amplifiers are not the same. The amplifiers are... this is where you see bots, usually, right? An amplifier would be the sorts of accounts that just click the like button, or click the retweet button on Twitter, click the share button. And the reason is, you need to have engagement in order to trigger the algorithm to share it out to more real people. So the reason that you have automated accounts, the reason that you can use fakes or rent networks of accounts that are used in commercial spam, the reason you see those accounts come in on the supply side, is that you need to have engagement. Something has to get those like counts up. And that's where you see those automated accounts. And then, as you were just describing, the accounts that are actually writing the content or saying the thing, you do want to have some sort of legitimacy or trust there. And that's where, again, you'll see, sometimes they'll be paid and sometimes there'll be an account...
A
That's where the, the Catturds of the world will step in and help.
C
I mean, I, I think that people sometimes underestimate with some of those folks just how ideologically motivated they actually are.
A
Oh, I don't know. I don't underestimate that at all. I, I think, I think they are absolutely purely ideological warriors, but are sometimes shaped by the financial incentives that go in there. They've become that. It becomes their identity. They start to earn money on it. Which, which brings us to the point. And Casey, you know, this will, I think, be kind of in your wheelhouse. Let's talk about the artist formerly known as Twitter. And this gets us to the crux of the irony. Elon Musk, for a long time, and really incredibly consistently and vehemently, has pushed this idea that undocumented non-citizen voting is rampant, it is sowing the seeds of our destruction, and we cannot have it. He's, he's tweeted about it, I think, 1,300 times, or interacted with stories about it. The irony of it all is that this guy's platform, this guy's algorithm, which he is in charge of... I see his posts on my feed all the time. I don't ever interact with it. There's no reason for it to be there. He is a far more relevant actor in the warping of our democracy, through his money and his algorithm, than any measure of undocumented non-citizen voting will ever be.
B
Absolutely. I mean, there's a, there was a paper published in Nature in February, and they did a study where they had two groups, and they showed one the sort of ranked algorithmic feed, and then they showed one group just sort of a chronological feed. And they found that people that saw the algorithmic feed on X moved further to the right than the control group, by, like, a very significant measure, right? So if you're actively using X, you are probably subconsciously moving a little bit to the right over time. And as you point out, Jon, that is just a far greater effect than what are essentially these mythical cases of an undocumented immigrant voting in one election somewhere.
A
Let's break that down, because it's very easy to cast aspersions. What his argument, and I think his people's argument, would be is: well, now that we're getting uncensored material, now that the First Amendment has primacy, people move to the right because they learn the truth. But the truth is that the algorithm incentivizes the misinformation from the right. And he designs it.
B
Absolutely. And, like, at Platformer, my newsletter, we broke the story a couple years ago that after one of his tweets did not get as much engagement as Joe Biden's during the Super Bowl, he went back to his engineers and he said, you need to re-engineer this so that my tweets are getting more prominence. And so they did. And so that is the reason why, even if you don't follow the guy and you're using X all the time, you're going to see his views, which contain, you know, so many, just various, like, right-wing ideas and conspiracies. Like, he built the algorithm this way. And, like, if you're like me and Renee and you've been covering this stuff for a decade: in 2020, conservatives were holding hearings saying platforms must be ideologically neutral, you must never suppress any sort of speech. Why, you know, the right and the left should be equal on these platforms. Fast-forward to today: X is a right-wing political project, full stop.
A
Look, the algorithm is killing us. The algorithm, the, the, the way that it incentivizes the hostility and, and weaponizes ideology and all the... it just, it's, it's not right. But the, the antidote. The antidote is information. And that's where Ground News comes in. Ground News, it's this website and app. It's designed to give readers a better way, an easier way, to navigate the news. It pulls together every article about the same news story from all outlets all over the world and puts them in one place, and not, not incentivized for, like, the worst, most hostile, most partisan take. It tells you where it's coming from. You, you can see starkly, in black and white, how these different organizations and algorithms are manipulating the information that we get. They show you how reliable the source is and who's funding it. Who's funding it! Follow the money. Know who's behind the headline. Oh, who is this Rupert Murdoch fella? He seems delightful. He seems to have a somewhat pointed view of the world. I'm telling you, man, the Nobel Peace Center's even mentioned that Ground News is an excellent way to stay informed. Nobel Peace Center, that's, I think, the one that Trump started. I think it 3D prints Nobel Peace Prizes. It just hands them out. The platform's independently operated, supported by its subscribers, so they stay independent and they stay mission-driven. They don't get sucked into this slop. If you want to see the full picture, go to Ground News. They can help you cut through the noise and get to the heart of the news. Go to groundnews.com/stewart. Subscribe for 40% off the unlimited access Vantage subscription, discount available only for a limited time. And this brings the price down to, like, $5 a month. That's groundnews.com/stewart, or scan the QR code on the screen now. Renee, they actually came after your group pretty hard. What? Yeah. Tell the story of that. So your group is studying how these things work. 
And by the way, later on, we'll get into the balance, because I do think there are First Amendment concerns with a lot of these different things, and that's maybe why it makes it more difficult to do that. But, Renee, what happened with the Stanford research group that, that you were a part of?
C
Yeah, so we ran that project in 2020, and, and what we did in 2020 was, we were tracking these election rumors and we worked... we had a... we set up a tip line, right? And we sent an email to the RNC, we sent an email to the DNC, we sent it to the NAACP, AARP, a bunch of these civil society groups, saying, hey, we want to help. Because one of the things we can do is we can trace where rumors come from, where they're going, and we can try to get fact-checks out. And per the point about the election officials, they have an election to run. Their job is not to be sitting on social media trying to triage and figure out if rumors are disinformation. You know, this was the first major election since Russia in 2016. We thought we were going to see a lot of state actor stuff. That was one of the reasons why we did the project. It turned out that most of the rumors about election theft, most of the rumors about delegitimization, most of the stuff trying to suppress the vote, came from the sitting President of the United States, which we wrote about. That's reality. I'm not going to sugarcoat it, right? So we write about that. And as we go through this project, there's four different research centers that are participating in this, 120 undergraduate and graduate students that are the main analysts on this project. And we have a JIRA ticketing system. If you've ever called in to a, you know, customer service hotline, somebody, like, makes a ticket for you. And that ticket kind of goes around the, you know, the organization, the building, and, like, you know, different people work on your ticket. That's how we traced these things.
A
So what, what is the attempt that you're trying to map? What, what, what are you mapping?
C
Yeah, so we're tracking rumors as they go viral, and then we're trying to get them to people who can respond to them. So that might be the platform. Sometimes we would tag a platform in and say, hey, Facebook, hey, Twitter, you have this thing that's going viral on your platform. It violates your policy. Go have a look at it. Right, right. And then the platform, you'd see this in the little ticketing tags, they would say, thank you very much, we're looking. And about 60% of the time, actually, they would do nothing. 30% of the time, they would slap a label on it saying, this content is disputed. You know, Donald Trump would say something about mail-in ballots being fraudulent. They would say, this content is disputed, get the facts about mail-in ballots, and they would link you out to an information site. About 10% of the time, they would take something down; they would decide that it rose to the threshold of actually actioning it with a takedown. So in the course of the full period of the election, we sent about 3,000 URLs in total. Right. So 3,000. That's actually very important, that number. We also communicated with state and local election officials. They had access to our tip line. So a local election official in Kentucky, for example, sent in a tip saying, there is an account pretending to be an election worker. I don't know who this person is. They're claiming that they're destroying ballots. That was the kind of thing that we could then go and look at, see, hey, does this look like it's foreign? Does this look domestic? Is this something that a platform should be tagging? Just like a triage center.
A
You're like an election observer, but rather than existing kind of in the practical world, you're doing this virtually, in the online world, and trying to point out inconsistencies and things that may be troublesome, to be investigated.
C
Correct.
A
Seems above board.
C
And so fast-forward to... so we did this project. By the way, there was a 200-page report that sat on the Internet after it was all done. We wrote about it, and we made a table where, after the election was over, we did a data pull on Twitter and we pulled in the total number of tweets of the most viral election rumors. Things that everybody has heard of. Dominion, right? That Dominion machines were flipping votes. That there were these Italian space laser theories, right. That the Sharpie markers in Arizona had changed ballots. So for the top 10 most viral rumors that everybody saw, we added up the number of tweets. It was 22 million tweets. Jim Jordan...
A
22 million.
C
22 million viral tweets. And this was the number that we put out there just showing the scope and the scale of how much stuff had been making the rounds on these very, very viral stories.
A
Right.
C
Jim Jordan.
A
Jim Jordan is a congressman from Ohio.
C
Very respected, extremely honest man.
A
Extremely honest. Legendary. There's not a piece of legislation that has passed in America in the last 20 years that does not bear that man's name as a co-sponsor. Couldn't have a grander reputation.
C
Sports coach. Sports coach. Really, really cares. Cares about the youth.
A
Some issues about his time as a wrestling coach that may be slightly under... but, but he has completely turned it around and is now a paragon of American sensibility and legislation. Continue.
C
Also happened to be an election denier.
A
Wait, what? Son of a.
C
So, all right, so then he goes... so the election deniers, you know, the House flips. Jim Jordan gets his gavel and starts this committee called the... I forget the proper name; the shorthand is the Weaponization Committee, but it's the subcommittee of the House Judiciary Committee to investigate the weaponization of the federal government. He decides there has been a Biden censorship regime, even though the agencies that we engaged with during the 2020 election were run by Trump appointees. Again: run by Trump appointees. Despite the fact that we were talking to state and local election officials, and occasionally, when we did speak to federal government agencies, like when Iran ran an influence operation pretending to be the Proud Boys, we did talk to the FBI about that, because our team saw that early on. We did speak to the FBI. Trump appointees, you know. But these are real. These are real. These are real things that are happening.
A
The Iranians literally tried to pretend that they were.
C
They actually did that. Yes. And, and, yes. And Trump was very upset about this, like, a week ago. That was one of the justifications, apparently, for why we just, you know, bombed Iran. But as we're doing all this work, as we're doing all this work, we're talking to DHS, CISA occasionally also. So as this is all happening, Jim Jordan gets his gavel two years later and accuses us of censoring 22 million tweets, says that we were part of a vast plot by the Biden regime to steal the election by censoring 22 million tweets. So again, they claim that that number that we added up after the fact, of the things that everybody saw, they claim was really the stuff that we censored.
A
They're saying that... and forgive me if this is... but I'm, I'm just going to try and think of their theory of the case.
C
Yeah.
A
Is their theory of the case... and, and Casey, weigh in on this as well... that by pointing that out, you are in league, you are intimidating these social media platforms to go through and cull things you consider misinformation or disinformation, and by doing so, you are unleveling the playing field? Would that be their theory? And where does the 3,000 come in?
C
Well, the 3,000 would have been the thing to actually have that conversation about. The 22 million number was published in March of 2021, so long after... after January 6th, even, right? So long after, long after that would have had an impact. The way that platforms engage with researchers, which I think is worth the public understanding, is that platforms will reach out and they will periodically say, hey, we're considering doing a policy about this. What do you think? And then you can weigh in on that policy as an academic researcher who works in a particular field. This is not a secret, right? Twitter had a council of 60 different civil society organizations. We were not on that council, by the way, but Twitter had its civil society organizations council. And so whenever they were writing a policy about hate speech or about harassment or whatever, you know, that issue that those councils dealt with, they would reach out to those entities and they would say, hey, we're going to launch a new policy about this. You guys have an opportunity to provide some feedback. And the reason for this is because back in the 2016, 2015 timeframe, nobody on the outside was engaging with them at all, right? All of their policies were developed entirely internally. And that didn't make people happy either, because then it was just one guy, basically the CEO of the company, Zuckerberg, Dorsey, whoever it was at the time, making that determination. And so the idea behind coming up with councils or reaching out to academics is that you have an opportunity to say your piece. And again, as I mentioned, just because you say something doesn't mean that they listen to you. Which is why, when we published that report and we said we sent in 3,000 tweets, we were absolutely transparent about this. Again, it's sitting up there on the Internet for two years. And we also said they ignored, they did not act on, 60% of those 3,000 tweets. 
So what you see from that, again, is they did not feel pressured to do anything in response to what we were saying or what we were suggesting. They took things under advisement and they occasionally acted, but more often than not, they did nothing.
A
Only when it rose to a certain standard.
C
Right. And most of the time, they put a label on things. And that, I think, is also important to understand. So we were essentially scapegoated because Jim Jordan and the election deniers needed to come up with some justification after the fact for how the 2020 election was stolen.
A
Yes, the irony of investigating you for weaponization was itself weaponization. So that goes... But, Casey?
B
Yeah, well, I mean, it absolutely was. You know, they, you know, they wound up shutting down the center at Stanford where Renee worked, or at least, you know, prevented them from doing the kind of research that they were doing. You know, they're filing, like, lawsuits against undergraduates who, like, dared to study what was happening during the 2020 election. So the weaponization was truly coming from inside the House. But, but also, like, I, I really do not want to give these guys too much credit and say, like, you know, there, there was some, like, principle that they, they had to defend. Like, if you've watched these hearings, it truly is just about creating a spectacle and manufacturing this sense of grievance that will then enable Republicans to take further steps to disenfranchise American voters. Like, it, it really is that simple.
A
Well, let's see, though, if we can play devil's advocate and try and figure out, in the interest of fairness, what is the glimmer of truth within whatever it is that they're using to do the weaponization. So let's go back. I don't think you would dispute that the culture of these social media platforms had a liberal slant to it. I think we all probably agree it did. If you think about Facebook and Twitter and those companies, they are steeped in, probably, at least an aesthetic amongst the workers that leans maybe liberal. Would that be fair?
B
I think so.
A
Or did.
B
I mean, I think like the most liberal that they got.
A
Yeah.
B
was, like, if you looked at the content policies they had, they were, like, steeped in a tradition of human rights. You know, like, they believed that hate speech was bad and that you should try to stop people from seeing that if they were part of a protected group.
A
But also, in part, in the culture, when many more people were on the ramparts about the usage of certain words or various things. I'm just trying to get at, like, the psychology of where this is. So, Elon. They recognize that these are powerful tools. So we're going to walk back a little bit just to get to kind of the genesis of this. Mark Zuckerberg does his Zuckerbucks, spends $400 million, ostensibly to beef up resources. This is during COVID, so maybe they're putting up plexiglass on things, they're getting people more access. But he has the misfortune of spending $400 million on an election Donald Trump lost. Right, right. So that also becomes part of the narrative. So I'm just trying to walk through: the culture is, maybe you consider it liberal. Zuckerberg spends all this money. He doesn't do it ideologically. Combine that with then Musk, who is having, during COVID, an ideological rebirth, getting in touch with his South African roots, if you will. And we get into this idea of the Twitter files. He buys it because he's so disgusted by their censorship. And to be fair, during COVID, there was information that the government put pressure on these social media groups to remove, and that information they asked to be removed did not necessarily turn out to be wrong.
B
Yeah.
A
Would that be fair?
B
Yeah, there was definitely pressure, and just sort of, like, you know, things the government said related to, like, masking and how the virus is transmitted. There were things that the government said that turned out not to be true.
A
True.
C
I think it's also, one thing I'll note on that front: there's the reality, and then again, there's the exaggeration.
A
Yes. I'm trying to get to the reality as well, but, but go ahead.
C
Yeah. So there's a court case that you're possibly familiar with. Right. The Murthy v. Missouri case, Missouri v. Biden, where you see this litigated, and it goes all the way up to the Supreme Court.
A
Explain very briefly just the genesis of the case and, yeah, the background.
C
So two election deniers, the attorneys general of Missouri and Louisiana filed. No, I think it's really important. Again, no, I know. Motivation.
A
It's just so wild, like, two election deniers, the attorneys general of Missouri,
C
Louisiana, one of whom is now the sitting senator from Missouri, Eric Schmitt. Right. So look, I think, again, getting at the motivation is something that I feel like mainstream media dropped the ball on, candidly. And I'm going to be angry about that for a long time.
A
Understood. As you should be.
C
But so they file this lawsuit alleging that, again, there is a Biden censorship regime, which somehow started in the Trump administration. But holding that aside, they eventually stop focusing on the election, because they have to deal with the inconvenient reality that, at the time, these appointees were Trump appointees. So what they do to get around that inconvenient reality is they allege the deep state. Right. The unfalsifiable claim of the deep state. If you worked there at the time and something went inconveniently for Trump, it was the deep state. But then, holding that aside, we're just going to jump ahead into the future, and now it's Biden during COVID. And so you do see, again, the government reaching out and communicating with the platforms. Now, the government has First Amendment rights, and the government communicates with the platforms as well.
A
Now, very clearly, did the government under Trump also reach out to the platforms?
C
Yes. In other words, it still is.
A
Is that something that happened? 100%, across the board. Exactly. So that's an important thing to remember. Yeah.
C
And they're still doing it. Right? They're still reaching out today, complaining about platform moderation of ICE-related content. Right. So again, platforms and governments have had this back-and-forth tension since platforms have existed. Right.
A
And not to put too fine a point on it, but in all the complaints about Zuckerberg being intimidated by Biden, Donald Trump threatened to jail Mark Zuckerberg if he ever did anything like that again. So if we're ever going to be talking about government intervention and intimidation of a social platform, let's be fair: it's one thing for the government to reach out. It's another thing for the President of the United States to say, and I will put you in jail.
C
Yeah.
B
Imagine if Joe Biden had said, I'm going to jail Elon Musk if I lose the election. Right. Like, you know, conservatives would have lost their minds.
C
So. And in 2018, you also saw threats by Trump in an executive order even to try to revoke platform liability protection. Right. So platforms have liability protection.
A
That's that rule two.
C
Thirty. CDA 230. Yeah.
A
230. Yes, yes, yes.
C
So what happens in this lawsuit is that the attorneys general of Missouri and Louisiana sue before a particular judge. I'm currently being sued in front of that judge. So.
A
What? Why?
C
Well, Stephen Miller sued me in front of that judge. We can talk about that after. No, yes.
A
Wait.
C
Because this is a machine. You understand? Ghoul.
A
Stephen Miller. That Stephen Miller.
C
Yeah, yeah, yeah. Yes, yes.
A
Dead eye Stephen Miller.
C
Yes.
A
Stephen Miller that walks by plants and they die. That's Stephen Miller.
C
Yes, it's an honor, but, yes.
A
Wow.
C
Wow.
A
All right, we'll get into that later.
C
Yes. So. Don't let me forget that. They pick this judge, right? They pick this judge and they file this lawsuit, and they allege that the Biden administration did what's called jawboning. So, again, because the government has First Amendment rights, the government can communicate with the platforms. The question is, does it rise to the level of coercion? Does it rise to the level of the government saying, for example, nice platform you've got there, shame if anything happened to it. If you don't do this, or if
A
I'd have to jail you.
C
Hypothetically. Hypothetically.
A
Hypothetically.
C
And so they do all these depositions, right? And they're deposing Fauci. They're deposing, you know, FBI agents. They're deposing Department of Homeland Security agents. Because what they're trying to find is evidence that the government was secretly demanding that platforms take down content. And what they do encounter are these emails from, for example, Rob Flaherty, the White House digital director, where he is sending emails saying, like, what the hell happened here? You need to explain yourself. If you actually dig in, a lot of those emails that become very notorious are Rob Flaherty asking about the White House's own Instagram account. So, again, you have the grain of truth, where the White House is occasionally communicating with the platforms using strong language. But the stuff that they really blow up, the stuff that they really make, you know, these huge media moments out of.
A
This is the Twitter files and the whole thing.
C
Right. If you actually kind of delve down into it, what you find is that it's like the Twitter files: literally, they were taking emails and cutting them in half and pretending the top half of the email said something it didn't. So it's just the most incredibly dishonest misrepresentation of the actual evidence as they walk through these cases. And this is reflected then in the Supreme Court finding. The judge that they kind of cherry-picked in Louisiana says this is the biggest censorship effort the world has ever seen, and actually issues an injunction, this very broad-spectrum injunction saying the government can't possibly talk to platforms. This becomes a problem. The Fifth Circuit Court of Appeals, which is very conservative, actually walks back that injunction, which is a remarkable thing to see. It eventually makes it up to SCOTUS. Amy Coney Barrett writes the opinion, and she says there are clearly erroneous findings by the lower court; the evidence just doesn't stand up here. And she tosses it for standing, because what she says is that none of the plaintiffs in the case, for example, Jay Bhattacharya, right, who is now the, what is he now, jeez, CDC head at the moment.
A
CDC head. Yes.
C
It's like musical chairs with these guys and the health officials these days. So NIH head, CDC head, whatever his current role is. He accuses the administration of censoring him, but there's not a single email in which the White House so much as mentions him. And so Amy Coney Barrett tosses this back, tosses it for standing, and says the lower courts have these erroneous findings. And so it's basically kind of, you know, kicked back down.
A
But that's the kind of thing you get from these soy-latte-drinking justices. This Amy Coney Barrett, if I see her anymore with the little kitten-ear hat and the resist signs, I'll lose my mind. Casey, this points to a really interesting dichotomy. Yeah. The difference between the court of social media and the court of law. Yeah. And you find often that a lot of these weaponized complaints and all these things don't bear out. Let's go back to the 2020 election. None of their complaints withstood the scrutiny of courts, withstood the scrutiny of bodies that have evidentiary standards. But in many respects, that's not really what matters here, is it?
B
No. I mean, when you are just, you know, browsing your social media feed, your evidentiary standard is: does this feel true to me? Does this justify whatever I thought before I opened the feed? If so, I'm going to share it. And I think what's really scary about that is that the particular kinds of things that we're talking about today are being used as pretext to disenfranchise American voters. Right. Like, that's the ball game. Do we get to pick our leaders or not?
A
Well, let's talk about that for a moment. Because the idea is, this is not benign. No. The idea of just saying, oh, people need to present ID, on its surface sounds wildly reasonable, but underneath it are a lot of issues. Like, women who changed their, you know, maiden name to their married name would have to then somehow find their birth certificate and go get a passport. There are a lot of hoops to this.
B
Yeah, I mean, absolutely. I saw one study that said that there may be as many as 69 million women who took their spouse's name and don't have a birth certificate matching their current legal name. You know, women may be likelier, on balance, to vote for Democrats than Republicans. And so if you're a Republican and you're pushing this, you probably don't care that married women may be less likely to vote in the next election. Right. We also know this is probably going to affect a lot of trans voters. If you're a Republican, you'd probably be happy if no trans voters voted in the next election. Right. So when you look at the groups that are affected here, it is just generally people that Republicans could stand to live without ever voting again.
A
And just to put a fine point on it, the reason why we're talking about this today is the Republicans are considering blowing up the filibuster, which I really don't give that much of a damn about to begin with, and doing all kinds of things to pass this act that's going to raise identification standards so that only American citizens vote, which does not appear to be a problem of any substance, while protecting the actual mechanism that seems to be distorting American democracy. And I want to get into it. And Renee, I'll ask you this, because one, you could say, well, what's the game for them? The game is: look at Elon Musk's net worth. By creating this algorithm on this platform, by donating $350 million to Donald Trump and Republicans, his net worth has skyrocketed. And the AI tech guys, they've all benefited in a wildly disproportionate manner through their coziness to this administration. Would that be a fair statement?
C
That is a fair statement. It's about maintaining power. Right. One of the things with social media is that they're tools of reach, they're tools of persuasion, they're tools for organizing and gathering and activating. And when you have the capacity, particularly with something like Twitter, which is very, very good at activation, you are, you know, controlling an incredibly potent infrastructure. One of the things that happened after Elon bought Twitter is that you saw influential accounts on the left leave, right? And so there's been this fragmentation to a bunch of different platforms. You've got Bluesky, you've got Threads, but there hasn't really been any kind of cohesion.
A
There's no real competitor even as of now.
C
Yeah, there is no competitor in that regard, particularly for things like breaking news or shaping information in the moment. There's Instagram, which is great. You can grow very large accounts, very large reach. There are a lot of political influencers on Threads who are reaching left-leaning audiences. But it is not the same type of algorithm. It is just not the same structural function in the political discourse. And that is the thing that is significantly different.
A
And in terms of, I mean, we're talking about every one of these guys. And that's the other thing: it's not just the algorithm that skews, you know, our democracy, it's the money. And since Citizens United. I've just got a little list here. Elon Musk donated 250-plus million, right, and has gained 234 billion. Bezos paid $40 million for a Melania documentary and another 40 million, probably, in, you know, advertising and everything else, cut down the Washington Post; he's up, you know, $15 billion. Zuckerberg, whose Zuckerbucks were, you know, so crucial to stealing the election, did an investment pledge of, you know, and even, you saw the meeting, when he said to Donald Trump, how much should I say I'm giving? Yeah. All these guys, are they mercenaries? Are they just cozying up to an administration, or are they ideological brethren now with them?
B
So this is really important to talk about, because I think the true ideology is capitalism, right? Like, you go back into the 2010s, most of the people that you mentioned, with the exception of Musk, they were essentially good liberals. Although of course, you know, Musk sort of had his flirtations with liberal causes as well.
A
And shouldn't we have complained about them then, though? Shouldn't we have, just because we thought their aesthetic and their mentality was that? Shouldn't we have been complaining about their algorithms and their influence and their money then?
B
I think, like, going back, there were a lot of complaints about algorithms, that these places were increasingly becoming these centralized, like, centers of speech that did not have a lot of democratic oversight or control. Like, Mark Zuckerberg has total control over Meta. Even his own board doesn't get a say. Right, right. And so all these guys give a lot of money to Democratic presidents and Democratic causes, and in the end, they just don't get that much for it. Right. Joe Biden tries to break up Meta. He tries to break up Amazon. He creates various regulatory problems for Elon Musk. And at the end of the day, these guys are transactional.
A
Right. They hire Lina Khan.
B
Yes. The audacity of that. Right. But then Trump comes along, and you can just agree to build part of his ballroom, and he gives you whatever you want. That's a very recognizable character to a business person. Right. It's like we can just kind of strike a deal. So that is what you're seeing across American politics now is just a bunch of oligarchs who've grown impossibly rich and powerful, who are just able to buy what they want.
A
And protecting the turf. Where do you put Sam Altman in all this? He's another one that I, he just seems to be kind of this weird character that shape-shifts for whatever the moment calls for. You know, he'll stand up and say Anthropic is doing the right thing, and then vacuum up their contracts when DOD cuts them loose.
C
He.
B
He is a shapeshifter. Like, when you talk to people who have worked with him, they will tell you that one of their biggest issues with him is that he is always telling you what you want to hear. It's why he's actually, like, quite charming in person. Politically, he has probably been a little bit more, like, liberally aligned. Like me, he's a gay guy, and I think that's where his natural sympathies are. But if you ask him about Trump today, he's incredibly careful. You know, I asked him on stage about Trump last year, and he said, well, you know, I think he's really, really thoughtful about AI. It was news to me, Jon. But that's what he told me.
A
I'm a cereal guy. But I gotta tell you, when you're a little older, and not so easy to find, you know, it's not as cute when you're going through the, whatever they call them there, the stars, clovers, and mushrooms, and being like, oh, right. But my cholesterol is 187. Just saying, cereal is not necessarily the best thing for you anymore. Except now, Magic Spoon. Magic Spoon gives you that feeling of Saturday morning cereal while you get there: 13 grams of protein, 0 sugar, 5 grams of net carbs per serving, which is how I always chose my cereals when I was younger. I used to say to my mother growing up, what's my net carbs here? 5 grams, 7 grams? What are we dealing with? But this stuff, Magic Spoon, keeps you fueled, whether it's breakfast, late-night snack, post-workout, whatever it is. They got flavors too. It's not just one thing. You got fruity, frosted, cocoa, cinnamon crunch, marshmallow, s'mores, all the stuff that you love. Magic Spoon. Look for Magic Spoon on Amazon or at your nearest grocery store. There are plant-based versions of the cereal as well. Even vegans could feel like they had a childhood. You'll find vegan options at Whole Foods, or get $5 off your next order at magicspoon.com TWS. That's magicspoon.com TWS for $5 off. As you speak to these folks, their sense of Donald Trump. You know, look, Elon Musk said, and I asked him about this once, he said, I'm a free speech absolutist. So I said to him, so how do you support Donald Trump, who clearly has said he wants to censor content he disagrees with? He threatens to throw Mark Zuckerberg in jail. And he said to me flat out, oh, that's just bluster. But now, you see, they're weaponizing that censorship for FCC approval and all kinds of other things. Is he just utterly full of shit? I mean, he himself has said, oh, those people are treasonous and should be thrown in jail for saying things he disagrees with.
So he's just utterly full of shit.
B
Yeah. This is a man who, when he took over Twitter, started banning journalists because they put their Instagram bio in their, like, Twitter bio. You know, he rewrote an algorithm to privilege his own speech over that of others. He banned people from Twitter for publishing the whereabouts of his private jet. Like, the list goes on and on. The guy has never cared about free speech, except insofar as it benefits him, his own speech. Yeah.
A
Renee, you were going to say something?
C
I was going to say, I think it's really important to understand the word censorship. This is not something new; free speech activists on the web have cared about this for a very, very long time. Right. I mean, the freedom of speech, not freedom of reach thing that sits on top of his content moderation policy was something Aza Raskin and I came up with in 2018. Right. That argument that you should be able to maximize content, you should want as much to stay up as possible. And then at that point you think about, how do you decide what to curate? How do you decide what to preferentially amplify when you have a crisis like COVID? It is not bad for a platform to decide, hey, maybe we should, in response to search queries, have a little knowledge panel up at the top that returns something from the CDC. Right. Because people are looking for accurate information. They made perfectly reasonable decisions to uprank stuff. Did they take too much down? Yes, you can make the argument that they absolutely did.
A
No, I think that's fair. I think they did. I mean, and I think they would even maybe cop to that now.
C
I think that they would say that. I think that they would say that too. Again, when you get to certain types of content moderation policies, like the lab leak hypothesis that became, you know, such a thing. I thought that was a stupid policy. Right. I thought that was a very dumb policy. But I also want to say, because I think people don't realize it: that was a Meta-only policy. Twitter didn't do that. YouTube didn't do that. Only Meta had that policy. And they had it for three months. Right. So it sounds like a thing that was, you know, in place for two years, and it actually wasn't. And I encourage anybody to look into this, because the policy documents are there. You can go look at them. That aspect of the transparency thing is there. What you see from Elon, though, is that he borrows the moral weight of the word censorship while emptying it of moral content. And that's why I think.
A
Bars. Come on, Renee, lay it down. That's exactly right.
C
No, but what killed me: Casey and I were on Kara's pod together when Elon started arguing that his AI had the right to nudify children. Right. And that if you said his AI didn't have the right to nudify children, I just want to say that again, you were censoring him. That that was an act of censorship.
A
And at the same time, in Turkey, he just took down the opposition. Oh yeah. Completely under the guise of, like, hey, that's the law that they have, so I just have to follow that.
B
Like, what are you gonna do?
A
Yeah, it's all nonsense.
C
So it becomes a shield, right? It becomes a mental stop word, where the minute you say that word, people hear it and they stop thinking about what it is he's actually justifying with that word. He used it to justify the nudification of children, the non-consensual nudification of women.
A
Now, that's based on his AI or something, right? Yeah, an AI model that did that.
C
Yes. And so if we say that moderation of that kind of content is censorship, then that concept has just lost all meaning. Right. And that's where you can't cry free speech absolutism and make that claim, in my opinion.
A
I also want to get into, there's a distinction here, and I think it's a really important one. There's a difference between free speech and algorithmic speech. Algorithmic speech is ultra-processed. And I generally do the distinction of, you know, Twitter speech is free speech in the way that Doritos are food. Like, it's not really. It's processed. You know, free speech isn't: we let everybody know when you tweet, and we incentivize them to hostility and outrage, and we monetize their ability to argue. And the algorithm is not free speech. It's just not. It puts it through an opaque process that elevates certain speech, speech that you deem more important or speech that your business model deems more monetarily beneficial. And so how do we draw the distinction between this idea of free speech and the algorithmic speech, which is a perversion of speech?
C
Well, I think it's also important for people to understand that on a social media platform, the First Amendment right is the platform's; it is not the users'. It is the platform's First Amendment right to decide what it editorially curates. That's what algorithmic curation is. This has been reinforced legally over and over and over again. This is why, ironically, the conservatives lose their must-carry law cases. Right? That is why platforms are allowed to take things down. And on the flip side, it is why, you know, again, it allows platforms to leave everything up or take everything down. It allows them to pendulum-swing in accordance with new leadership coming in. Right? Because the First Amendment right on a private platform belongs to the platform, that is, to the company that is making the editorial curation decision. It doesn't belong to the user. And this is a thing that frustrates a lot of people. This is where you hear the complaints that the platform is censoring me. In reality, the platform is deciding what to uprank, what to downrank, and how to set the policies.
A
And by the way, if, you know, somebody came after Elon Musk for, you know, whatever it is, downvoting something or constricting that thing or making those decisions, the Republicans would be the first ones to say, hey, that's his platform, that's his First Amendment right.
C
Yeah. This was why, to avoid regulation, they started to have these self-regulatory mechanisms. That's what all those councils and, you know, the periodic outreach to academics, the periodic outreach to government, that's what all of those things were. It was the, hey, you guys can weigh in, you can give us some feedback, so that it doesn't look like we're making these decisions unilaterally. And in a way, you could argue that, for a time, I think that they were trying to be good citizens. Maybe, maybe I'm giving them too much credit.
A
But no, they've been unleashed with what they consider animal spirits now, I would assume. Casey, what were you going to say?
B
You know, the thing that I just want people to remember whenever they're looking at a feed, whether it is X, Instagram, anything that is ranked in this way: you are staring at something that has been engineered to hypnotize you. You know what I mean? And what hypnotizes you? Conflict, outrage, weird stuff, sexy stuff, stuff that's going to produce a really strong emotional response. So what I try to train myself to do, and I struggle with this too, is, like, when I see something online that makes me feel a very strong emotion, that is the moment that I'm trying to be the most skeptical. That is when I'm saying: wait, who is posting this? Why are they posting it? What are they trying to get me to feel? And that is the kind of core tension that you're just always going to experience when you're looking at an app like this.
A
So as we sort of break it down now, you know, the Republican focus is on this so-called Save America Act, which is going to safeguard our elections. But, you know, our premise is that the real threat comes from this algorithmic manipulation of our speech, combined with the unceasing amount of money that can be thrown into the pot by these new Gilded Age, whatever they are, you know, robber barons. How do we find our way? If we take the analogy of algorithms to processed food, is there an ingredients list for speech? How do we label that? You know, Community Notes. I think Community Notes does a very nice job. I actually agree with that. But I also think Community Notes is still weaponized politically. I would like to see a Community Note for good faith and bad faith. I'd like to see some kind of good faith, bad faith grade, like the way they do in restaurants in New York City. If you see a C in the window, you are not eating their soup. Are there tools that can be in this arena for us?
B
I mean, to me, I think the sort of fast food analogy is a really good one. But, like, the solution to McDonald's is not, like, be really careful inside the McDonald's and always try to order the salad. It's: don't go to McDonald's too much. And I think we need a slow food movement for the media. The good news is I think we already have one. I think podcasts are actually a pillar of this slow food movement. When you hear, you know, three people talking about something for an hour, you're probably going to get a richer and more nuanced picture.
A
You really haven't listened to this podcast much, have you? Because that is not my milieu.
B
Well, I think you're doing a lot of good here, Jon.
B
But, you know, newsletters are also part of this, right? Not every newsletter, but I think there are a lot of really smart people who are just kind of sharing their thoughts in this very long-form way. So I just think we need to find other strategies like that. The strategy is not what will make Instagram better, because I just think that's probably a losing game.
A
Oh, that's interesting. Renee, do you agree that, you know, the idea of kind of urging them to become better citizens is not going to bear any fruit?
C
I think people have been trying to do it for a decade now. I feel a little bit discouraged on that front too. You know, I have kids. My oldest is 12, and.
A
Oh, you're right there, man. You're on the cusp.
C
No, no, we really are. I mean, I feel like the cusp is, like, fourth grade now. It's, like, nine.
A
Even though he's not.
C
He's not on social media. But, you know, you see these kids. They realize that, like, they can turn Google Docs into a chat app, because everybody's on a Chromebook. And then they can take a YouTube link, throw it into Google Docs, and it'll play embedded in Google Docs. They're so far ahead of us. And you're, like, fighting with your kid to not watch, trying to make your kid, like, maybe you should pay attention in math instead of watching some degenerate streamer. Right.
A
You know, with kids, you always feel like you're the Iranian regime and they're a VPN and they're just.
C
They just get around everything. But I mean, the reason I was thinking about this as Casey was talking, though, is, you know, you try to emphasize: make good decisions. Right? Look, I can't keep him off it. Right? I can say, I'm not going to let you have a social media account. I understand you're going to watch YouTube. Let me explain to you where some of this stuff comes from. Because a lot of what's really interesting with middle schoolers is, like, meme culture for us in 2016 is so normalized for them. A lot of the stuff, even, like, manosphere content, it's just so in the water at this point. It's just there.
A
Does that help it lose its effect? In some respects, like, when it first comes out, it's novel. Like, look, this is a relatively new form of communication, and television and radio created disruption. Hell, the printing press created disruption. Everybody thinks, oh, the printing press happened and that ushered in the Enlightenment. It really didn't. It ushered in 200 years of war.
C
Yeah.
A
Killing people and, you know, burning witches. Are we in a period of adjustment where your kids won't be affected in the same way because it's native to them?
C
I think that they know that it's not good. I think that, for a lot of them, they don't want social media. They're not looking for it. I used to hear this. We would have high school students over to Stanford, actually, a fair bit, and there would be a lot of high school students who were like, I just don't want to be on it. I just don't see the point. I don't feel good about myself when I spend hours scrolling a feed, and so I don't do it anymore. So you do hear a little bit of that. And nothing is more humbling than having your five-year-old say, is that a no, Mom, or is that a you're-distracted-on-your-phone no? Right.
A
Yikes. How is it that they always see right through us? I hate that about them.
C
Right? Yeah. My kids are 12, nine, and five, you know, and the five-year-old recognizes that sometimes she's getting a distraction response as opposed to an actual response. And they will absolutely say it to you, because they recognize what's happening. It's a very humbling moment, realizing that you are just as addicted, probably more so, than they are. So is it normalized? I struggle with that, knowing what I know. I mean, this is my job, to look at the worst stuff on the Internet. But how do you keep your kids away from it, or at least keep it from penetrating?
A
But I think, and I want to ask you, Casey, you talked about sort of relating it to the slow food movement, and I've seen things like that, or farm to table, and they always become New York Times Style section columns, but they never actually become ubiquitous within the culture. And I wonder, are we thinking about it the wrong way? Because in some respects this is a battle. And what I see a lot on the left is: that guy's giving misinformation, so we've got to stop that, or we've got to take that guy out. And I always view it very differently, which is: no, you fight information with information, and you have to fight it just as tenaciously. You know, I've been locked in battles with, I remember when we were trying to do the PACT Act. It was a burn pit bill for veterans. You would think, who's going to be against that? But there was a very strong group of weaponized right-wing influencers who spread misinformation about that bill. And we could have gone the route of: you've got to take this down, you've got to put a community note on it. But what we did is we went right at them as tenaciously as we could. And is there a model in that? You know, it's been shocking to me, it's malpractice in terms of social media companies, that nobody has created a viable competitor yet for Twitter. Like, that blows my mind.
B
And, you know, people are trying, right? Like, Bluesky is trying, Threads is trying. I think these networks get really entrenched, and they're difficult to disrupt. You know, what you just said made me think of, like, what Gavin Newsom is doing.
A
I thought that's been very effective.
B
Yeah. I mean, I think people have a lot of different feelings about it, but you cannot deny that the guy is down in the trenches and he's fighting the fight on the terms that Trump has created. And I do think it is worth someone doing that, to kind of see what happens.
C
I wrote this article. Candidly, it was after Stanford shut down the Internet Observatory, and I was pissed. But it was an article basically saying, you have to fight. I mean, per the point, right? There was such a capitulation on that front. What happened when Stephen Miller sued us and all the subpoenas came down?
A
Oh, yeah, right. I forgot about that.
C
Let's.
A
And in all the hubbub, I forgot that the Undead has filed a lawsuit.
C
Yes.
A
Again, against Renée. Talk to us a little bit about what, and why.
C
Well, it's pending litigation, so I can't go into the details, but the basics I can cover. Basically, it's the same lawsuit. It alleges that we. So they found a plaintiff, somebody we'd never heard of, never talked about, but she lives in that district. So that's how they get their judge. Right. They alleged that in our communication with the platforms, when we said, for example, Gateway Pundit wrote a false story, or if we tagged it. Gateway Pundit wrote a lot of false stories in the 2020 election. I'm just going to say that again.
A
It's kind of Gateway Pundit's thing.
C
Yeah. They said that that was us acting as, like, basically de facto agents of the government. Right. They alleged that DHS was secretly puppet-mastering us to do it, and so we were violating the civil rights of the people that we talked about, or that we wrote about, or that we flagged, or that we talked to platforms about. I mean, this is going to get dismissed eventually. It's such a stupid theory that it makes no sense at all. But the point is to tie you up in legal bills for three years and to shut you up. Because, again, like I said, I can't really go into the details of it.
A
Just out of curiosity though.
C
Yeah.
A
I guarantee you Stephen Miller and DHS are in touch with those platforms directly.
C
100%. Yes, absolutely they are. They don't even hide it.
A
So wouldn't that be somewhat of a defense?
C
Well, Jon, you're assuming that, like, hypocrisy matters. I mean, again, like you said, eventually the courts come down in the realm of reality. But in the meantime, and this is the point I'm trying to make, the way that the institutions decide whether to persist is in the court of public opinion. Do they feel pressure? Are they going to be constantly sued? Are they going to be constantly harassed? Are people going to be constantly complaining about the university? Is there going to be reputational damage? And that is what ultimately makes a lot of these institutions decide the best thing for them to do is to shut up and say nothing. It is the wrong approach. It has been the wrong approach for several years now. But making them understand that you have to fight in the arena is a foundationally different shift. Because they're hearing from their PR people, who were trained in the 1990s era of crisis comms, that you shut up and you let the media cycle move past. And they don't understand that there's no such thing as the media cycle in the age of social media, when they can just, you know, kick it back up again once they've made you a character in a cinematic universe.
A
And there's no such thing as a frictionless existence. This idea that they think if they just make themselves small enough they can live in a frictionless existence is ridiculous. Because what you end up making yourself is a tasteless pablum, so inoffensive that it serves no one and does nothing. Casey, maybe you have a sense of this. I don't think there's any question that this is a bioengineered product that is designed to escape whatever is self-protective within the human brain, to continue monetizing your attention and your life. They care about nothing. It is a parasite. While it can be viable and good, it is nefarious in its intent and monetized in its intent. How the hell does a product that powerful, that dangerous, go utterly unregulated for any of its effects? And how do these companies escape any liability?
B
It's a great question, and I think there has been a lot of progress lately, actually, in trying to shift that discussion. I would say that prior to 2020, we didn't really think of these products in the same way that you would think about other regulated goods like alcohol or tobacco. People tried to pressure them to make different sorts of content moderation decisions, and, well, maybe if the algorithm worked a little bit differently, we would all be happy again. But you fast forward to today, and what people are saying is: actually, your product is just broken. Actually, it's not safe for anyone to use if they're under 16. It's probably not safe for us either, but let's at least start with the children. And so you look around the world, and country after country is now saying, we're actually just going to ban this stuff until you turn 16. Because, again, while it's probably very bad for adults, we are increasingly confident that it's bad for children. And so I just think that's kind of the start. And maybe this is just cope, but part of me wants to think that, in the same way that banning children from smoking eventually led a lot of adults to stop smoking too, I wonder if we're not going to see something similar for social media.
A
Right. No, I think that's quite possible. Is there a warning label, Renee, that we could possibly come up with? This is your brain on Facebook.
B
The Surgeon General advocated for that.
A
Oh, wow.
C
This was actually an idea that was making the rounds for a while. I feel like the Center for Humane Technology had some ideas around, like, switching your phone to grayscale so it was less appealing. I think there's actually some scientific basis to that. Yeah, like, it's just less appealing.
A
Well, but, Renee, I switched my face to grayscale, and it's clearly less appealing. Well, guys, I very much appreciate you taking the time. This has been absolutely fascinating and really helpful in understanding it. Renée DiResta, associate research professor at the Georgetown McCourt School of Public Policy, and also the author of Invisible Rulers: The People Who Turn Lies Into Reality. And Casey Newton, he's got Platformer and is of course co-host of Hard Fork. So, guys, thank you so much for being here.
C
Thank you for having us.
B
Thank you, Jon. It's great to be here.
A
Man, I enjoy a nice expert panel. Those guys know their stuff. I must apologize to all of you: our wonderful panel, Lauren Walker, Brittany Mehmedbegović, and Jillian Spear, will not be joining me today. We had what can only be described as a technical malfunction, which, as many of you know, is also my nickname. So we weren't quite able to work it all through. I'm going to say the cloud, even though I know I'm pulling that out of my ass. But really appreciate it, and they will obviously be back next week. And I want to shout them out again because of their work giving me the information to allow me to have a cogent, coherent conversation with people who are expert in their field. So: lead producer Lauren Walker, producer Brittany Mehmedbegović, producer Jillian Spear, video editor and engineer Rob Vitolo, who did yeoman's work even getting this thing done, and audio editor and engineer Nicole Boyce. I don't know, I think they're pulling all-nighters just to get this thing out by Wednesday. And our executive producers, Chris McShane and Katie Gray. We will see you guys next week. Bye-bye. The Weekly Show with Jon Stewart is a Comedy Central podcast. It's produced by Paramount Audio and Busboy Productions. Ryan Reynolds here from Mint Mobile. I don't know if you knew this, but anyone can get the same premium wireless for $15-a-month plan that I've been enjoying. It's not just for celebrities. So do like I did and have one of your assistant's assistants switch you to Mint Mobile today. I'm told it's super easy to do.
C
At mintmobile.com/switch. Upfront payment of $45 for 3-month plan, equivalent to $15 per month, required. Intro rate first 3 months only, then full-price plan options available. Taxes and fees extra.
B
See full terms at mintmobile.com. At Blinds.com, it's not just about window treatments. It's about you: your style, your space, your way. Whether you DIY or want the pros to handle it all, you'll have the confidence of knowing it's done right. From free expert design help to our 100% satisfaction guarantee, everything we do is made to fit your life and your windows. Because at Blinds.com, the only thing we treat better than windows is you. Visit Blinds.com now for up to 45% off with minimum purchase, plus a professional measure at no cost. Rules and restrictions apply.
C
Paramount podcasts.
Original Date: March 18, 2026
Jon Stewart hosts Casey Newton and Renée DiResta to examine what they identify as the "real" threat to American democracy: not mythical widespread non-citizen voting, but instead the unchecked power of social media algorithms and disinformation. The trio explores how viral rumors, politicized platforms, and billionaire influence shape—and potentially warp—public perception and electoral outcomes. They also dive into policy battles like the SAVE (Save America) Act, the weaponization of misinformation, and the precarious relationship between tech giants and government over content moderation.
“When you are just browsing your social media feed, your evidentiary standard is: does this feel true to me?... The particular kinds of things that we’re talking about today are being used as pretext to disenfranchise American voters. That’s the ball game: Do we get to pick our leaders or not?”
— Casey Newton [42:27]