Ryan Knutson
So are you aware that there's like this trend at the new year where people come up with lists for what's in and what's out?
Jeff Horwitz
I am aware of this. It's a very sad phenomenon.
Ryan Knutson
You really don't have an in and out list for yourself?
Jeff Horwitz
I don't have an in and out list for myself.
Ryan Knutson
Well, if you were to come up with one for, let's say, Facebook, what would be on its in list and on its out list?
Jeff Horwitz
I mean, in would be getting along with the new administration and out would be content moderation.
Ryan Knutson
Our colleague Jeff Horwitz covers Meta, Facebook's parent company. For nearly a decade, Meta and many other social media companies have taken an active role in policing content on their platforms, taking down posts with hate speech or sticking fact-checking labels on viral content. But now, for Meta, that era is over.
Mark Zuckerberg
Hey everyone, I want to talk about something important today because it's time to get back to our roots around free expression on Facebook and Instagram.
Ryan Knutson
Earlier this week, Meta CEO Mark Zuckerberg posted a video announcing big changes to how Facebook, Instagram and Threads will moderate content.
Mark Zuckerberg
The bottom line is that after years of having our content moderation work focus primarily on removing content, it is time to focus on reducing mistakes, simplifying our systems, and getting back to our roots about giving people voice.
Jeff Horwitz
So it's sort of this like wholesale departure rejection of the idea that the platform should even govern itself, rather than like a question of is it doing a good enough job. It's like, nah, this isn't our job.
Ryan Knutson
Welcome to The Journal, our show about money, business and power. I'm Ryan Knutson. It's Thursday, January 9th. Coming up on the show: Meta's major retreat on content moderation.
Ryan Knutson
Let's go back in time for a minute. Can you tell me the story of where Facebook's content moderation system came from?
Jeff Horwitz
I mean, in any, like, truly substantive sense, it was post-2016. So there was this succession of scandals and crises, from fake news in the United States…
News clip
Facebook now under fire. Critics say it allowed fake news to spread on the platform.
Jeff Horwitz
…to Russian election interference…
News clip
ABC News has obtained some of the posts and ads congressional investigators say the Russians planted on Facebook as part of the Kremlin's…
Jeff Horwitz
…to the genocide in Myanmar.
News clip
Facebook has been accused by UN investigators and human rights groups of facilitating violence against the minority Rohingya Muslims in Burma.
Jeff Horwitz
All of these things were sort of pointing in the direction of the company needs to do more. And for a few years, the company really did.
Ryan Knutson
Initially, Zuckerberg was reluctant to moderate content beyond what was legally required. In November 2016, he said, quote, we must be extremely cautious about becoming arbiters of truth ourselves. But there was so much pressure on Facebook from lawmakers, legacy media outlets, and advertisers that Zuckerberg eventually gave in, and the company spent billions of dollars building out a massive content moderation machine. At the peak of Facebook's content moderation system, like, when it was doing the most, what did that look like? What did it actually do?
Jeff Horwitz
A few things. There was, first, just an army of human moderators, mostly outsourced. Then there were efforts to improve the quality of news on the platform, like to sort of weight the system in favor of outlets that users said they trusted rather than, like, kind of random fly-by-night headlines. And there were also automated systems sort of scouring the platform for things that were violations of the platform's rules and demoting them, if not outright removing them.
Ryan Knutson
In addition to those outsourced moderators who reviewed user posts for things like hate speech, Facebook also started using fact checkers to identify and label false or misleading articles.
Jeff Horwitz
Fact checkers were in some ways the beachhead for all of this stuff. And ironically, I think it was one of the things that they felt was the least controversial originally, which is that they were going to contract with respected, known entities that adhered to journalistic standards, had accreditation.
Ryan Knutson
News organizations like the Associated Press.
Jeff Horwitz
Yeah, like the Associated Press, like PolitiFact, Reuters, I believe, was in there for a while, Snopes. So these were all entities that were basically supposed to just find viral content that was false and flag it. And that would serve two purposes. One is it would notify users who saw the post that there were, at the very least, some serious reservations about its accuracy. And the second is that it was going to get used by Facebook to sort of slow down the spread of things. It didn't mean the content would go away or they'd take it down. It just meant that they weren't going to actively promote it. And this initiative was, I think, a big part of the company's efforts to deal with fake news on its platform.
Ryan Knutson
How effective was this system? Did it seem to actually work to slow down the spread of misinformation and hate speech on Facebook?
Jeff Horwitz
There's two ways to answer that, right? The first is that, no, it never worked great. The second is that it was all the platform had, and it was therefore invaluable. Fact checks were always slower than virality. By the time a fact checker got around to publishing something saying, this is false, here are the citations, usually that thing had gotten most of the traffic it was gonna get, right?
Ryan Knutson
The lie gets halfway around the world before the truth puts its pants on, right?
Jeff Horwitz
Yeah.
Ryan Knutson
And that also applied to Facebook.
Jeff Horwitz
And the lie travels a lot faster when it's running on a social media platform's recommendation system. Right.
Ryan Knutson
But what you're saying is it was still better than nothing. It still did…
Jeff Horwitz
This was the core of the defense. Yeah.
Ryan Knutson
The other issue with Meta's content moderation systems was that the rules around what people could and couldn't say became very detailed and nuanced, and the company struggled to make rules that prevented hate speech but still allowed users to express themselves. For instance, Facebook's rules didn't let users say things like "I hate women," but such a comment might be okay if it was referring to a romantic breakup.
Jeff Horwitz
Adjudicating this stuff was, like, really difficult. And also, whatever a human might decide about things like that, trying to get an algorithm to make a similar decision was, like, nearly impossible. So it was, I think, a very frustrating process, particularly for a bunch of very tech-minded people, because these were not problems that lent themselves to, like, just building a technical solution, setting it loose, and moving on to the next problem. They were, like, kind of chronic, societal, context-based. All the things that tech does not succeed at, if that makes sense.
Ryan Knutson
Inevitably, the system made a lot of mistakes. Many people had posts taken down or suppressed that actually didn't violate the rules, and it frustrated users. Even Zuckerberg himself got frustrated, according to people familiar with the matter. In late 2023, he posted about tearing his ACL and got far less engagement than he expected because the platform deprioritized viral posts about health. Conservatives especially felt like Facebook's content moderation systems unfairly targeted them.
Jeff Horwitz
There's the question of, like, why are Christian posts about abortion getting taken down by fact checkers? Why are so many conservative news outlets being penalized for false news? Why wasn't I allowed to say this particular word about that particular group when I can say it on the street and it's perfectly legal? And so there was a lot of concern that Facebook was over-enforcing on the right. And every time there was a moderation mistake, and a lot of them happened, you know, there was the implication that this was Facebook acting to harm conservatives.
Archival clip
These social media outlets censor public figures who are conservatives.
News clip
Facebook had purposely and routinely suppressed conservative stories from trending news.
Archival clip
If you share my podcast on Facebook… I got hundreds of emails from people who said that this post has been marked as spam.
Archival clip
When people try to type my name into the search box on Instagram or the Discover box, my account does not show up.
Archival clip
It's relentless. I'm thinking about quitting Facebook.
Ryan Knutson
So over the years, Facebook made lots of tweaks to its content moderation system. For example, in 2019, it said it would stop fact-checking political ads. But overall, the system stayed in place. How did Mark Zuckerberg seem to feel about all these efforts?
Jeff Horwitz
Mark never loved this stuff. Mark, I think, was always deeply skeptical of human intervention of any variety in the system.
Ryan Knutson
And then Donald Trump's victory in the 2024 election gave Zuckerberg an opportunity to make a big change.
Jeff Horwitz
I think it's kind of part of an overall, I would say, shift in Silicon Valley in which dominant companies have all been making extremely nice with the incoming administration. And obviously, Donald Trump had made clear that he very much cared about whether Facebook was stacked against him and that he did not trust the company and its liberal employee base to be fair, and that, you know, he might go after Mark Zuckerberg personally if Mark Zuckerberg didn't oblige.
News clip
Trump writes in the caption that Zuckerberg plotted against him during his reelection campaign and warns Zuckerberg, quote, if he does anything illegal this time, he will spend the rest of his life in prison.
Jeff Horwitz
Again, I don't know that anyone had to twist Mark Zuckerberg's arm about this. He was personally aggrieved by it. It was costly and expensive, and he never had an interest in doing it in the first place.
Ryan Knutson
So what about the content moderation system that was there? And what do these new changes mean for Meta's platforms? That's next.
Mark Zuckerberg
I started building social media to give people a voice.
Ryan Knutson
In his video earlier this week, Zuckerberg said that politics was a factor in the company's decision to dial back on content moderation.
Mark Zuckerberg
The recent elections also feel like a cultural tipping point towards once again prioritizing speech. So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms.
Jeff Horwitz
And he is explaining that the company spent a number of years trying in good faith to placate its critics in the legacy media over things like fake news, et cetera, and that it did its best, but that it really went too far.
Mark Zuckerberg
We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes. Even if they accidentally censor just 1% of posts, that's millions of people. And we've reached a point where it's just too many mistakes and too much censorship.
Ryan Knutson
Zuckerberg said Facebook would end its fact-checking program and replace it with Community Notes, a user-generated system similar to what X does.
Mark Zuckerberg
We're going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse. What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas, and it's gone too far.
Jeff Horwitz
There are some very specific changes to rules that, like, are hard to read as anything other than the company trying to get out of the way of future controversies. So, for example, the question of whether you can call transgender people "it": that is specifically a carve-out that you can now do. It doesn't violate the hate speech rules on Facebook anymore, but it used to.
Ryan Knutson
Previously it did.
Jeff Horwitz
Yeah, it used to. Likewise, you can say that homosexuality is a mental illness; that previously would have been a violation. Likewise, you can liken women to household objects and liken them to personal property, right? And, like, you could see where some of the cultural fights are that the company is very directly responding to in those specific rule changes. And the specific rule changes are, like, the least important part of all of this. But I think they're a pretty good indication of the idea that the company is very happy to sort of bend not just its moderation structure but also the rules themselves to this new era, as Mark Zuckerberg referred to it.
Ryan Knutson
Meta isn't getting rid of content moderation entirely. There are still rules that prohibit what Zuckerberg called, quote, legitimately bad stuff, including terrorism and child exploitation. Zuckerberg also said that the team that reviews U.S. content will be relocated from California to Texas, where he said there was less of a concern that employees would be biased. So what do all these changes at Meta say about just how difficult it is to moderate content on the Internet?
Jeff Horwitz
I mean, it's obviously difficult. And these are not, you know, you don't solve misinformation, right? Trying to, like, somehow make the Internet always factual is clearly not going to work out. But there's the question of, do you keep trying at this stuff? Does the platform take some responsibility for efforts to mislead and try to address those, or is that just kind of not in scope? So I think this is less of a question of the feasibility of doing it than it is a question of whether anyone wants to do it at all.
Ryan Knutson
I see. And so for Zuckerberg at Meta, it seems like he's decided that this is something that he doesn't really want to do.
Jeff Horwitz
Exactly. Cutting out fact checking, getting away from content moderation, prioritizing not making false moderation calls over doing effective moderation in the first place. These are things that Mark actively wanted to do, honestly, from the first Trump term. This wasn't a new, sudden desire or willingness. It was that, I think, the circumstances were such that it was time to make this happen.
Ryan Knutson
Where do you think Facebook is going to go from here? Do you think this is the end of the story when it comes to Facebook's content moderation efforts, that this will be the regime that will last long into the future? Or is it possible that this could all change again in a few years?
Jeff Horwitz
Of course this will all change again at some point. The EU is going to have strict rules, so how much of this stuff applies to the EU is kind of still to be determined, like whether getting rid of fact checkers is something that the EU will tolerate. And I think something else that is a question is what the user experience is. Something that a lot of people who worked at Facebook on integrity issues have long believed is that, as much as the company resented basically having this bunch of safety-minded nerds telling them you shouldn't do that, the safety work was essential to making the platform a livable place; not necessarily always clean and well lit and, you know, perfectly civil, but livable. And I think it will be interesting to see to what degree they were correct and to what degree, you know, this was all just sort of vanity and ineffectual. I think we'll certainly see what Facebook and Instagram look like in this new era as well and, you know, see how far it goes and what that means for users.
Ryan Knutson
That's all for today. Thursday, January 9th. The Journal is a co-production of Spotify and The Wall Street Journal. Additional reporting in this episode by Alexa Kors and Megan Bobrowski. Thanks for listening. See you tomorrow.
Podcast Summary: The End of Facebook’s Content Moderation Era
Episode: The End of Facebook’s Content Moderation Era
Release Date: January 9, 2025
Hosts: Kate Linebaugh, Ryan Knutson, Jessica Mendoza
Co-Production: Spotify and The Wall Street Journal
In the January 9, 2025 episode of The Journal, host Ryan Knutson spoke with reporter Jeff Horwitz about a pivotal shift at Meta Platforms Inc. (formerly Facebook Inc.): the rollback of its longstanding content moderation practices. The episode delves into Meta's decision to scale back its content policing mechanisms, exploring the motivations, implications, and future prospects of this significant change.
Jeff Horwitz provides a historical context, explaining that Facebook's robust content moderation system emerged post-2016 in response to a series of global crises. These included the proliferation of fake news, Russian election interference, and the spread of hate speech contributing to ethnic violence, such as the genocide against the Rohingya Muslims in Myanmar.
"All of these things were sort of pointing in the direction of the company needs to do more. And for a few years, the company really did." (03:09)
Initially, Facebook's CEO, Mark Zuckerberg, was hesitant to engage in extensive content moderation. However, mounting pressure from lawmakers, media outlets, and advertisers led to significant investments in building a comprehensive moderation infrastructure. At its peak, Facebook employed a combination of outsourced human moderators, automated systems, and partnerships with reputable fact-checking organizations like the Associated Press and Reuters to identify and mitigate misleading or harmful content.
In a pivotal announcement in January 2025, Mark Zuckerberg signaled a dramatic overhaul of Meta's content moderation strategy.
"The bottom line is that after years of having our content moderation work focus primarily on removing content, it is time to focus on reducing mistakes, simplifying our systems, and getting back to our roots about giving people voice." (01:23)
This shift entails ending the previous fact-checking programs and introducing a user-generated system called Community Notes, akin to mechanisms employed by platforms like X (formerly Twitter). Additionally, Meta is relaxing several content policies, particularly those surrounding sensitive topics such as immigration and gender, which Zuckerberg argues have become "out of touch with mainstream discourse."
"We're going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse." (13:48)
The episode also revisits how stringent moderation became a flashpoint. Jeff Horwitz highlights that the old system produced numerous unintended consequences, including the suppression of posts that did not break the rules, and fueled allegations of bias against conservative and right-leaning users.
"There's the question of, like, why are Christian posts about abortion getting taken down by fact checkers? Why are so many conservative news outlets being penalized for false news..." (09:09)
Conservatives and other groups voiced strong opposition over the years, claiming that Meta's moderation systems selectively targeted their content. Instances cited include posts being marked as spam, reduced visibility for certain profiles, and broader claims that Facebook was enforcing a liberal agenda.
"If you share my podcast on Facebook, I got hundreds of emails from people who said that this post has been marked as spam." (09:50)
"It's relentless. I'm thinking about quitting Facebook." (10:05)
Zuckerberg attributes the pivot away from heavy content moderation to a combination of political pressure and a desire to return to the platform's foundational principles of free expression. He argues that the existing moderation systems were overly complex and prone to errors, leading to excessive censorship that alienated a significant user base.
"We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes. Even if they accidentally censor just 1% of posts. That's millions of people." (13:21)
Furthermore, the relocation of the U.S. content review team from California to Texas reflects Meta's intent to minimize perceived biases within its moderation workforce.
"We've reached a point where it's just too many mistakes and too much censorship." (13:21)
The episode explores the broader ramifications of Meta's shift. Jeff Horwitz notes that while the company is retreating from active content moderation, fundamental issues like misinformation and hate speech persist online. He posits that Meta's decision may be temporary, influenced by the political climate, and could face significant challenges, especially with impending regulations from entities like the European Union.
"Of course this will all change again. At some point. The EU is going to have strict rules." (17:22)
Moreover, there's a debate on whether Meta's reduced moderation will lead to a more open platform or degrade the user experience by allowing harmful content to proliferate unchecked.
The end of Facebook’s content moderation era marks a transformative chapter for Meta Platforms Inc., reflecting broader tensions between free expression and the responsibilities of social media companies. As Jeff Horwitz aptly summarizes, the debate hinges not just on the feasibility of content moderation but on the willingness of corporations to prioritize platform integrity over operational simplicity.
"So, I think this is less of a question of the feasibility of doing it than it is a question of whether anyone wants to do it at all." (15:52)
As Meta navigates this new landscape, the industry and its users keenly observe the outcomes, anticipating further shifts in policy and regulation that will shape the future of digital communication.