Podcast Summary: Ep. 225: Debating Social Media Content Moderation
So to Speak: The Free Speech Podcast
Host: Nico Perrino
Guests: Renée DiResta and Jonathan Rauch
Release Date: September 26, 2024
Introduction
In Episode 225 of So to Speak: The Free Speech Podcast, host Nico Perrino engages Renée DiResta and Jonathan Rauch in a wide-ranging discussion of the contentious issue of social media content moderation. The conversation examines the ethical imperatives, practical challenges, and broader societal implications of moderating content on digital platforms.
Social Media Content Moderation: Perspectives
Jonathan Rauch's Framework
Jonathan Rauch initiates the conversation by outlining his perspective on content moderation. He emphasizes the multifaceted nature of social media companies, describing them as platforms, corporations, communities, and publishers simultaneously. Rauch argues that while free speech values are paramount, content moderation is an inevitable responsibility for these platforms.
Jonathan Rauch [03:20]: "Social media companies are a hybrid of four different kinds of institutions... they are publishers. Publishers aggregate content, but then curate it in order to assemble audiences to sell to advertisers."
Rauch contends that moderation should be seen as a mechanism to maintain community standards and advertiser relationships, rather than purely a tool of censorship.
Renée DiResta's Views
Renée DiResta concurs on the necessity of content moderation but offers a more granular perspective. She distinguishes among the enforcement actions available to platforms (removal, reduction, and informing, such as labeling) and criticizes the simplistic labeling of all moderation as censorship.
Renée DiResta [06:31]: "Content moderation is an umbrella term for a vast array of topical areas that platforms choose to engage around... When they moderate, as John alludes to, they have three mechanisms they can use for enforcement."
DiResta argues for a clearer definition and understanding of harm, advocating moderation practices that aim to mitigate genuine public harm rather than indiscriminately suppress diverse opinions.
Election Integrity Partnership
Overview
Renée DiResta elaborates on the Election Integrity Partnership, a research initiative aimed at identifying and addressing election-related misinformation. The project involved students and analysts submitting content deemed violative of platform policies for moderation.
Renée DiResta [12:24]: "We sort of coded, basically, here's the policy. Here are the platforms that have implemented this policy..."
Results and Findings
Upon reviewing the submitted content, DiResta reveals that a significant majority of the flagged URLs were never actioned by the platforms.
Renée DiResta [16:11]: "After the election was over... 65% of the URLs... had not been actioned at all."
This outcome underscores inconsistencies in policy enforcement and raises questions about the efficacy and impartiality of content moderation practices.
The Deplatforming of Donald Trump
Context and Debate
The discussion shifts to the controversial deplatforming of former President Donald Trump following the events of January 6th. The conversation highlights the lack of consensus within the free speech community regarding whether Trump's speech constituted incitement under the First Amendment.
Nico Perrino [37:33]: "The question as to whether Donald Trump's speech on the Ellipse that day met the standard for incitement under the First Amendment is like the hottest debate topic."
DiResta points out the difficulty platforms faced in applying their policies consistently to high-profile figures.
Renée DiResta [38:42]: "This was one of the areas where there was... a very significant concern that there would be continued political violence."
Implications
The deplatforming case exemplifies the complexities and potential double standards in content moderation, especially when dealing with influential public figures.
Government Pressure and Transparency
Government Influence
The episode addresses instances where government entities allegedly pressured social media companies to censor specific content, notably surrounding COVID-19 information.
Nico Perrino [56:07]: "I just want to read Mark Zuckerberg's Aug. 26 letter to Jim Jordan... expressed a lot of frustration with our teams when we didn't agree."
DiResta clarifies the nature of projects like the Election Integrity Partnership, emphasizing the academic and independent stance of their work despite some government involvement through ticket submissions.
Renée DiResta [16:24]: "There is nothing in the ticketing in which a government agency sends something to us and says, you need to tell the platforms to take this down."
Transparency Issues
Both guests advocate for greater transparency in how content moderation decisions are made, suggesting that public registries of moderation requests could enhance trust.
Jonathan Rauch [50:47]: "Maybe for now, the best crutch is going to be private outside groups in universities and nonprofits that do their best to look at what's going up on social media sites..."
Impact on Public Trust and Vaccine Hesitancy
Public Perception
Perrino raises concerns that content moderation during the COVID-19 pandemic may have eroded public trust in institutions like the CDC.
Nico Perrino [33:09]: "I suspect that the actions taken by social media companies during the COVID era eroded trust in the CDC and other institutions."
DiResta counters that while platforms have policies against misinformation, actual removal rates were low, suggesting that the narrative of pervasive censorship may not be entirely accurate.
Renée DiResta [48:42]: "Most of the time, they didn't actually do anything."
Vaccine Hesitancy
The discussion touches on whether content moderation efforts inadvertently fueled vaccine hesitancy by creating a "forbidden fruit" effect.
Nico Perrino [33:09]: "...erodes trust... would have been much better to let the debate happen without social media companies placing their thumb on the scale."
DiResta acknowledges the complexity but maintains that responsible moderation aligned with public health needs is ethically justified.
Policy Recommendations and Future Directions
Enhancing Transparency
Both guests agree that increasing transparency in content moderation processes is crucial. DiResta suggests publicly accessible records of content moderation requests to prevent perceptions of bias or hidden agendas.
Jonathan Rauch [61:24]: "...anyone can go and look at the briefings... it's about transparency."
Legislative Solutions
Rauch and DiResta advocate statutory solutions to regulate government influence on social media moderation, ensuring formal and transparent channels for any interactions between government entities and platforms.
Jonathan Rauch [63:11]: "This decision should be made in Congress, not the courts."
Discussion on Terminology: Misinformation vs. Alternatives
Critique of "Misinformation"
Renée DiResta expresses disdain for the term "misinformation," arguing that it reduces every dispute to a question of factual accuracy while ignoring intent, which introduces ambiguity and subjectivity into scholarly research.
Renée DiResta [65:03]: "It is a garbage term because it turns something into a debate about a problem of fact."
Preferred Terms
DiResta advocates more precise terminology, such as "rumors" and "propaganda," to accurately describe the nature and intent behind certain types of content.
Renée DiResta [67:22]: "We have had terms to describe unofficial information with a high degree of ambiguity... That’s a rumor, propaganda..."
Rauch, however, maintains that the term "misinformation" is still valuable for distinguishing between true and false information, essential for fostering objective knowledge.
Jonathan Rauch [68:47]: "I will be happy if we can get the general conversation about this just to the point of people understanding this is a wicked hard problem."
Conclusion
Episode 225 of So to Speak: The Free Speech Podcast offers a comprehensive exploration of the multifaceted challenges surrounding social media content moderation. Through insightful dialogue, host Nico Perrino and guests Renée DiResta and Jonathan Rauch illuminate the delicate balance between upholding free speech principles and ensuring community safety and integrity. The conversation underscores the need for nuanced policies, enhanced transparency, and ongoing discourse to navigate the evolving landscape of digital expression.
Notable Quotes:
- Jonathan Rauch [03:20]: "Social media companies are a hybrid of four different kinds of institutions... they are publishers."
- Renée DiResta [06:31]: "Content moderation is an umbrella term for a vast array of topical areas that platforms choose to engage around."
- Nico Perrino [37:33]: "The question as to whether Donald Trump's speech on the Ellipse that day met the standard for incitement under the First Amendment is like the hottest debate topic."
- Renée DiResta [16:11]: "After the election was over... 65% of the URLs... had not been actioned at all."
- Jonathan Rauch [63:11]: "This decision should be made in Congress, not the courts."
For more insights and detailed discussions, subscribe to So to Speak: The Free Speech Podcast on your preferred platform.
