Truth in the Barrel
Episode: Devil's Cut | Adventures In Misinformation w/ Renee DiResta
Hosts: Amy McGrath & Denver Riggleman
Guest: Renee DiResta, Associate Research Professor at Georgetown University, Author of Invisible Rulers: The People Who Turn Lies Into Reality
Release Date: October 21, 2025
Episode Overview
This episode dives into the modern landscape of misinformation and propaganda, how it has evolved with the rise of social media and AI, and the profound effects this has on public discourse and democracy. Amy McGrath interviews Renee DiResta—one of the world's leading experts on networked propaganda—on foreign interference, influencer ecosystems, the rise of AI-powered deepfakes, and practical steps individuals can take to resist manipulation. Key segments highlight Russia and China’s tactics, the nature of algorithm-driven division, and the urgent question of what ordinary people—as well as tech platforms—can do to combat the chaos.
Key Discussion Points & Insights
1. The Fracturing of the Information Ecosystem
[01:25–05:22]
- Amy McGrath: Sets the context—contrasts the old centralized news era (e.g., Walter Cronkite, Tom Brokaw) with today’s algorithmic, influencer-driven environment.
- Renee DiResta:
- Describes the shift from a "mass public" to fractured “niche publics,” each engaging with different creators and content streams.
- “The mass public has really fractured. ...You have different groups of people on different platforms paying attention to different creators.” [03:29]
- Draws on Chomsky’s Manufacturing Consent to explain how even legacy media curated information to suit advertisers and audiences.
- Uses the metaphor "carnival of mirrors" to characterize today’s media: people see distorted versions of themselves and their beliefs reflected back, often enhanced for engagement rather than truth.
2. Foreign State Actors: Russia and China
[06:30–15:34]
- Platform Tactics:
- Russian state actors have long targeted “niche” American communities—including Black activist circles, Texas secessionists, and various age cohorts within conservatism—by impersonating community members.
- “Russian trolls wanted to run a strategy of division. ...They pretended to be members of these disparate groups.” [08:15]
- Used authentic-sounding personas (“as veterans, we ...”; “as Muslims, we...”) to amplify grievances and pit groups against each other.
- Amplification:
- Russian-created content was frequently retweeted or embedded by real influencers and even mainstream outlets (e.g., CNN unintentionally used Russian troll content as “vox populi”).
- “That was happening on the left too ...it’s that CNN ...embedded their tweets in articles as like vox populi ...and it turned out to be a Russian troll.” [11:00]
- Shift to China:
- After Russian accounts were purged (post-2017), China increased activity, but their approach was more “broadcast era,” focusing on shifting perspectives about Chinese issues (e.g., treatment of Uyghurs), rather than infiltrating U.S. niches as effectively as Russia.
- “China ... still much more [is] putting out a style of propaganda that is much more of a broadcast era style.” [13:22]
3. Domestic Influencers, Profit, and Division
[16:03–19:39]
- Renee DiResta:
- Stresses that while foreign actors stoke division, homegrown influencers are responsible for most of the spread of polarizing content, often for profit or clout.
- “The identity based model of ‘all Republicans are liars, all Democrats are liars’ ... [is] much more persuasive than any Russian troll.” [17:45]
- Outlines the “nut picking” phenomenon: large accounts spotlight obscure, outrageous examples from the opposite political camp to galvanize their audience.
- Algorithms and financial incentives now favor the amplification of this divisive content.
4. What Can Individuals Do?
[19:39–26:58]
- Practical Resistance:
- Renee recommends that individuals recognize when they’re being manipulated for engagement and profit.
- “There has to be a bit of a social shift against that, a norm shift ... be muting those accounts rather than engaging with them.” [20:18]
- Muting or unfollowing divisive accounts hurts their reach and revenue—potentially shifting incentives on a larger scale.
- Platform Tools & Middleware:
- Platforms (e.g., Instagram, Bluesky) are starting to experiment with user controls that let people filter out certain content or tailor their feeds.
- Middleware: Third-party apps/extensions (e.g., Block Party) that empower users to mass-mute/curate/block toxic content or adjust their experience, especially valuable during targeted harassment.
- Acknowledges that these tools can feel overwhelming to everyday users—middleware needs to become much more user-friendly and widespread.
5. The Challenge of Deepfakes and AI-Generated Content
[29:07–36:49]
- AI tools are making it almost effortless to generate convincing fake images, videos, and personas.
- Deepfakes are problematic for both public figures and ordinary people, weaponized for parody, harassment, and misinformation.
- Labeling and Detection:
- Major platforms (Meta, Google) considering or implementing automated labels for AI-generated content.
- Complications: Unintended “label fatigue,” the blurry line between benign edits and malicious manipulation, parody vs. impersonation.
- Open-source tools make detection and watermarking difficult; vulnerable populations at risk, especially women targeted by AI-fakes in adult content.
- Notable quote: “Is it useful to label something that is not really deceptive? ...You don’t want people to get label fatigue either where they just start scrolling past.” [35:30]
6. Motivation, Emotional Toll, and the Importance of the Work
[36:49–40:56]
- Facing Harassment:
- Renee opens up about the challenges, lawsuits, and coordinated harassment she has faced.
- “You don't get targeted if you're not doing good work. … Let them be mad.” [37:50]
- Emphasizes the crucial need for strong research and public transparency, especially around elections and the normalization of lies.
- Why Truth Matters:
- Fact-checking and labeling aren’t censorship—they are essential for informed democracy and collective action.
- “We cannot have a democracy if we can’t agree on the most basic facts. ... The democracies depend on peaceful transfer of power.” [40:10]
Notable Quotes & Memorable Moments
- “The information ... has moved much more into this opinion-based, belief-based content ecosystem, where facts are maybe secondary to what is entertaining or what is ... likely to be receptive to that niche.” – Renee DiResta [05:56]
- “They (Russian trolls) pretended to be members of these disparate groups ... and then pit these identities against each other.” – Renee DiResta [09:45]
- “As that narrative took hold, ironically, domestically, we have made it much easier for these foreign actors to operate now.” – Renee DiResta [14:42]
- “We are already doing this to ourselves. ... The influencer meanwhile profits ... the net effect is ... algorithms will reward it.” – Renee DiResta [17:13]
- “Muting those accounts rather than engaging with them ... takes away their ability to earn money. ... It doesn’t solve the problem ... but it gives people some capacity in the moment.” – Renee DiResta [20:24]
- “With public figures in particular, how to treat [deepfakes] is a real open question—because there are free speech rules ... but generally it’s not assumed you have the capacity to wholly impersonate them.” – Renee DiResta [31:29]
- “We cannot have a democracy if we can't agree on the most basic facts.” – Renee DiResta [40:15]
Timestamps for Key Segments
- Old Media vs. New Media Ecosystem — [01:25–05:22]
- How Russia & China Operate Online — [06:30–15:34]
- Domestic Influencers and Division — [16:03–19:39]
- What Individuals Can Do — [19:39–26:58]
- AI, Deepfakes & Moderation — [29:07–36:49]
- Renee’s Motivation & The Toll of Truth-Telling — [36:49–40:56]
Conclusion
This episode unpacks the evolution of our fragmented information ecosystem, drawing a through-line from legacy media to today’s algorithmic “carnival of mirrors.” Through incisive examples, Renee DiResta highlights the intertwined roles of foreign and domestic actors in spreading misinformation, the growing threat posed by AI-generated fakes, and the tension between free speech and truthful discourse online. Listeners learn both the scale of the problem and practical steps they can take—as well as the fundamental importance of committing to shared facts for the survival of American democracy.
Recommended reading: Renee DiResta’s Invisible Rulers: The People Who Turn Lies Into Reality
This summary omits non-content sections (ads, intros, outros) to focus on the substance of the conversation.
