Podcast Summary
Podcast: Trust Me: Cults, Extreme Belief, and Manipulation
Episode: Renee DiResta – Part 2: Pseudo-Events, More on Disinformation, and Telling Fact from Fiction
Date: December 10, 2025
Hosts: Lola Blanc & Megan Elizabeth
Guest: Renee DiResta, professor and author of Invisible Rulers: The People Who Turn Lies into Reality
Episode Overview
This episode, a continuation of last week's conversation, welcomes back Renee DiResta for an in-depth exploration of how manufactured controversies, pseudo-events, and viral misinformation shape our understanding of reality. The conversation digs into the mechanics of social media disinformation, why fact and fiction are so hard to separate, and what state actors, politicians, and platforms stand to gain from social division—supplemented by insights from DiResta’s latest research and book. The tone is candid, darkly humorous, and incisive, with a palpable urgency about the manipulation and isolation fostered by algorithms and online echo chambers.
Key Topics & Discussion Points
1. Social Media Participation: The 90-9-1 Rule
(Starts at 13:04)
- Renee DiResta explains:
- 90% of users are lurkers who don’t post.
- 9% contribute occasionally.
- 1% generates the majority of content: "That is what you are seeing...a very tiny percentage of people who are actually out there as content creators." [13:11, Renee DiResta]
- Algorithmic amplification:
- The top 1% rises because “what they're posting is extreme and emotionally charged. And things that are emotionally charged spread faster.” [14:03, Lola Blanc]
- Platform manipulation (e.g. TikTok’s “heating button,” LinkedIn influencer invites).
2. Manufactured Outrage and Pseudo-Events
(Deep dive starts at 17:37)
- Pseudo-events defined:
- Coined by Daniel Boorstin – events created solely for media attention (“the hype generates the newsworthiness itself”). [18:06, Renee DiResta]
- Examples: Ribbon-cuttings, “Kim Kardashian breaking the Internet,” artificially extended news cycles.
- On social: "So much of what you pay attention to on social media is complete bullshit." [18:59, Renee DiResta]
- Viral outrage cycles:
- Example: “Bean Dad”—a mundane anecdote that escalated into a spiral of online outrage.
- Example: Woman posts about coffee with her husband, Internet attacks her for “privilege.”
- "You’d open the app and be like, what are we mad about today?" [21:44, Renee DiResta]
- Real-world consequences:
- Small incidents go viral, leading to doxxing or harassment, even though they don’t merit the global attention they receive.
3. Decontextualized and Fake Media: Fact vs. Fiction
(Segment starts at 24:34)
- Viral misinformation:
- Videos and photos frequently go viral with misleading or fabricated captions.
- Caution from Lola: “If you’re looking at a video and there is text...do not assume that the text is accurate.” [24:34, Lola Blanc]
- Old footage recirculated:
- Case study: Video claimed to show ICE raids, actually an old custody dispute reframed for viral impact.
- "Real and true are not the same thing...Real is: did something happen? True is: is it used in the right context?" [27:02, Renee DiResta]
- Motivations:
- Some intentionally “rage-bait” for clicks.
- Many are duped into spreading misinformation sincerely, thinking they’re helping.
4. Who Benefits? State, Politicians, and Platforms
(Starts around 29:10)
- State actors:
- Russia and others exploit pre-existing cultural fissures:
- “[They’re] not creating social divisions out of whole cloth. They’re using stories to exploit divisions that are already there.” [29:58, Renee DiResta]
- Domestic amplification:
- Politicians latch onto viral rumors for political gain.
- Example: 2024 “eating pets” rumor in Springfield, Ohio—how a racist Facebook rumor escalated when picked up by political candidates, resulting in bomb threats and national media fixation.
- "We used to see our political leaders being the firebreak...instead, Vance does the opposite." [34:31, Renee DiResta]
5. How Do We Navigate Disinformation?
(Main solutions segment at 35:56)
- Lack of clear answers:
- “There was not really an easy answer there at the time.” [36:12, Renee DiResta]
- Decentralized cults (e.g. QAnon) are especially hard to counter.
- Storytelling & trust:
- The key is building alternative narratives of meaning and trust, not just “facts.”
- Positive role models: Dr. Mike, Hank Green, “Your Local Epidemiologist”—using story and personality to deliver reliable public health info.
- “People are not looking for facts. They’re looking for someone to help them understand, someone they feel they can trust.” [39:58, Renee DiResta]
6. The Limits of Platform & Regulatory Solutions
(Segment at 44:19)
- Current regulatory issues (US):
- Platforms curate content as they see fit; regulation is minimal and polarized.
- Lack of interoperability keeps users locked in and resistant to leaving toxic spaces.
- “Transparency laws here are really abysmal.” [48:04, Renee DiResta]
- Influencer disclosure, especially for political content, is inadequate.
- European context:
- Europe mandates some transparency and a “right to appeal” for content removals.
- Greater push for data portability and interoperability.
Notable Quotes & Memorable Moments
- “So much of what you pay attention to on social media is complete bullshit.” [18:59, Renee DiResta]
- “If you’re looking at a video and there is text written on it about what is happening, do not assume that the text is accurate for what is happening.” [24:34, Lola Blanc]
- “Real and true are not the same thing. Real is: did a machine make it, or is it authentic? True is: is it used in the correct context?” [27:02, Renee DiResta]
- “We used to see our political leaders being the sort of firebreak...Now...they're the ones amplifying the rumor.” [34:31, Renee DiResta]
- “People are not looking for facts. They’re looking for someone to help them understand—someone they feel they can trust.” [39:58, Renee DiResta]
Recommendations & Final Thoughts
- On resisting manipulation:
- Question emotionally manipulative content—especially if it triggers a strong response.
- Diversify sources; don’t rely on a single “influencer” or algorithm.
- On positive change:
- There’s hope in influencer communicators who use storytelling and empathy to bridge knowledge gaps.
- On structural reform:
- Push for transparency, interoperability, and real accountability for platforms and influencers.
- On personal vigilance:
- Lola: “If it makes me feel strong feelings, the first thing I do is look it up and make sure…so many times it’s just an honest mistake.” [52:27]
Timestamps for Key Segments
- Intro & Overview: 01:40
- The 90-9-1 Rule: 13:04
- Pseudo-Events & Manufactured Outrage: 17:37
- Viral Videos & Decontextualization: 24:34
- State & Domestic Actors Fueling Division: 29:10
- Case Study – 2024 Eating Pets Rumor: 30:54
- Navigating a Disinfo World: 35:56
- Solutions—Influencers, Storytelling, and Regulation: 39:54, 44:19
- Final Reflections: 48:58
Closing Sentiment
The episode closes with mutual admiration between the hosts and guest, a sense of urgency about the stakes (“more good efforts, more laws”—but also grassroots media literacy), and a resigned but determined mood about the challenges ahead:
"Remember to follow your gut. Watch out for red flags. And never ever trust me." [53:17]
This summary preserves the candid and forthright tone of the conversation, collates major insights, and serves as a practical guide for understanding how and why disinformation spreads—and what we can do about it.
