Trust Me: Cults, Extreme Belief, and Manipulation
Episode: Renee DiResta - Part 1: Propaganda, Disinformation, and Bespoke Realities
Release Date: December 3, 2025
Hosts: Lola Blanc & Meagan Elizabeth
Guest: Renee DiResta, Professor, Researcher, Author of Invisible Rulers: The People Who Turn Lies into Reality
Episode Overview
This episode of Trust Me delves into how propaganda, disinformation, and algorithm-driven social media dynamics shape what Renee DiResta calls our “bespoke realities.” DiResta, a leading researcher on information networks and manipulation, shares her personal journey into the world of anti-vaccine content, the rise of online conspiracy theories, and the systemic mechanisms by which falsehoods proliferate. Hosts Lola and Meagan guide a conversation packed with clear explanations, vivid examples, and practical insights for understanding how propaganda operates in modern society, at both institutional and everyday levels.
Key Discussion Points & Insights
Renee DiResta’s Path Into Disinformation Research
- Personal Entry Point: Renee’s research was sparked as a new mother in 2013, when social media platforms began algorithmically recommending “crunchy mom” content, and eventually anti-vaccine content, simply because she had searched for things like homemade baby food and cloth diapering.
- “It started pushing me a lot of the stuff around, you know, make your own baby food, do cloth diapering. And then I did both of those things and so it was like, naturally you must be an anti vaxxer. It started pushing me the anti vaccine groups too. And I am not.” — Renee DiResta [13:33]
- She investigated further, joining groups with a clean account to map the pathway leading from lifestyle content to more radical conspiracy communities (e.g., Pizzagate, QAnon).
- Her background in tech and data science enabled her to analyze the structural and algorithmic factors behind these social dynamics.
The History & Nature of Propaganda
- Rumor Mill vs. Propaganda: Before the internet, information spread via informal rumor mills and top-down propaganda campaigns managed by figures of power (states, religious institutions, media owners).
- Propaganda's Origin: The term comes from the Catholic Church’s post-Reformation efforts to “propagate the faith.”
- “Propaganda is ... information with an agenda that benefits the person who is doing that propagation… there's always some sort of agenda in persuasive communication.” — Renee DiResta [17:32]
- Modern Evolution: The ability to reach millions is no longer reserved for elites; social media lets anyone become an influential node, collapsing the distinction between top-down and grassroots communication.
“Manufacturing Consent” & Media Filters
- DiResta explains Manufacturing Consent (Chomsky & Herman) and how pre-internet media was shaped by financial and political incentives, often leading to filtered coverage that favored powerful interests.
- Today, incentives are set not just by media owners, but also by social media companies' curation algorithms and the self-interested actions of influencers.
The Two-Step Flow Model of Influence
- Classic Finding: People’s beliefs are deeply shaped by trusted peers (“opinion leaders”), not just by what’s broadcast in the media, a result from Lazarsfeld and Katz’s mid-century communication research.
- “There were a handful of women in the community who were really, really plugged into what was happening on the media. And then they would… talk with their friends about it.” — Renee DiResta [25:42]
- Charisma, relatability, and embeddedness in the community determine influence—qualities now mirrored and amplified by online influencers.
- Parasocial Dynamics: On social media, influencers simulate friendship, increasing their ability to shape followers' worldview.
Bespoke Realities & Algorithmic Amplification
- Definition: “Bespoke realities” are highly individualized information universes, shaped by our choices and algorithms. Platforms reinforce these by pushing more of what engages us, regardless of truth or extremity.
- “You can really choose to self-select into certain media and topical universes where this is reality for you… and it's algorithmically reinforced.” — Renee DiResta [30:53]
- Often this process is unconscious: merely pausing over certain content, or clicking once out of curiosity, shifts what users are shown in the future (see the sketch after this list).
- Serendipitous-seeming coincidences (e.g., being served exactly the right content “at the right time”) are constructed by data mining and prediction, not mystical insight.
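To make the feedback loop concrete, here is a minimal Python sketch of an engagement-weighted feed. It is illustrative only: the signal names, weights, and post fields are invented for this sketch, and it is not any real platform’s ranking system.

```python
# Illustrative only: a toy model of engagement-weighted feed ranking,
# not any real platform's algorithm. Signal names, weights, and post
# fields are all invented for this sketch.
from collections import defaultdict

# Per-topic affinity scores for one user, updated on every interaction.
affinity = defaultdict(float)

# Dwell time counts even without a click, so merely hesitating over a
# post nudges the profile toward that topic.
SIGNAL_WEIGHTS = {"click": 1.0, "share": 2.0, "dwell_seconds": 0.05}

def record_interaction(topic: str, signal: str, amount: float = 1.0) -> None:
    affinity[topic] += SIGNAL_WEIGHTS[signal] * amount

def rank_feed(candidates: list[dict]) -> list[dict]:
    # Posts on topics the user already engaged with rank higher, so each
    # session narrows the next one: the "bespoke reality" feedback loop.
    return sorted(candidates, key=lambda p: affinity[p["topic"]], reverse=True)

# A single 40-second hesitation is enough to reorder the feed.
record_interaction("diy_baby_food", "dwell_seconds", 40)
posts = [{"id": 1, "topic": "local_news"}, {"id": 2, "topic": "diy_baby_food"}]
print([p["id"] for p in rank_feed(posts)])  # -> [2, 1]
```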
Why Misinformation Spreads—And Why “Misinformation” Is the Wrong Word
- DiResta dislikes the term “misinformation,” arguing it implies a correctable fact error, when in reality these belief clusters are rooted in deep distrust.
- “It generally is like somebody who is wrong about something, like they get a fact wrong, but misinformation implies that if you just gave them a better fact, it would change their mind… It's a real trust issue. Right? It's very much an issue of trust.” — Renee DiResta [37:10]
- She prefers “propaganda,” which highlights the agenda and motivation behind crafted narratives.
- "Disinformation" is reserved for intentional state or organizational campaigns aimed at strategic disruption—e.g., Russia’s information warfare.
The Emotional & Identity Hooks of Viral Content
- Content that triggers intense emotion (especially anger/indignation or exclusivity of “secret knowledge”) is algorithmically favored.
- Social platforms amplify “extremophiles” (the people with the most polarizing takes), leading to perceptions that extreme opinions are majority views, the “majority illusion” (illustrated in the sketch after this list).
- “You start to see these interesting phenomenon where people decide what is true ... based on what surrounds them. That's what starts to seem normal.” — Renee DiResta [42:27]
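The “majority illusion” is a formal network effect (Lerman, Yan & Wu, 2016): when a view is concentrated among the best-connected people, most members of the network see it overrepresented among their own contacts. The toy Python sketch below, with an invented seven-person friendship graph, is not from the episode but illustrates the effect.

```python
# Illustrative only: a toy demonstration of the "majority illusion"
# (Lerman, Yan & Wu, 2016). The seven-person friendship graph and
# who holds the extreme view are invented for this sketch.

# Undirected friendship graph: person -> set of friends.
graph = {
    "A": {"C", "D", "E", "F"},
    "B": {"C", "D", "E", "G"},
    "C": {"A", "B"},
    "D": {"A", "B"},
    "E": {"A", "B"},
    "F": {"A"},
    "G": {"B"},
}

# Only 2 of 7 people hold the extreme view, but they are the
# best-connected nodes ("extremophiles" in DiResta's framing).
holds_extreme_view = {"A", "B"}

for person, friends in sorted(graph.items()):
    share = sum(f in holds_extreme_view for f in friends) / len(friends)
    print(f"{person}: {share:.0%} of friends hold the extreme view")

# C, D, E, F, and G each see 100% of their friends holding the view,
# even though the true population share is 2/7 (about 29%). A loud,
# well-connected minority looks like a majority from almost everywhere.
```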
The Vicious Incentives of Attention Economies
- The most polarizing or emotionally charged content receives the highest engagement, creating a feedback loop for creators: to be seen, you must provoke.
- Even those aware of the system’s pitfalls feel forced to participate (“If you don’t do it, someone else will”).
- “If you don't do it, someone else will. And that is the issue of... the attention game.” — Renee DiResta [51:26]
The Limits of Individual Agency & Systemic Challenges
- Even with understanding, resisting the pull of these dynamics is tough. The onus should be on systemic accountability (regulation of algorithms, etc.), not just individual behavior.
- Experimentation with more user-controlled feeds (as on Bluesky) could help people reclaim control from opaque, engagement-maximizing ranking; the sketch below reduces that design idea to its core.
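As a rough sketch of that design idea, and not Bluesky’s actual API, the toy code below shows a feed where the user explicitly picks the ranking function instead of inheriting an engagement-optimized one. The ranker names and post fields are invented.

```python
# Illustrative only: the design idea behind user-controlled feeds,
# reduced to a toy. This is not Bluesky's actual API; the ranker
# names and post fields are invented.
from typing import Callable

Post = dict
RankFn = Callable[[list[Post]], list[Post]]

# The platform exposes ranking functions; the user, not an opaque
# engagement model, chooses which one orders their feed.
RANKERS: dict[str, RankFn] = {
    "chronological": lambda posts: sorted(posts, key=lambda p: p["ts"], reverse=True),
    "friends_only": lambda posts: [p for p in posts if p["from_friend"]],
}

def build_feed(posts: list[Post], choice: str) -> list[Post]:
    return RANKERS[choice](posts)

posts = [
    {"id": 1, "ts": 10, "from_friend": False},
    {"id": 2, "ts": 20, "from_friend": True},
]
print([p["id"] for p in build_feed(posts, "chronological")])  # -> [2, 1]
print([p["id"] for p in build_feed(posts, "friends_only")])   # -> [2]
```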
Notable Quotes & Memorable Moments
- On Algorithmic Radicalization: “I called it... inadvertent algorithmic radicalization... here's why you think you're joining a crunchy mom cloth diapering group, and then, like, Bam, here's the QAnon stuff that shows up in your feed.” — Renee DiResta [13:33]
- On Propaganda’s Modern Evolution: “In the age of the Internet, these two things happen in the same place at the same time. That power to spread messages... is something that we actually all have now.” — Renee DiResta [19:40]
- On Emotional Virality: “Morally righteous indignation is the tone that it keys off of. And people will respond to that, unfortunately, because they... feel like they are also in the fight.” — Renee DiResta [51:58]
- On the Cost of Self-Expression: “You see the moderates, the people who... don't hold extreme beliefs actually become more and more silent. They choose to self censor rather than to speak up.” — Renee DiResta [45:57]
- On “Serendipitous” Algorithms: “People also don't necessarily realize just how much data mining is going into what they think is serendipitous and personalized for them.” — Renee DiResta [35:31]
- On Systemic vs. Individual Responsibility: “While it’s important and good for us to all try to do what we can on an individual level, ultimately, like, these things are happening on a systemic level that corporations are spearheading to try to make more money ... Let's hold people accountable, lobby our politicians to actually regulate that shit.” — Host Lola Blanc [06:11]
Timestamps for Important Segments
- Personal Origin Story & Algorithmic Radicalization: [13:33]–[16:54]
- History of Propaganda & Rumor Mills: [17:32]–[19:40]
- Manufacturing Consent & Media Filters: [20:49]–[22:42]
- Two-Step Flow & Influence: [25:42]–[30:34]
- Bespoke Realities & Algorithmic Reinforcement: [30:53]–[34:28]
- On “Serendipitous” Feeds & Data Mining: [35:01]–[36:27]
- Why “Misinformation” is the Wrong Term: [37:10]–[39:49]
- Majority Illusion & Extreme Content: [42:27]–[46:13]
- Incentives Driving Polarization: [51:26]–[52:44]
- Systemic Solutions & User Agency: [54:44]–[54:46]
Episode Vibe & Tone
This episode is smart, compassionate, and darkly funny, rooted in the lived experience of two hosts who’ve escaped cultic influence and an expert who brings both personal experience and technical rigor. The conversation is urgent yet practical, inviting listeners to reflect on their own consumption habits and the larger systems shaping their reality.
For Next Week (Part 2 Teaser)
- The conversation continues into pseudo-events, nonsense controversy, the challenge of discerning truth from fiction, and the rapidly degrading landscape of internet “reality.”
For listeners who want to understand the modern information battleground—how belief, identity, algorithms, and propaganda interlace in our online lives—this episode is essential, accessible, and galvanizing.
