Podcast Summary
Sean Carroll’s Mindscape Podcast
Episode 333 | Gordon Pennycook on Unthinkingness, Conspiracies, and What to Do About Them
Date: October 27, 2025
Host: Sean Carroll
Guest: Dr. Gordon Pennycook, Experimental Psychologist
Main Theme
This episode explores why people hold false beliefs, fall for misinformation, accept pseudo-profound “bullshit,” and become susceptible to conspiracy theories. Dr. Gordon Pennycook shares research suggesting the problem is less about motivated reasoning (clinging to beliefs for identity or tribe) and more about “unthinkingness”: cognitive laziness, a tendency not to critically evaluate information. The conversation expands to the roles of intuition vs. analytic thinking, the pitfalls of our information environment, and the surprisingly optimistic potential of AI-powered chatbots to counteract conspiracy beliefs.
Key Discussion Points & Insights
1. What Is "Pseudo-Profound Bullshit"?
- The Ig Nobel Prize Research
- Pennycook was awarded an Ig Nobel for his work studying why people find superficially profound statements (often assembled from jumbled buzzwords, e.g. “Hidden meaning transforms unparalleled abstract beauty”) to be meaningful or deep.
- The study operationalized “pseudo-profound bullshit” by using both randomly generated statements and actual tweets from figures like Deepak Chopra, finding that the same people rate both as profound.
- [06:59 | Notable Quote]
Pennycook: “We took sentences like that, just random sentences. But we also took actual tweets from Deepak Chopra. … They sound pretty similar, obviously. And they are psychologically exactly the same. … The key part of the paper … was how do you measure one’s receptivity to this pseudo-profound form of bullshit?”
- Definition Distinction
- Bullshit is not simply falsehood but an indifference to truth, per philosopher Harry Frankfurt—its goal is not to convey information, but to elicit a particular reaction or image.
2. The Psychological Basis: Unthinkingness, Not Just Bias
- System 1 vs. System 2 Thinking
- Pennycook advocates for understanding two cognitive modes:
- System 1: Intuitive, fast, automatic
- System 2: Analytical, slow, effortful
- Most people, most of the time, rely on intuition—sometimes beneficial, but risky for complex or unfamiliar claims.
- [13:24]
Carroll: “It seems to me ... that most of our thinking is sort of subconscious, intuitive, quick System 1 stuff. And there’s only … a little bit of Super Effortful System 2 guidance at the top.”
Pennycook: “Exactly... Evolutionarily that makes sense ... But people vary in how much they do it. … There are cases in which you can overthink also. And so knowing when to expend effort is really the kind of the trick to making better choices.”
- Variation Between Individuals
- Everyone relies on intuition. Some are generally more reflective and analytical, and others more intuition-driven. Context and domain expertise play a role.
- Even those who see themselves as “rational” have domains of unreflectiveness (e.g. sports fandom).
- Testing Susceptibility/Reflectiveness
- Pennycook’s studies use tasks designed to elicit intuitive but incorrect answers (e.g., “If you pass the person in second place in a race, what place are you in?” Correct answer: second, not first; a scoring sketch follows this list).
- Repeatedly, people prone to intuitive (unreflective) answering are also more susceptible to pseudo-profound statements, conspiracy theories, and distress in ambiguous situations.
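Such items are straightforward to score. Below is a minimal, illustrative sketch of how intuitive versus reflective responding might be coded; the item format is hypothetical, not Pennycook’s actual test battery:

```python
# Illustrative scoring of a cognitive-reflection-style item (hypothetical
# format; not Pennycook's actual materials). Each item pairs the correct
# answer with the tempting intuitive "lure".

ITEMS = [
    ("If you pass the person in second place in a race, what place are you in?",
     "second",   # correct (reflective) answer
     "first"),   # intuitive but wrong lure
]

def score(responses):
    """Count reflective (correct) and intuitive (lure-matching) answers."""
    tally = {"reflective": 0, "intuitive": 0}
    for (_question, answer, lure), response in zip(ITEMS, responses):
        r = response.strip().lower()
        if r == answer:
            tally["reflective"] += 1
        elif r == lure:
            tally["intuitive"] += 1
    return tally

print(score(["first"]))  # -> {'reflective': 0, 'intuitive': 1}
```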
3. Is Motivated Reasoning Overrated?
- Reconsidering the Dominant Narrative
- While popular explanations for belief in misinformation and conspiracies emphasize identity and bias, Pennycook argues that most susceptibility reflects “unmotivated reasoning”: people simply don’t invest the effort to reflect, even on beliefs that aren’t central to their group identity.
- [32:05 | Notable Quote]
Pennycook: “It is almost a truism … that the reason that we fall prey to political or otherwise, like, falsehoods is sort of because we want to ... But ... the reason why people are susceptible … is because they’re not really thinking that much about what they are engaging with ... That’s ... more about lazy thinking than ... that they’re so motivated that they’re spending all this extra effort convincing themselves that the things that they want to be true are true.”
- [34:04 | Notable Quote]
Carroll: “So in a nutshell, you’re saying the problem is not motivated reasoning. It’s unmotivated reasoning. You’re not motivated to put in the work to reason.”
Pennycook: “Exactly. ... If we’re going to ... take the big pie of people believing things that are on bad evidence and false, a lot of it is just because they haven’t thought about it.”
4. Overconfidence and False Beliefs
- General Overconfidence
- Pennycook’s work finds that people who are overconfident in their abilities (as measured by deliberately ambiguous, fuzzy tasks) are more likely to believe conspiracy theories (one standard way to quantify this is sketched below).
- [41:51 | Notable Quote]
Carroll: “People who are prone to believing in the conspiracy theories are actually more likely to be overconfident in their beliefs generally than people who are more skeptical.”
Pennycook: “That is what we found.”
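The episode doesn’t spell out the scoring, but a standard way to operationalize overconfidence on such tasks is the gap between self-rated confidence and actual accuracy. A minimal sketch under that assumption:

```python
# Sketch: overconfidence as mean self-rated confidence minus actual accuracy
# (a standard operationalization; the details of Pennycook's tasks are assumed).

def overconfidence(confidences, correctness):
    """confidences: per-trial self-ratings in [0, 1]; correctness: 1 if right, else 0."""
    n = len(confidences)
    mean_confidence = sum(confidences) / n
    accuracy = sum(correctness) / n
    return mean_confidence - accuracy  # positive = overconfident

# A participant who feels ~90% sure on average but is right half the time:
print(overconfidence([0.9, 0.95, 0.85, 0.9], [1, 0, 1, 0]))  # ~0.40
```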
- False Consensus Effect
- Conspiracy believers dramatically overestimate how widespread their beliefs are. For instance, only 8% of respondents believed the Sandy Hook “false flag” theory, but believers estimated that 61% of people agreed with them (the gap is computed in the sketch below).
- [49:08 | Notable Quote]
Pennycook: “In almost all cases, people who believe conspiracies think they’re in the majority, even in cases where less than 10% agree ... It’s the biggest false consensus effect I’ve ever seen.”
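Those two percentages make the size of the effect easy to see; the following is purely illustrative arithmetic on the figures quoted in the episode:

```python
# False-consensus gap for the Sandy Hook "false flag" claim, using the
# figures quoted in the episode.

actual_belief = 0.08        # share of all respondents who endorse the claim
estimated_agreement = 0.61  # believers' average estimate of public agreement

gap = estimated_agreement - actual_belief
print(f"False-consensus gap: {gap:.0%}")  # -> False-consensus gap: 53%
```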
5. Political and Cultural Aspects
- Political Asymmetry
- Susceptibility to pseudo-profound bullshit correlates weakly with the political right, even though the “new age” market for such material is usually left-coded.
- The asymmetry is much larger for outright misinformation: across many studies, more misinformation is produced and consumed on the right of the U.S. political spectrum, though this partly reflects information exposure and the media market, not just individual susceptibility.
- [27:44]
Pennycook: “There’s way more misinformation on the right than on the left... That sounds like a political statement. There’s ... dozens of studies that look [at this].”
- Cross-cultural Consistency
- No strong evidence that susceptibility to pseudo-profound bullshit is culturally bound; similar effects are found in diverse samples.
6. Can People Change? Education & Interventions
- Reflectiveness Can Be Contextually Enhanced
- There is not yet much evidence for large, lasting training effects, but nudges work: e.g., reminding people about accuracy before they share content online improves their discernment (measured as sketched after this quote).
- [39:19 | Notable Quote]
Pennycook: “We’ve done experiments where we just remind people about accuracy. … Then they’re better at distinguishing between the true and false stuff when they’re deciding what to share.”
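“Discernment” in these sharing studies is usually defined as the difference between how often people share true versus false headlines. A minimal sketch of that metric, with made-up data:

```python
# Sketch: sharing discernment = share rate for true headlines minus share rate
# for false ones (the usual definition in this literature; the data are invented).

def discernment(shared_true, shared_false):
    """Each argument is a list of 0/1 sharing decisions for that headline type."""
    rate = lambda decisions: sum(decisions) / len(decisions)
    return rate(shared_true) - rate(shared_false)

baseline = discernment([1, 1, 0, 1], [1, 0, 1, 1])  # shares almost everything: 0.0
nudged   = discernment([1, 1, 0, 1], [0, 0, 1, 0])  # after an accuracy prompt: 0.5
print(baseline, nudged)
```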
- Need for Certainty/Intellectual Humility
- People with a higher need for certainty tend to be more susceptible to false beliefs; scientists are differentiated in part by their willingness to say “I don’t know.”
7. Explaining the AI Chatbot Intervention
- The Chatbot Experiment
- Pennycook’s group had participants state their conspiracy beliefs in their own words and then interact with AI chatbots instructed to provide patient, evidence-rich counterpoints (the protocol is sketched below).
- Results: In about eight minutes, 25% of strong conspiracy believers renounced their belief; others lost confidence.
- [52:47 | Notable Moment]
Pennycook: “...after the conversation, which lasts about eight minutes, 25% of them don’t believe it anymore.”
- These effects persisted after 1-2 months; there was no evidence the change decayed.
- There was also a “spillover effect”—people became slightly less likely to believe other conspiracies too.
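The experimental flow described above (state the belief, rate confidence, converse, re-rate) can be made concrete in a short sketch. This is not the study’s actual code: the model call is a stub, the participant’s side is canned, and the published work also fact-checked the model’s output:

```python
# Minimal sketch of the elicit -> converse -> re-rate protocol described in the
# episode. `model_reply` is a placeholder, not a real API; the actual study used
# a large language model whose answers were independently fact-checked.

def model_reply(history):
    # Hypothetical stand-in for a chat-model call.
    return "Here is specific evidence addressing the reasons you gave: ..."

def debunk_session(belief_text, rate_confidence, user_turn, n_turns=3):
    """Record belief confidence, hold a short evidence-based dialogue, re-rate."""
    before = rate_confidence()  # e.g. 0-100 belief rating before the conversation
    history = [
        {"role": "system",
         "content": "Politely rebut the stated belief with specific, accurate evidence."},
        {"role": "user", "content": belief_text},
    ]
    for _ in range(n_turns):  # roughly eight minutes of back-and-forth
        history.append({"role": "assistant", "content": model_reply(history)})
        history.append({"role": "user", "content": user_turn(history)})
    after = rate_confidence()  # re-rate after the conversation
    return before - after      # positive = belief weakened

# Toy run with canned participant behavior:
ratings = iter([80, 55])
drop = debunk_session("The moon landing was staged.",
                      rate_confidence=lambda: next(ratings),
                      user_turn=lambda h: "Hmm, but what about ...?")
print(f"Confidence dropped by {drop} points")  # -> Confidence dropped by 25 points
```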
- What Works (and What Doesn’t):
- Evidence/facts drive change, not sweet talk. The politeness or friendly “flattery” of the chatbot is less important than its ability to present relevant, specific information.
- [56:07 | Notable Quote]
Pennycook: “It’s the facts that matter. … You cannot sweet talk somebody into changing their beliefs.”
- AI as a Unique Tool
- Chatbots have infinite patience and broad knowledge, making them far better equipped than a human to address the Gish Gallop of conspiracy talking points.
- Caveats:
- Current research used fact-checking to ensure the chatbot was accurate; “hallucinations” are rare for the class of popular conspiracy topics tested.
8. Broader Implications and Optimism
- Information Environment is the Battleground
- If misinformation is winning, it’s more about our information environment than an inherent defect in human nature.
- [58:54 | Notable Quote]
Pennycook: “If we’re losing the misinformation war, it’s not because of people. It’s because of the information environment itself.”
- Hopeful Outlook
- The majority of people are persuadable—wanting reasons, evidence, and discussion. AI might help satisfy this need in the scaled-up digital public sphere.
- Skepticism is contagious—a successful intervention on one conspiracy can slightly enhance skepticism toward others.
Memorable Quotes & Moments (with timestamps)
- System 1 vs. System 2
Pennycook: “Knowing when to expend effort is really the kind of the trick to making better choices.” [13:24]
- Defining Bullshit
Pennycook: “You can bullshit about something that’s true … it’s about your orientation towards the truth …” [09:42]
- Overconfidence in Conspiracy Believers
Pennycook: “...those are the people who tend to be more likely to believe conspiracies ...” [43:01]
- Chatbots Can Deprogram
Pennycook: “...after the conversation, which lasts about eight minutes, 25% of them don’t believe it anymore.” [52:47]
- Conversation Makes People Think
Pennycook: “Simply going through the exercise of writing out the reasons for why you believe something is enough to kind of decrease how certain you are about it.” [56:49]
- Reflectiveness is Learnable (in context)
Pennycook: “We just remind people about accuracy… those little things can get people to be a little bit more reflective about the truth.” [39:19]
- Bigger Picture
Carroll: “There is a bias built into the system for ultimately true beliefs to prevail. I would like that to be true. … Maybe you’re giving me a little glimmer of hope here.” [66:56]
Noteworthy Segments & Timestamps
| Topic | Timestamp |
|---|---|
| Pseudo-profound bullshit research & Ig Nobel | 04:31–09:42 |
| Cognitive psychology: intuition vs. deliberation | 11:18–14:17 |
| Effects of deliberate, analytic thinking | 13:24–17:03, 19:52–21:01 |
| Overconfidence: link to conspiracy beliefs | 41:51–46:00 |
| Political, cultural, and demographic notes | 24:19–27:44 |
| Motivated vs. “unthinking” reasoning | 32:05–34:12 |
| False consensus among conspiracy theorists | 46:00–49:08 |
| AI chatbot experiments and their effects | 50:48–57:34 |
| Long-term effects & “transfer” of skepticism | 60:26–62:45 |
| Cautions about AI hallucinations | 62:45–64:16 |
| Lessons and optimistic outlook | 65:32–69:48 |
Closing Takeaway
The central finding: Much susceptibility to misinformation and conspiracy theories is due to “unthinkingness”—not thinking deeply or critically about claims—rather than powerful tribal or motivated reasoning. The solution isn’t just more education or fighting tribalism, but cultivating habits and environments that nudge more reflective thinking. New research shows promise for AI chatbots as scalable “debunking” partners, giving humanity a surprisingly optimistic tool in the fight against widespread misinformation.
Guest: Dr. Gordon Pennycook | Host: Sean Carroll
Mindscape Podcast, Episode 333
October 27, 2025
