Podcast Summary: The Rational Reminder Podcast
Episode 361: Alex Edmans – Finding "The Truth" in Economics, Finance, and Life
Date: June 12, 2025
Host: Benjamin Felix
Guest: Professor Alex Edmans (London Business School)
Overview
In this episode, Benjamin Felix interviews Alex Edmans, Professor of Finance at London Business School, about his book May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases and What We Can Do About It (2024). The core theme is the rising difficulty of finding truth in an era overflowing with information, where even facts, studies, and expert statements are regularly misunderstood, misrepresented, or weaponized. The conversation centers on Edmans' concept of the "ladder of misinference," exploring how we misinterpret information at various levels and what we can do to guard against personal and societal biases.
Key Discussion Points
1. The Prevalence and Dangers of Misinformation ([03:41]–[07:12])
- UK Lawmaking & Misinformation Example:
- Edmans recounts witnessing UK legislation being influenced by a misinterpretation of economic evidence about CEO and worker pay gaps ([03:41]).
- The incorrect evidence was cited in parliamentary committees and ultimately made its way into law, despite Edmans submitting a clarification ([05:39]).
- Memorable Quote:
"Even evidence from a supposedly trusted source like a UK government report may well not be gospel."
— Alex Edmans ([06:54])
- Post-Truth World:
- The problem isn't simply outright lies, but truths presented in misleading ways—truths that have been superseded, taken out of context, or selectively cited ([07:17]).
2. Confirmation Bias and Its Mechanisms ([08:27]–[12:09])
- Nature of Confirmation Bias:
- Your intelligence doesn’t insulate you from confirmation bias—it’s a universal human trait.
- Brain studies show that confirming information feels rewarding (dopamine), while disconfirming info triggers discomfort (fight-or-flight responses) ([09:46]).
- Notable Quote:
"We are actually addicted to seeing things that confirm what we'd like to be true."
— Alex Edmans ([09:46])
- Practical Strategies:
- Actively seek disconfirming evidence.
- Ask yourself: "Do I want this to be true?"—if so, be extra skeptical ([11:14]).
- Notable Quote:
"If there is a study or a claim on something that's important and you really want it to be true, then you need to be particularly careful."
— Alex Edmans ([11:26])
- Severity and Outcomes:
- Confirmation bias has been identified as a leading cause of investigative failures, including wrongful criminal convictions ([12:09]).
3. Selective Interpretation and Biased Search ([13:52]–[18:30])
- Interpretation vs. Search Bias:
- People don't just interpret facts selectively—they also seek out information that affirms their beliefs, reinforcing echo chambers ([15:15]).
- The way you frame a question in Google or AI tools (e.g., "Is red wine good for you?") shapes the answers you find ([15:19], [18:30]).
- Anecdote:
- Edmans shares a story about a student hesitant to challenge him due to cultural and organizational norms, emphasizing the importance of inviting dissent ([17:00]).
- Limitations of Generative AI:
- AI can reinforce bias by providing answers that align with the user’s interests unless deliberately challenged ([18:30]).
4. Expertise and Knowledge—A Double-Edged Sword ([19:39]–[21:23])
- Expertise doesn’t protect against bias; in fact, it may make you better at rationalizing away disconfirming evidence ([19:39]).
- Deepwater Horizon Example:
- Highly trained engineers invented the "bladder effect" to dismiss failed safety tests, ultimately leading to disaster ([20:30]).
5. Black and White Thinking and Its Consequences ([21:29]–[25:40])
- How Binary Thinking Skews Reality:
- Black-and-white thinking drives people to overly simple solutions (e.g., the Atkins diet's blanket demonization of carbs).
- Memorable Quote:
"Atkins did not need to be right, he just needed to be extreme."
— Alex Edmans ([23:16])
- Dangers of Oversimplification:
- Even "good" things (like water or employee autonomy) can be harmful when taken to extremes ([23:26]).
- Nuance and the “middle ground” are often ignored in both popular culture and management thought.
6. The Ladder of Misinference ([25:50]–[59:14])
- Purpose:
- To make the various types of misinformation digestible and memorable, Edmans introduces four major "rungs":
- Statement is not fact
- Fact is not data
- Data is not evidence
- Evidence is not proof
(1) Statement is Not Fact ([26:08]–[32:18])
- People often accept statements from gurus or celebrities as "truth" without evidence (e.g., "Culture eats strategy for breakfast").
- Myth Example:
- "10,000-Hour Rule": widely cited, but the research behind it is deeply flawed ([28:45]).
- Quote:
"Even if [a statement] is said to be based on data and science, you could just be quoting selectively..."
— Alex Edmans ([32:27])
(2) Fact is Not Data ([38:18])
- Anecdotes (facts) are vivid but can be cherry-picked (e.g., the success stories in Simon Sinek's "Start with Why").
- True knowledge demands considering all relevant cases, not just successes ([40:47]).
(3) Data is Not Evidence ([44:36])
- Large datasets can establish correlation, but not causation.
- Alternative explanations for findings are often ignored.
- Data Mining Warning:
- Researchers may search for measures that confirm their thesis among many (e.g., searching through 24 different diversity measures and reporting only the one that "works").
- Real-World Impact Example:
- Funds launched using illusory correlations between gender diversity and returns have underperformed ([48:45]).
(4) Evidence is Not Proof ([54:40])
- Even when causality can be established—for example, with a natural experiment—the result might only hold in a specific context.
- Many studies are conducted on "WEIRD" (Western, Educated, Industrialized, Rich, Democratic) populations and aren't universally generalizable ([58:09]).
- Memorable Quote:
"A proof is universal... Often when we look at evidence, it might be gathered only in one particular setting."
— Alex Edmans ([55:54])
7. Seeking Truth in the Real World ([59:14]–[63:08])
- Checklist for Discernment:
- Ask: Is there evidence? Is it one anecdote or large-scale data? Could there be alternative explanations? What is the context and setting?
- Appreciate and Encourage Dissent:
- Leaders should explicitly welcome challenges and recognize dissent to foster better decision-making ([59:28]).
- Society & Education:
- The value of teaching critical thinking, alternative explanations, and nuance from a young age ([61:41]).
- Analogy:
“Just like we teach a kid, don't accept sweets from strangers, there's an alternative explanation... Given that kids are able to grasp that, I don't think it's unrealistic to teach them the power of thinking about alternative explanations.”
— Alex Edmans ([61:41])
Notable Quotes
- "We are actually addicted to seeing things that confirm what we'd like to be true." — Alex Edmans ([09:46])
- "The solution to misinformation is not for everybody to get a PhD in statistics... The skills for discernment are already within us." — Alex Edmans ([14:01])
- "Atkins did not need to be right, he just needed to be extreme." — Alex Edmans ([23:19])
- "You can find a study to show whatever you want to support. What matters is the quality of the research." — Alex Edmans ([51:38])
- "A proof is universal... Evidence might be gathered only in one particular setting." — Alex Edmans ([55:54])
- "Just like candy is appealing, we want to accept it. We don't want to ask the question... we simply would like to believe that study and to consume it." — Alex Edmans ([63:15])
Key Timestamps
- 03:41–07:12: Misinformation in UK Lawmaking
- 08:27–12:09: Nature and Power of Confirmation Bias
- 15:15–18:30: Biased Search, Echo Chambers, and AI/Google Risks
- 19:39–21:23: Expertise Can Increase Defense of Prior Beliefs
- 21:29–25:40: Dangers of Black and White Thinking
- 25:50–59:14: The Ladder of Misinference Explained
  - 10,000 Hour Rule ([28:45])
  - Why We Sleep ([32:27])
  - Cherry-picking Facts/Data ([40:43])
  - Data Mining ([46:28])
  - Diversity Fund Example ([48:45])
- 59:14–63:08: Toolkit for Seeking the Truth and Societal Implications
Natural Tone & Language
Both Ben and Alex keep the conversation accessible yet rigorous, using vivid real-world stories, practical analogies, and a good mix of humor and humility (“I myself have fallen for this,” etc.). They don’t shy away from self-critique or acknowledging the messiness of truth-finding in finance, economics, or life.
Conclusion
Alex Edmans’ conversation is a strong reality check for anyone who wants to make better, rational decisions in a noisy, biased world. The "ladder of misinference" presents a practical framework for critical thinking, helping listeners move from deference to authority or viral anecdotes toward a more scientific, skeptical, and open-minded approach—an approach crucial not just for individual investors, but for organizations and society as a whole.
