Weaponizing Uncertainty: How Tech is Recycling Big Tobacco’s Playbook
Episode Release Date: March 20, 2025
Host/Authors: Tristan Harris and Daniel (Co-host)
Guest: Naomi Oreskes, Historian and Author of Merchants of Doubt and The Big Myth
Produced By: Julia Scott, Joshua Lash, Sasha Fegan
Affiliation: The Center for Humane Technology, TED Audio Collective
Introduction
In this compelling episode of Your Undivided Attention, hosts Tristan Harris and Daniel delve into the alarming parallels between the tactics employed by Big Tobacco to sow doubt and the strategies modern tech industries use today, particularly in the realm of artificial intelligence (AI). Historian Naomi Oreskes joins the conversation to shed light on these manipulative practices and discuss pathways to foster informed decision-making amidst pervasive uncertainty.
Historical Context: Merchants of Doubt
Naomi Oreskes, renowned for her influential work, Merchants of Doubt, provides a historical overview of how certain industries have strategically manipulated scientific uncertainty to delay regulation and protect their interests.
- Origins of Doubt-Mongering: Oreskes explains that the original "Merchants of Doubt" were prominent Cold War physicists who, despite lacking expertise in public health or climate science, actively undermined scientific consensus across various industries, including tobacco and fossil fuels.
“They raised questions in the pages of the Wall Street Journal or Fortune and Forbes, not in peer-reviewed journals.” [02:41] Naomi Oreskes
- Ideological Motivations: These scientists were driven not by scientific skepticism but by ideological fears, particularly the dread that government regulation could lead to communism. This ideological stance was the cornerstone of their campaigns against scientific findings that called for regulatory action.
“They feared that if the government became more involved in regulating the economy through environmental or public health regulation, it would be a backdoor to communism.” [07:43] Naomi Oreskes
Tactics Used by Industries to Manufacture Doubt
Naomi Oreskes elaborates on the sophisticated and multifaceted strategies industries employ to create and maintain public doubt about scientific consensus.
1. Creating Uncertainty
- Strategic Emphasis on Doubt: Instead of directly disputing scientific findings, industries promote the notion that the science is unsettled, thereby justifying inaction.
“They say, well, we don't really know, we need more data, we should do more research.” [08:49] Naomi Oreskes
- Manipulating Scientific Processes: By inviting scientists to debates in non-scientific forums, industries create an illusion of controversy where scientific consensus already exists.
“By agreeing to debate, the scientist loses before he or she has even opened their mouth... the audience says, oh, there is a debate.” [10:48] Naomi Oreskes
2. Astroturfing
- Fake Grassroots Organizations: Industries establish seemingly independent organizations that appear to represent public interests but are in fact funded and controlled by the industries themselves.
“Citizens for Fire Safety was created and wholly funded by the tobacco industry to fight tobacco regulation.” [13:35] Naomi Oreskes
- Distracting Public Focus: These front groups redirect attention from the harmful product itself to tangential issues, blunting the force of regulatory measures.
“They persuaded people to pass regulations requiring flame retardants in pajamas, shifting blame from cigarettes to bedding materials.” [14:35] Naomi Oreskes
3. Emphasizing Personal Responsibility
- Shifting Blame to Individuals: By promoting the idea that harmful outcomes result from individual choices rather than systemic issues, industries absolve themselves of responsibility.
“It's up to you to have personal responsibility with how many Doritos you have.” [16:46] Tristan Harris
- Manufacturing a Sense of Agency: Campaigns such as personal carbon footprint calculators channel public energy into individual action, distracting from the need for broader systemic change.
“Reducing your carbon footprint isn't bad, but it shifts agency away from producers.” [19:47] Naomi Oreskes
Implications for Modern Technology and AI
The conversation transitions to the application of these tactics in today's tech landscape, particularly concerning AI.
1. Recycling the Big Tobacco Playbook
- Similar Strategies in Tech: The tech industry, especially in AI, employs doubt-mongering to stave off regulation, mirroring Big Tobacco’s historical strategies.
“We are seeing 30 years of organized disinformation and campaigns to prevent governments from doing what they promised.” [26:30] Naomi Oreskes
- AI-Specific Concerns: Frontier AI labs deflect safety concerns by framing critics as out-of-touch or overly cautious, thereby undermining legitimate fears about AI risks.
“We believe in a science-based approach to studying AI risk, which basically meant pre-framing all of the people who are safety concerned as sci-fi oriented.” [29:39] Tristan Harris
2. The Challenge of Rapid Technological Advancement
- Speed vs. Science: Technological advancements, particularly in AI, outpace scientific understanding, complicating the regulatory landscape.
“The technology moves faster than the science.” [32:26] Daniel
- Adaptive Management as a Solution: Drawing parallels to the Montreal Protocol, Oreskes advocates adaptive management strategies that allow regulations to evolve with emerging scientific insights.
“Adaptive management is about acting on what we know now but being prepared to adjust in the future.” [25:27] Naomi Oreskes
Adaptive Management as a Path Forward
Naomi Oreskes emphasizes the necessity of flexible, responsive regulatory frameworks that can adapt to new information and evolving technologies.
- Lessons from the Montreal Protocol: The successful international response to the ozone hole exemplifies how adaptive management can effectively integrate scientific findings with policy-making.
“The Montreal Protocol had a feature for adaptive management, allowing regulations to tighten or relax based on new information.” [25:08] Naomi Oreskes
- Establishing Commissions with Sunset Clauses: For emerging technologies like AI, Oreskes suggests creating commissions with defined terms that can be reviewed and renewed based on their effectiveness and the latest scientific data.
“Maybe for AI, we should have some kind of commission on AI safety that has a 10-year term.” [43:28] Naomi Oreskes
The Importance of Trustworthiness and Inclusive Governance
Building trust and ensuring diverse representation in regulatory bodies are crucial for effective governance of emerging technologies.
- Balancing Expertise and Diverse Perspectives: While technologists possess deep technical knowledge, incorporating policy experts, stakeholders, and social scientists ensures that regulations account for broader societal impacts.
“Technologists know the most about the technology, but they don’t necessarily know the most about how these things will influence the users.” [34:24] Naomi Oreskes
- Countering Epistemic Privilege: Recognizing that industry experts may have conflicting incentives, inclusive governance structures can mitigate biased decision-making and promote the public interest.
“There's more than one kind of expertise needed, and technical expertise is necessary but not sufficient.” [35:29] Naomi Oreskes
Fostering Public Resilience Against Disinformation
The episode concludes with strategies for individuals to navigate a complex information landscape fraught with manufactured doubt.
- Critical Thinking and Questioning Motives: Oreskes encourages listeners to critically assess sources of information by asking who benefits from a given narrative.
“Ask, who benefits from what they're saying and what is their interest?” [46:48] Naomi Oreskes
- Building Trust Through Effective Communication: Scientists and experts must engage with the public empathetically, listening to concerns and presenting information in accessible ways to rebuild trust.
“Scientists need to relate to people on a human level and recognize the importance of listening.” [48:56] Naomi Oreskes
- Embracing Adaptive Management in Daily Decision-Making: Just as adaptive management applies to policy, individuals routinely make decisions amid uncertainty, trusting their judgment based on the information available.
“We live with uncertainty in our daily lives, making judgments based on the information we have.” [20:36] Naomi Oreskes
Conclusion and Key Takeaways
- Recognition of Manipulative Tactics: Understanding the historical and contemporary strategies used to manufacture doubt is essential to combating misinformation and fostering informed public discourse.
- Adaptive and Inclusive Regulatory Frameworks: Adaptive management and inclusive governance models can better address the rapid pace of technological advancement and its societal implications.
- Empowering Individuals with Critical Tools: Encouraging critical thinking, questioning motives, and fostering trust in credible sources are vital to building resilience against disinformation.
- The Role of Collective Effort: Addressing complex issues like climate change and AI requires collaborative efforts that integrate diverse expertise and prioritize the common good over individual or corporate interests.
Naomi Oreskes' insights provide a clarion call to recognize and counteract the strategies that undermine scientific consensus, ensuring that society can navigate the challenges of emerging technologies with clarity and purpose.
Notable Quotes with Timestamps:
- “They raised questions in the pages of the Wall Street Journal or Fortune and Forbes, not in peer-reviewed journals.” — Naomi Oreskes [02:41]
- “They feared that if the government became more involved in regulating the economy through environmental or public health regulation, it would be a backdoor to communism.” — Naomi Oreskes [07:43]
- “By agreeing to debate, the scientist loses before he or she has even opened their mouth... the audience says, oh, there is a debate.” — Naomi Oreskes [10:48]
- “Reducing your carbon footprint isn't bad, but it shifts agency away from producers.” — Naomi Oreskes [19:47]
- “Adaptive management is about acting on what we know now but being prepared to adjust in the future.” — Naomi Oreskes [25:27]
- “Ask, who benefits from what they're saying and what is their interest?” — Naomi Oreskes [46:48]
- “Scientists need to relate to people on a human level and recognize the importance of listening.” — Naomi Oreskes [48:56]
This episode serves as a vital exploration of how historical tactics of manufactured doubt are being repurposed by modern tech industries to influence public perception and policy. Naomi Oreskes' expertise offers actionable insights into recognizing and countering these strategies, emphasizing the need for adaptive governance and informed citizenship in an era dominated by rapid technological change.
