Podcast Summary: THE WAR ROOM WITH STEPHEN K. BANNON (Ep. 4788, September 17, 2025)
Podcast: Real America’s Voice
Host: Stephen K. Bannon
Date: September 17, 2025
Main Guests: Eliezer Yudkowsky, Nate Soares, Joe Allen, Shemane Nugent, Peter Navarro
Episode Overview
This episode of “The War Room” revolves around the existential risks posed by artificial intelligence, featuring a deep-dive conversation with Eliezer Yudkowsky and Nate Soares — co-authors of the cautionary book If Anyone Builds It, Everyone Dies. The episode also features reflections on the tragic assassination of Charlie Kirk, commentary from Dr. Peter Navarro on political lawfare and his new book, and an interlude with Shemane Nugent on faith, activism, and wellness.
Key Segments & Insights
1. The Existential Risks of AI
(00:31 — 11:31)
Main Discussion:
- Populist Concerns vs. AI Realism
  - Stephen Bannon addresses accusations of being “Luddites,” emphasizing that warning about AI is not a rejection of technology but a call to heed the expertise of early innovators.
  - Eliezer Yudkowsky and Nate Soares clarify their positions: they support many advanced technologies (e.g., nuclear power, supersonic jets) but hold that AI represents a unique, planetary-level risk.
Key Points:
- Catastrophic Risk Perspective
  - “It’s different when a technology risks the lives of everybody on the planet… With AI, we are dealing with much, much higher dangers than that. We’re dealing with a technology that just kills you.” — Nate Soares [01:20]
- Risk Comparison
  - Reference to the low risk thresholds NASA accepts for spaceflight, contrasted with the far higher (and poorly understood) dangers of advanced AI.
- Superintelligence and Global Policy
  - International collaboration, possibly akin to a new Geneva treaty, is necessary to address AI, mirroring the Cold War-era nuclear arms coordination achieved even between adversarial nations.
  - “…countries that hated each other [worked] together to prevent global thermonuclear war… That’s what we need to reproduce today.” — Eliezer Yudkowsky [04:16]
- Clarification on the China Question
  - Yudkowsky and Soares are not advocating unilateral US or UK withdrawal from AI development; they insist on verified global treaties, noting that even Chinese officials sometimes recognize the existential stakes.

The Book’s Message:
- If Anyone Builds It, Everyone Dies signifies that superintelligence, once unleashed, escapes even the control of its creators — it’s not about a tyrant wielding it, but about something fundamentally uncontrollable by humans.
  - “If you make a superintelligence, you don’t have that superintelligence. You’ve just created an entity that has the planet.” — Nate Soares [05:49]
2. The Mindset of Silicon Valley & Tech Acceleration
(07:17 — 11:31)
- Joe Allen observes that warnings about AI are now echoed by leading figures across the tech industry, including both moderate and extreme AI developers.
  - “Every one of them… Demis Hassabis… Sam Altman, Elon Musk, even Dario Amodei, they all are talking about the creation of artificial general and artificial super intelligence.” — Joe Allen [07:17]
- Nate Soares describes Silicon Valley’s ethos:
  - “…I had a choice between being a bystander and being a participant and I preferred being a participant… That mentality… it’s not someone treating with gravity what they’re doing.” — Nate Soares [09:16]
- The allure of building powerful technology leads developers to disregard risks, with some rationalizing that “great companies” will be built even if AGI ends humanity.
- Yudkowsky points to Geoffrey Hinton’s recent activism since leaving Google, illustrating the divergence between those who profit or push ahead and those who reconsider the moral stakes.
3. Decelerationism and the Future Path
(12:10 — 14:05)
- The host asks both guests whether they identify as “decelerationists” — those who advocate slowing AI progress for safety.
  - “I would decelerate AI. I would decelerate any technology that could wipe us all out… Every other technology, I think we need to go full steam ahead…” — Nate Soares [12:44]
  - “If a product kills people on the other side of the planet, that’s everybody’s problem… Artificial intelligence, gain-of-function research on viruses—might be another thing… There’s not this one switch that’s set to accel or decel.” — Eliezer Yudkowsky [13:08]
4. Remembering Charlie Kirk & Community Reflections
(15:28 — 25:47)
- Shemane Nugent reflects on the impact of Charlie Kirk’s assassination:
  - “He was the last good guy… so many people are wondering, what do we do?... Silence is not an option at this point. We must move forward…” — Shemane Nugent [15:28]
- Dr. Peter Navarro extols Kirk’s legacy, placing him above even Ralph Reed and David Axelrod for his groundbreaking work mobilizing youth, calling him “the greatest political organizer of the last 50 years.” [19:57]
  - “…to mobilize the youth in support of MAGA and Trump and MAGA candidates in Congress, he had to first bring them over to our side… he proved me wrong, he proved the world wrong.” — Peter Navarro [19:57]
- The segment also features personal anecdotes and emphasizes that Kirk’s organizational and persuasive gifts deeply shaped the conservative movement.
5. Lawfare, Political Reprisals & Navarro’s Book
(25:47 — 32:41)
- Peter Navarro introduces his new book I Went to Prison So You Won’t Have To, describing it as both a political warning and a behind-the-scenes look at targeted prosecutions.
  - “…how the left is going after all of this. If they can come for me, they can come for Steve Bannon… they can do this to you… It’s a story about how we must wake up to what’s happening.” — Peter Navarro [28:45]
- Navarro underscores the “asymmetry” of the current political conflict, the stakes of ongoing lawfare, and the urgency of public awareness heading into 2028.
- He notes that family members of those targeted by political prosecution, as in his own case, bear the emotional weight as much as or more than the primary targets.
Notable Quotes
- On Superintelligence:
  “AIs may have those effects… but a super intelligence, it kills everybody. Then you’ll have no jobs. You’ll also have full employment, but that’s just the default course this technology takes.” — Nate Soares [01:20]
- On Global AI Policy:
  “…at least work together on not all dying in a fire. And that’s what we need to reproduce today.” — Eliezer Yudkowsky [04:16]
- On Silicon Valley’s Ethos:
  “AI might kill us all, but there’ll be good companies along the way.” — paraphrased statement attributed to Sam Altman, cited by Nate Soares [09:16]
- On the Loss of Charlie Kirk:
  “He will go down in history as the greatest political organizer of the last 50 years…” — Peter Navarro [19:57]
- On Political Lawfare:
  “…they can come for Steve Bannon, put him in prison. If they can try to put Trump in prison now, they shot Charlie Kirk. They can do this to you.” — Peter Navarro [28:45]
Timestamps & Important Segments
| Timestamp | Segment |
|------------|------------------------------------------------------------------|
| 00:31–05:38 | Existential danger of AI; need for international treaties |
| 05:49–11:31 | Book title explained, Silicon Valley mindset, Joe Allen’s input |
| 12:10–14:05 | Decelerationism vs. accelerationism on AI |
| 15:28–19:31 | Shemane Nugent on Charlie Kirk, loss and faith-based activism |
| 19:57–25:47 | Peter Navarro memorializes Charlie Kirk; organizing youth |
| 28:45–32:41 | Navarro discusses his new book and political lawfare |
Final Takeaways
- The episode provides a stark warning from the frontlines of AI research that current global governance structures are wholly unprepared for the existential dangers posed by unrestrained AI development.
- The tragic death of Charlie Kirk looms over the political discussions, serving as both a rallying cry for continued activism and a somber reflection on the costs of political conflict.
- The show weaves together AI existential risks, movement-building, faith, and contemporary political struggles to paint a picture of a society at a crossroads, facing both technological peril and intensified societal divisions.
Recommended Listening:
For listeners wanting further depth, Joe Allen’s extended interview with Nate Soares and his coverage of the recent Senate hearing on AI-induced harms are available on his platforms (joebot.xyz).
