Breaking Points with Krystal and Saagar
Episode: September 25, 2025
Title: Iran Warns Of Israeli Attack, Data Centers Spike Electricity Cost, AI Takeover Dire Warning
Hosts: Ryan Grim, Sagar Enjeti
Guest: Nate Soares (Machine Intelligence Research Institute)
Podcast: iHeartPodcasts
Episode Overview
This episode dives deep into three major stories:
- Iran’s escalating tensions with Israel and U.S. foreign policy.
- The impact of AI-driven data centers on rising electricity costs and local economies.
- A dire warning about artificial superintelligence from AI expert Nate Soares.
The hosts dissect urgent global politics, corporate overreach, and technological existential risks, offering both ground-level reporting and high-level analysis.
1. Iran Warns of Imminent Israeli Attack
(Main Segment: 02:36–17:49)
Key Discussion Points
Inside Iran’s UN Visit
- Ryan Grim recounts attending a meeting with the Iranian president, who was in New York for UN meetings.
- The Iranian president denounces Israel for “genocide, mass starvation, and apartheid” in the occupied territories and criticizes the global community’s inaction.
“The ludicrous and delusional scheme of a Greater Israel is being proclaimed with brazenness by the highest echelons of that regime.”
(Iranian President, 02:53)
Iran’s New Leadership and Challenges
- Sagar gives context on Iran’s president, Masoud Pezeshkian, elected for his moderate reformism and openness to nuclear negotiations.
- The assassination of moderate negotiators like Haniyeh by Israel has fueled hardliner arguments against diplomacy.
Fear and Resignation in Iran’s Strategy
- Pezeshkian’s emotional response to witnessing violence against children in Gaza; a sense of being overwhelmed and targeted.
- Iran’s new “chain of succession” protocol:
“If they take me out, there are five to six people to take my place.”
(Sagar Enjeti paraphrasing Iranian President, 05:27)
Israel’s Approach to Diplomacy
- Discussion of how Israel often targets moderates, while hardliners are typically left untouched because they fit the narrative of endless conflict.
“Historically…the people arguing for more moderation within those organizations have been more likely to be assassinated by Israel than the hardliners, because the hardliners serve a narrative purpose.”
(Sagar Enjeti, 07:04)
Iran’s Disconnection from the U.S. System
- Iranians’ inability to hire lobbyists due to sanctions; they’re left navigating U.S. politics via media coverage.
- Most Iranian officials have scant understanding of American political processes.
“They basically can't hire any lobbyists…They're left to just read the New York Times and watch Fox News or Tucker Carlson to try to figure out how they can maneuver.”
(Sagar Enjeti, 08:24)
Prediction of Imminent Attack
- The Iranian leadership expects another Israeli attack soon, fueled by rhetoric from Netanyahu.
“They are basically certain of it and it’s a matter of when.”
(Sagar Enjeti, 09:51)
Analysis of the '12 Day War' and Fallout
- After U.S. airstrikes on Iranian nuclear facilities, Iran’s retaliation was deliberately limited.
- Iran’s missile capabilities proved more effective than Israel expected, which may have influenced the war’s resolution.
“Their ability to get missiles through the Israeli air shield is what actually pushed them to end it.”
(Sagar Enjeti, 11:55)
Intelligence Showdowns
- Iran releasing alleged Israeli nuclear program intelligence as a show of force and capability.
Notable Quotes & Memorable Moments
- “Iran is not Gaza. Iran is not Lebanon. Iran is not Syria. Iran is different…this plan you have is like, we're not going anywhere. You have to live with us.”
(Sagar Enjeti, speaking from the Iranian perspective, 13:28)
- “How are we the terrorists?”
(Sagar Enjeti, reflecting an Iranian complaint, 14:38)
2. Data Centers and Exploding Electricity Costs
(Main Segment: 19:54–31:37)
Key Discussion Points
North Carolina’s Data Center Legislation
- State bill S266 prioritizes data centers over consumers for power allocation and shifts more cost onto residential customers.
“If there is a contest between consumers and data centers over who gets the power…the data centers are going to get it…your electricity bills are going to go up to, you know, subsidize these giant data centers.”
(Sagar Enjeti & Ryan Grim, 23:06–24:36)
- The bill was pushed by the former CEO of Duke Energy, and the governor’s veto was overridden with bipartisan support.
The Impact on Renewable Energy Jobs
- Tens of thousands of clean-energy jobs in North Carolina, such as those at Blue Ridge Power, are being eliminated as the state pivots back to favoring big utilities and data centers.
“This layoff of 517 workers…they’re just rolling up shop…because of this new landscape.”
(Sagar Enjeti, 20:19)
Data Centers Strain on Local Resources
- Massive water use as AI data centers rise; rural communities witness land and resource takeovers for minimal local benefit.
“AI data centers would need millions of gallons of North Carolina’s water supply a day.”
(Sagar Enjeti, 26:12)
Critique of AI-Driven Economic Justification
- The supposed tech advancement boils down to “your electric bill is going up 30% so you can chat with this horse.”
(Sagar Enjeti on Meta’s chatbot, 25:56)
China’s Renewed Commitment to Green Energy
- Xi Jinping’s UN announcement: “China will not slow down its climate actions…will not cease its efforts to build a community with a shared future for mankind.”
(Ryan Grim paraphrasing, 29:17)
US Political Dynamics and Renewable Energy
- Trump’s attacks on green energy, job losses in renewables, and the shifting political narrative around energy policy.
Notable Quotes
- “It used to be the lefty green energy people who were supposedly the job-killing haters. And now…we’re in a totally different dynamic…jobs and cheap energy is in the direction of investing in renewable and green energy.”
(Ryan Grim, 30:45)
- “We’re maxed out on the dirty stuff. If you want to expand energy production, you have to do it through these new types of energies and Trump doesn't want to do it.”
(Sagar Enjeti, 31:17)
3. AI Takeover: Existential Risk & Dire Warning
(Main Segment: 33:40–59:10)
Key Discussion Points
Guest: Nate Soares on 'Superintelligence'
- Title of his new book: If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All.
“If they develop superintelligence…humanity will be destroyed.”
(Ryan Grim summarizing, 34:19)
Why Superintelligence Is Unique and Deadly
- AIs aren’t “programmed” so much as “grown”—we don’t know what’s going on inside them.
“We’ve already seen signs that they are developing the very beginnings of preferences and drives perhaps that nobody wants, nobody asked for.”
(Nate Soares, 34:38)
- There are no second chances—if superintelligent AI gets it wrong, the consequences are irreversible.
“If we build machines that are smarter than us…and if they run off autonomously and grab control of the world’s resources, there’s no redos.”
(Nate Soares, 35:20)
Tech Industry’s Dismissive Attitude
- Even leading industry figures admit significant risk; optimism is often based on wishful thinking or unproven coping strategies.
“Even the people who are most optimistic say there’s a 1 in 4 chance this kills everybody…”
(Nate Soares, 35:52)
- Even leading industry figures admit significant risk; optimism is often based on wishful thinking or unproven coping strategies.
-
Opaque & Uncontrollable AI Systems
- Modern deep learning systems are unpredictable and not understood by their creators.
“We’re already seeing signs…that the people who would like to be in charge can’t point these AIs where they would like to point them.”
(Nate Soares, 39:40)
Dangerous Industry Practices
- Corporations are in a race to scale up AI with minimal oversight—putting AIs online for anyone to use rather than keeping them restricted.
“Now people are putting AIs on the Internet the moment they possibly can.”
(Nate Soares, 45:18)
Common Objections and Pet Theories
- “Maybe we’ll be AI’s pets”—Soares argues even that is an unacceptable risk.
- The primary way humans might die is not direct intent, but as a side effect of AIs automating and replacing our infrastructure.
“We die not because it literally tries to kill us, but because it just starts building automated factories…maybe we just die underfoot.”
(Nate Soares, 47:10)
Why AI Alignment Efforts Are Inadequate
- The hope that we’ll “use AI to solve the alignment problem before we build superintelligence” is illogical; the smarter the AI, the less control we have and the harder alignment gets.
Industry Acknowledges the Risks—But Keeps Racing
- Industry leaders, including Elon Musk and the head of Anthropic, openly admit to a 10–25% chance of total catastrophe, but continue development in a classic “If I don’t, someone else will” mindset.
“If one engineer was saying, I looked at this plane…I think this plane is 20% likely to crash…You don’t get on the plane.”
(Ryan Grim, 50:39)
Necessary Solutions
- Soares calls for a global moratorium on racing to superintelligence, not a minor reform or alignment check.
“We need to give up on the race towards smarter than human AI…the race towards superintelligence. This is what the labs say they are racing towards…this is a suicide race.”
(Nate Soares, 54:24)
- He believes this is still possible with international coordination or mutual national deterrence.
- Timeline? Hard to predict, but “a child born today has a greater chance of dying from AI than of graduating high school.” (58:53)
Notable Quotes
- “Humanity usually learns from its mistakes…In this situation…the hopeful optimists who are saying everything’s going to be easy, they’re going to screw things up the first time and there’s not going to be any chance to learn.” (Nate Soares, 51:30)
- “My bet is that a child born today has a greater chance of dying from AI than of graduating high school.”
(Nate Soares, 58:53)
- “Here’s hoping I’m wrong.”
(Nate Soares, 59:09)
Useful Timestamps
- Iran Segment Begins: 02:36
- Iranian President Clip: 02:53
- Iran Analysis: 04:00–17:10
- Electricity/Data Center Crisis Intro: 19:54
- North Carolina Data Center Bill Context: 23:06–24:36
- AI Segment with Nate Soares: 33:40–59:10
- Dire Risk and Call to Halt Superintelligence: 54:24–59:10
Tone & Final Thoughts
The episode is piercing, urgent, and openly skeptical of establishment narratives—whether state policies, corporate interests, or technocratic optimism. It blends investigative insight, sharp humor, and existential dread, especially in the AI warning. Listeners come away with a layered view: geo-strategy is dangerously unmoored, American policy is often for sale to corporate interests, and runaway technology could genuinely threaten humanity’s survival.
