Intelligence Squared Podcast Summary
Episode: Is AI About to Automate War? With Anthony King
Date: September 30, 2025
Host: Adam McCauley
Guest: Professor Anthony King (Professor of War Studies & Director, Strategy and Security Institute, University of Exeter)
Overview:
This episode explores the realities and myths surrounding artificial intelligence in military settings, guided by Professor Anthony King’s new book, AI Automation and War: The Rise of the Military Tech Complex. King argues against the prevalent science-fiction narrative that AI will imminently replace human strategic judgment in war. Instead, he offers a nuanced, evidence-based examination of how militaries actually use AI, emphasizing the persistent, indispensable role of human agency, strategic judgment, and organizational dynamics.
Key Discussion Points & Insights
1. Defining AI and Its Present Capabilities
[04:24–05:55]
- King’s Definition:
- AI as a broad field aiming to process vast datasets to produce useful, sometimes unexpected outputs.
- King prefers a “functional, commonsensical” definition focused on data processing and problem-solving, rather than on mimicking human-like intelligence.
Notable Quote:
"It exceeds the actual initial inputting and initial human programming to produce results that are actively useful and unexpected... I prefer a definition that focuses more on the data processing side."
— Professor Anthony King [05:24]
2. Debunking the Automation Hype
[07:01–10:15]
- Prevailing Narrative: A widely held belief, fueled by rapid recent AI progress, holds that AI will soon achieve “general” or even “superintelligence,” automating not just data processing but high-level strategy in war.
- King’s Refutation:
- The idea that warfare and strategy are reducible to algorithmic or inductive reasoning is deeply flawed.
- Current AI is powerful at inductive, correlation-based tasks with suitable data but falls dramatically short of making the kinds of moral, ethical, interpretative, and strategic decisions central to war.
Notable Quote:
"My central argument was, look, AI is already a very significant factor in strategy and warfare, but it's miles away from the vision of automating war, war and strategy."
— Professor Anthony King [09:55]
3. The Limits of Inductive AI in Warfare
[11:49–17:18]
- AI’s Strength:
- Excels in domains with structured data (planning logistics, targeting with good intelligence, cyber ops).
- Inherent Constraints:
- Warfare and strategy often deal with poorly-structured, ambiguous, or novel information.
- Many critical military judgments — defining mission objectives, ethical deliberations — simply cannot be adequately addressed by inductive statistical systems.
Notable Quote:
"The notion that strategy and war is reducible to pure statistical probability, it's simply false... There are constitutive actions of war itself that are definitional, and not something a second-generation, even the most sophisticated, generative AI model can give to a human group."
— Professor Anthony King [13:41]
4. Real-World Military Applications of AI
[17:18–26:32]
A. British Army Route Planning ("Military Waze")
- Simple machine-learning application to automate route planning — a previously slow, manual staff task.
- Key ingredients: bounded problem, ample quality data, and clear solution criteria.
- Resulted in rapid, repeatable logistical planning — but not broader decision-making.
B. AI Targeting in Conflict
- Gaza: Israel’s "Lavender" system reportedly used big-data analysis to generate targeting lists (e.g., as many as 37,000 suspected Hamas operatives), aggregating multiple intelligence sources.
- Ukraine: The U.S. and its allies use similar systems for live, real-time targeting and intelligence.
- Critical Point:
- AI provides actionable information, but humans in command set the objectives, rules of engagement, and thresholds for action.
Notable Quote:
"Lavender had not defined the Gaza War, hadn't defined the operation. Netanyahu, his cabinet, senior generals in the IDF defined what that operation was. Lavender came in as an effective targeting system to generate targets for them."
— Professor Anthony King [25:47]
5. The Organizational Revolution: Military-Tech Complex
[26:32–35:06]
- Deep Partnership Forming:
- Modern militaries lack the programming talent, data infrastructure, and computing power to build these systems alone; they depend on tech giants and defense-tech companies for AI innovation and maintenance (e.g., Palantir in Ukraine, SpaceX’s Starlink).
- Historical Shift:
- Not just buying weapons; now requires true co-development and ongoing integration with private-sector expertise.
- Raises new political, operational, and organizational challenges.
Notable Quote:
"A fusion of the military and the tech, a deep partnership... goes into military operations themselves ... It's highly unusual. It's novel historically... it's utterly fascinating, but also very important and actually potentially quite dangerous politically."
— Professor Anthony King [31:26]
6. Talent, Market Forces, and Strategic Vulnerabilities
[33:42–40:23]
- Concentration in Silicon Valley:
- AI expertise, compute power, data, and talent are monopolized by a handful of U.S. tech primes and specialized defense startups.
- Public institutions and militaries can't retain top talent; the private sector dominates.
- Unlikely Reversal:
- Rather than fragmenting or democratizing AI capability, consolidation is accelerating.
- This organizational asymmetry could have significant national and international security implications, contrasting sharply with China’s state-directed model.
Notable Quote:
"I really don't see a rebalancing of the sector. On the contrary, I think that the monopolies that Silicon Valley have had since the digital revolution... I just don't see how you can, how that is going to be reversed."
— Professor Anthony King [37:12]
7. The Geopolitical Dimension: U.S. and China’s Competing Civil-Military Models
[40:23–48:10]
- Silicon Valley’s Shifting Identity:
- Historically libertarian and globally oriented, big tech has strategically realigned with U.S. national interests since 2018, especially as tensions with China escalated.
- Now, the Silicon Valley–Pentagon alliance mirrors a more national security–driven approach, distinct from China's state-directed civil-military fusion.
- Competitive Dynamics:
- U.S. model may foster more innovation and flexibility; China benefits from greater organizational control.
Notable Quote:
"What I'd suggest is ... Silicon Valley has aligned itself very closely with US national interests...and that will continue and deepen."
— Professor Anthony King [44:07]
8. The Hazards of "Datafication" and Automation Bias
[48:10–55:55]
- Data Architectures as Lenses:
- Systems like Palantir craft the very maps through which militaries “see” and act on the world.
- This may enhance clarity but can also flatten complexity, encourage over-confidence, and produce “operational bubbles.”
- Enduring Human Judgment:
- No matter how curated or sophisticated the data, crucial interpretative, imaginative, and strategic decisions remain irreducible to algorithms.
- Automation Bias:
- Crisis decision-making carries the risk that AI-generated outputs will be over-trusted, turning expert recommendations into de facto decisions.
Notable Quotes:
"Are the maps they're making accurate or are they just illusions on which the basis of terrible decisions will be made?"
— Professor Anthony King [50:33]
"Even the best AI programs will not have obvious answers to everything. Their application will be much tighter... Many AI programs, I would equate them with a map from the early modern period."
— Professor Anthony King [53:25]
9. AI’s Cultural and Moral Impact on Military Institutions
[55:55–64:01]
- Technology as “Congealed Labor”:
- New tech embeds the labor and expertise of many humans, enabling new capabilities but not wholly determining organizational behavior or culture.
- Military Culture and the Enemy’s Role:
- Adversarial environments ensure that technology generates new questions and complexity rather than supplanting human decision-making.
- Ethical and operational decisions remain uniquely “human,” regardless of how data-driven or technologically mediated the environment becomes.
Notable Quote:
"Every piece of technology we get right now, we can do something faster, better, quicker, right? What other 10 things can we now do? And if you look at the history of military history, that is precisely what happens."
— Professor Anthony King [60:10]
10. Final Reflection
[64:01–64:49]
- The enduring insight is that even “the most technologically advanced conflict at our present moment” has produced the same “slow, begrudging type of combat” as before. Adversaries continue to match each other’s tools and tempo, keeping human command and adaptation central, not obsolete.
Timestamps for Key Segments
- AI Definition & Hype vs. Reality: [04:24–10:15]
- Technical Limits & Capabilities in Warfare: [11:49–17:18]
- British Army & Gaza/Ukraine Case Studies: [18:22–26:32]
- Military-Tech Organizational Shift: [27:59–35:06]
- Market/Talent Dynamics & Strategic Implications: [33:42–40:23]
- Geopolitical Reorientation & China–U.S. Comparison: [40:23–48:10]
- Automation Bias & Datafication Risks: [48:10–55:55]
- AI’s Cultural/Moral Impact on the Military: [55:55–64:01]
Memorable Moments & Quotes
- “It's miles away from the vision of automating war, war and strategy.” — Prof. Anthony King [09:55]
- “Lavender had not defined the Gaza War... Netanyahu, his cabinet, senior generals in the IDF defined what that operation was. Lavender came in as an effective targeting system to generate targets for them.” — Prof. Anthony King [25:47]
- “A fusion of the military and the tech, a deep partnership...which is highly unusual. It's novel historically...but also very important and actually potentially quite dangerous politically.” — Prof. Anthony King [31:26]
- “Are the maps they're making accurate or are they just illusions on which the basis of terrible decisions will be made?” — Prof. Anthony King [50:33]
- “Every piece of technology we get right now, we can do something faster, better, quicker, right? What other 10 things can we now do?” — Prof. Anthony King [60:10]
Takeaway
King’s argument is a timely, sobering recalibration of the role of AI in war. Rather than automation of strategy, what we are witnessing is a profound transformation of organizations, operational processes, and civil-military relations. Far from irrelevant, human agency, ethical deliberation, and institutional cultures remain at the heart of military affairs in the age of AI — and may be more vital than ever.
Book:
AI Automation and War: The Rise of the Military Tech Complex — Professor Anthony King
