
What does it mean if humans no longer pull the trigger?
Farnoosh Tarabi
Hi, this is Farnoosh Tarabi from So Money with Farnoosh Tarabi, and today I want to talk to you about Boost Mobile. Quick money tip: stop paying a carrier tax. If your phone bill feels trapped in a pricey plan, this is your sign to unlock savings. Boost Mobile helps you reset your spending. With the $25 Unlimited Forever plan, you can bring your own phone, pay $25, and get unlimited wireless forever. And that simple switch can unlock up to $600 in savings a year. That's money you could put towards paying down debt, investing, or something that actually brings you joy. Those savings are based on the average annual single-line payment of AT&T, Verizon, and T-Mobile customers, compared to 12 months on the Boost Mobile Unlimited plan as of January 2026. For full offer details, visit boostmobile.com.
Vanta Representative
Security and compliance done wrong is a giant headache. Security and compliance done right? That's Vanta. Vanta helps you earn trust and speed up growth, no spreadsheets required. For startups low on time and resources, Vanta becomes your security hire, using AI and automation to get you compliant fast and unblock big deals. For enterprises, Vanta is your AI-powered hub for compliance and risk, bringing together data from across your businesses and automating workflows so you can prove trust at any moment. Vanta scales with you at every stage. That's why top companies, from startups like Cursor to enterprises like Snowflake, choose Vanta. Do security and compliance right: get started today at vanta.com/tedaudio.
Lizzie O'Leary
On Wednesday evening, the news began to
trickle out that an ongoing military investigation found that the United States was responsible for the missile strike on an Iranian school that killed at least 175 people, most of them little girls.
Steve Feldstein
It's one of the worst instances of collateral civilian damage caused by the US Military in the last couple decades.
Lizzie O'Leary
I called up Steve Feldstein, who served in the State Department in the Obama administration and studies technology and national security at the Carnegie Endowment for International Peace.
Steve Feldstein
It reminded me immediately of the strike during the NATO engagement in the former Yugoslavia against the Chinese Embassy back in the early 2000s. And so my first question when I heard about this was, one, was the US responsible for this attack? Two, if so, what series of errors occurred so that a school was targeted and destroyed as opposed to a legitimate military installation? And third, given what we are talking about today, was this error a function of artificial intelligence or some other related software program that had gone awry?
Lizzie O'Leary
Do we have any idea whether or not artificial intelligence was used in the targeting or execution of this strike?
Steve Feldstein
All we know so far is the reporting that's come out from a few news outlets, particularly the New York Times. And what has been indicated there so far is that most likely this was a result of human error, a result of outdated information supplied by the Defense Intelligence Agency that mislabeled the school as part of Enabled Base, which in fact it was 10 years prior. Based on what I've heard, that would indicate to me that this is not an AI issue. In fact, if anything, I would have guessed that AI could have helped in terms of vetting, with updated satellite imagery, even public satellite imagery, the current ongoing usage of this school, as opposed to what appeared to be grossly outdated information claiming that this was still something used by the Iranians to conduct military operations.
Lizzie O'Leary
One of the reasons I wanted to
talk to Steve is that he studies
how tech like AI and drones are used in warfare.
I have seen a lot of people
call this conflict in Iran the AI war. Do you think that's a fair assessment?
Steve Feldstein
I think it's a half-fair assessment.
And I think that this school strike
exemplifies the gray area that we're in. Because on the one hand, it is true that the tempo of strikes that the US has conducted and is continuing to conduct in this war is only possible due to assistance from artificial intelligence systems, particularly Claude and Maven, which is operated by Palantir.
On the other hand, a lot of the mistakes and a lot of the ground-level decision-making still reside with humans. So to simply call it an automated or AI war, I think, leaps ahead a little too far. I think we're somewhere in this transitional gray period where humans still hold a lot of accountability and oversight, but there is a growing reliance on technology and AI systems to power and enable the targeting for this war.
Lizzie O'Leary
So maybe better to call it a preview of the future of war.
Steve Feldstein
I think that's right. I think every subsequent war we see, certainly those conducted by the US, will see increasing amounts of integration of AI. I think we're just at the front end of where this goes.
Lizzie O'Leary
Today on the show: the Iran war and the future of automated combat. I'm Lizzie O'Leary, and you're listening to What Next: TBD, a show about technology, power, and how the future will be determined. And hey, if you are listening to us on Spotify, especially if you are listening on a playlist, do me a favor and make sure you follow the What Next feed. Spotify is getting rid of some playlists, and we want to make sure that you stick around. This episode is brought to you by Duck AI, a new product from DuckDuckGo. It's 2026 and the news is full of AI. It can be a little overwhelming, especially because it's easy to wonder if what you are asking a chatbot is actually secure. That's why DuckDuckGo built Duck AI. Duck AI is designed to let you chat privately with the same AIs you might already be using. Plus, it's completely free.
No signups, no subscriptions.
Duck AI promises to protect your info from hackers, scammers, and data-hungry companies. And it's from DuckDuckGo, the company that's been protecting your privacy, not collecting your data, for nearly two decades. If you want to use AI but you're concerned about privacy, visit Duck AI TBD today. That's Duck AI TBD, from DuckDuckGo, where AI is always optional and designed to be private.
Tommy John Advertiser
Guys, it's no use putting it off. The best time for an underwear refresh is now. Tommy John underwear is designed for a perfect fit that stays put all day. There's zero chafe, thanks to four times more stretch than competing brands, and their innovative horizontal Quick Draw fly is a game changer. With over 30 million pairs sold, there are thousands of men out there more comfortable than you. Don't settle for less. Go to tommyjohn.com today for 30% off sitewide with code COMFORT. That's tommyjohn.com, code COMFORT. Tommy John: comfort perfected.
Monday.com Advertiser
Close your eyes. Listen to monday.com. Feel the sensation of an AI work platform so flexible and intuitive it feels like it was built just for you. Now open your eyes, go to monday.com to start for free, and finally, breathe.
Lizzie O'Leary
Technologically speaking, the war in Iran builds on a lot of lessons learned from the war in Ukraine, both in terms of AI and drones. I asked Steve to start with drones and walk me through the universe of what we're talking about when we say drone warfare.
Steve Feldstein
There's a whole range of different drones, and essentially a drone is an unmanned aircraft. Even now there are also unmanned submarines and unmanned ground vehicles, but let's just focus on the aircraft side for a second. Within this range, what we saw in prior years, under the Obama administration and earlier, when we were looking at strikes in Somalia against Al-Shabaab and so forth, were largely high-altitude or medium-altitude, long-range unmanned aircraft like Predators or Reapers. They were expensive. They had very expensive munitions on them, laser-guided missiles and so forth. And only a handful of countries were able to acquire them. What we've then seen is sort of a democratization of this technology.
So one of the leading actors in
this space in the last five years has been Turkey.
And so they have put out a version of a drone called the TB2, which is similar to the Reaper drone. It is also medium-altitude, and it also comes with laser-guided munitions. But it's cheaper: something like $1 million for one, $10 million for a system. So it's still expensive, but many more countries have them, operators like the Sudanese Air Force or Nigeria. I've done some research on this; something like 30 or 40 different countries have access to TB2s. So that's another level. Now we're seeing the dumbing down of drones even further. These one-time-use, one-way drones, like the Shahed (which has become the Geran drone in Russia), are smaller, cheaper, and mass-produced.
So in Iran, the Shahed drones are stockpiled in numbers of maybe 80,000 or so.
Lizzie O'Leary
How big are we talking about here?
Steve Feldstein
I mean, you can fit one in a truck bed. Now look, there are even smaller drones than that, which a lot of people are talking about, on the battle lines in Ukraine. People refer to those as first-person-view drones. Those are essentially repurposed consumer hobbyist drones fitted with explosives, where you put on goggles and look through them. They cost anywhere between $1,000 and $3,000, and they surveil and patrol the front lines. And those are like quadcopters, right? So that's the very lowest-level, cheapest, most tactical tier, and those are prevalent everywhere; something like 4.5 or 5 million were manufactured and produced by Ukraine last year. So that's your hierarchy of drones.
Lizzie O'Leary
This is the first conflict where the US is using these one way attack drones in combat. People love to talk about it. But from your point of view, how significant is that?
Steve Feldstein
I think it's a really interesting development, and I think it is for a couple of reasons. One, it shows you that the process of diffusion goes backwards and forwards, up and down the scale.
Lizzie O'Leary
Because they're derived from Iranian drones originally, right?
Steve Feldstein
That's right.
I mean, apparently it was an Iranian drone captured on the Ukrainian battlefield that made its way over to the US and was then reverse engineered, and they used, I think, a few different technological platforms. I think they've also based it a bit on a loitering drone from Israel as well. But essentially the idea and the concept came from a smaller, less advanced military and worked its way up. And the US military said, wait a second, this is something that is cheap and cost-effective, and we need to adopt it, because our missiles are too expensive, our drones are too expensive, and if we're going to fight a long war, how are we going to match these cheap capabilities of our adversary? So it's a really interesting reversal of how innovation has typically been thought of.
Lizzie O'Leary
Where does precision fit into that? Because obviously one of the arguments in 2013, 2014, and before was that drones are more precise, that there would be fewer American casualties, that there would be fewer civilian casualties. And if you look at the sweep of conflicts, right, that is definitely true vis-a-vis the bombing of Dresden, let's say, or the Blitz in London. But I wonder, when you look at these kinds of munitions now, how much more precise are they, and what does AI do to the precision question?
Steve Feldstein
Well, I think they're definitely more precise. There's no question about that. One of the questions that you do have to ask yourself, though, is how does volume change things? Even if you have 99% precision and you're shooting 100 missiles, versus 99.5% precision and you're shooting 1,000 or 10,000 missiles, obviously there's a delta there that you're never going to completely fill. And that delta will result in really bad things: collateral damage, civilian harm. And so because you're shooting so many more missiles and using so many more drones, even if they are more precise, you still can't get around this problem of civilian casualties and other types of problems. So I think that's one aspect that's important to bear in mind. The second aspect is that, as we mentioned, precision is a very blanket term, but it doesn't mean the same thing across different types of weapons. A precise cruise missile or a precise laser-guided munition is very different from a GPS-guided drone, one that can be spoofed, that can be fooled, that isn't fail-safe, that can hit the wrong target and oftentimes does. And frankly, for many of the operators, particularly when you look at countries like Iran and Russia, which blatantly and consistently violate international humanitarian law, that's a very small consideration. Iran didn't design Shaheds by asking, how do we create something that will both conform to the laws of armed conflict and also win us a war? They said, we need to win a war, we need to do really bad things to our adversaries, or frankly to their own people, which we see anyway. And so, rules be damned, we're going to just put this technology ahead. Now, the US military operates in a very different way, and they are bound by the laws of armed conflict, unlike the Russians, unlike the Iranians. But you can see where using those designs can lead to difficult questions and problems on the battlefield.
Lizzie O'Leary
Though we should point out that the Trump administration did scrap a program that was meant to limit civilian casualties.
Steve Feldstein
Yes. Under Secretary Hegseth, they have closed or are phasing out two different centers that evaluate and assess civilian harm and civilian casualties. That's correct. They've done a number of other things as well when it comes to accountability, from firing the leadership of the JAG Corps to slashing the operational testing unit that evaluates the efficacy of new weapons. These are all things that people have looked at from the outside, myself included, and said, well, wait a second, how committed is the Pentagon to accountability if it's willing to take away the structures that had been put in place to ensure those safeguards are met? So that's a fair point.
Lizzie O'Leary
Secretary Hegseth has talked a lot about AI.
How much do we know about AI
use currently in targeting and analysis?
Steve Feldstein
Well, we know a lot, anecdotally. I'm looking at this from the outside, and in comparison to what we've learned from other militaries' use, particularly Israel's and Ukraine's. And some of the things that have been reported, I think, are making it increasingly clear how these systems are used. So, for example, there's the Maven platform, which is integrated and run by Palantir. Within that, we see Claude, Anthropic's AI model, used to help process through reams of collected data. What we're talking about is everything from signals intelligence, cell phone calls, text messages, and social media posts to satellite imagery. You stream all this information together, so you have a gigantic database and pool of relevant information. In the past, you would look at that and say, well, what do I do with that? How do I derive insights? How do I figure out a pattern so I can identify where the most salient targets are, whether individuals or military assets or something else? What Claude has been able to do is help look through millions of bytes of data and try to derive patterns. You put in a query or a search: I'm looking for this type of person. And within a very quick amount of time, it's able to generate, with a reasonable degree of confidence, those suspected whereabouts, depending on what you're looking for.
Lizzie O'Leary
After the break, Anthropic and the Pentagon played a high profile game of chicken. Except Claude is already a part of this war.
Paige from Giggly Squad
This is Paige from Giggly Squad. We all have way too many subscriptions and bills and no good way to manage or track all of them. But now we have Experian. It's the best place to manage your finances, because you can connect all of your accounts in one place, track all your spending, and let Experian do the work of finding ways to save you money. January is the perfect time to get your finances in order; it's the perfect New Year's resolution. Let your big financial friend Experian do the work for you. Get started today with the Experian app.
Lizzie O'Leary
Well, so we've got to talk about the fight between Anthropic and the Pentagon. Anthropic has now sued the Pentagon over its designation as a supply chain risk. And yet at the same time, we know that Anthropic tools are being used in this conflict. And I wonder what this fight shows you about how the Pentagon, at least in this administration, envisions AI working for them.
Steve Feldstein
Well, I mean, I think their vision is that they call the shots, the companies provide the product, and then after that, the companies have little to no say.
Lizzie O'Leary
You're our contractor, you do what we say.
Steve Feldstein
Exactly.
And if there are problems that emerge, that's on the Pentagon to figure out and determine what happens. But it's not on the companies to be able to impose safeguards, to any degree, on how their products are used.
Lizzie O'Leary
I mean, I think what is also interesting here is that now OpenAI has stepped in and said, oh, well, the DoD kind of agreed to these conditions that Anthropic wanted; ours are similar, and we'll go along with any legal use. What do you make of that?
Steve Feldstein
You know, there are a couple of things that come to mind very quickly on the Anthropic thing. I want to also note that there was initially a question Anthropic had when it came to a very limited Venezuelan mission and whether Claude was used for that. Now, all of a sudden, we're in the Iran war, where it is very clear that the use of Claude to at least support lethal operations is prevalent. It's not just a single operation. It's not just something with a limited scope. It's everywhere.
Lizzie O'Leary
This is knitted into their system.
Steve Feldstein
Right. I mean, we're far beyond what that initial concern was. So we're in a whole different world, and that happened essentially over the course of two weeks. I think that's just one thing worth noting: in some ways, we've already moved beyond the scope of the initial Anthropic-Pentagon dispute. Claude is there. It is helping with AI targeting, and it is leading to lethal results.
Lizzie O'Leary
And we're not even talking about the domestic surveillance question. We can leave that for a whole other episode.
Steve Feldstein
Exactly.
And then, when it comes to OpenAI, we still reach the same question, which is, what does "all lawful use" mean? Right now, there are only a few directives in place that help define the scope of appropriate human oversight when it comes to AI-enabled weapons. First of all, that's a loophole. So that can mean almost anything.
Lizzie O'Leary
And traditionally, national security loopholes are enormous.
Steve Feldstein
Exactly.
And then second of all, a DoD directive is not like legislation. It's an internal administrative regulation. So it can change. It's subject to change based on the discretion of the Secretary.
Lizzie O'Leary
Does that mean that the DoD's ultimate goal is to have a fully autonomous weapon without a human in the loop? You know, that's the thing that keeps being one of the big talking points,
but I don't feel like I have a lot of clarity on whether that is where this is headed.
Steve Feldstein
Potentially. Hypothetically, it is headed in that direction.
Lizzie O'Leary
How quickly?
Steve Feldstein
That's a great question. I don't know the answer to that. Maybe five years, maybe less. I think there still is a really big leap from where we are right now, with the semi-autonomous weapons that we have in place, to something that's fully autonomous in the formal sense of the word: something that can undertake independent decision-making on the battlefield with minimal human input. We're not there yet. That is a scary prospect and a Rubicon to cross. But as you see the engineering improve, as you see AI models become more powerful, and we're seeing this everywhere, not just in war, we're seeing increasing abilities of AI models to do things that we thought previously were unimaginable. There's no reason to think that we're not going to get to full autonomy at some point, certainly early next decade.
Lizzie O'Leary
One of the things about drone warfare that has been discussed quite a bit, and the layer of AI, I would say, creates even more distance, is that screen separation, right? A real separation between the operator and whatever is on the other end of that action: humans, buildings, cities decimated. The increasing tech mediation of warfare makes me wonder how much of this is so technologically mediated that it loses that humanity.
Steve Feldstein
It's a fair question. When you're looking through a screen and fighting a war remotely, at a remove from the battle lines, it certainly changes your perception of what you're doing. That being said, the studies that I've seen on the battlefield effect on drone operators show a pretty clear amount of trauma that operators accrue from shooting weapons and engaging in warfare. That may not be exactly equivalent to what someone is doing kinetically in the trenches, but it is pretty profound. So it's not something where I think humans, as long as they're in a role where they're guiding drones, are removed completely from the ravages and effects of war. Psychologically, there seems to be a lot of damage and a lot of harm that many soldiers end up accruing over time from undertaking these tasks. But I think it's a question worth watching, especially, as you and I have talked about, as you see a greater level of AI and autonomy incorporated. How does that change things? How does that change the calculus?
Lizzie O'Leary
One of the new features of this war is the emergence of AI-created dashboards, online tools that civilians make to watch the war in something like real time. They have maps, video chat features, even betting markets. At the same time, the White House is also releasing propaganda videos that cut together footage of real military strikes with movie clips. It's a layer of gamification that could not exist without AI.
Steve Feldstein
I mean, it is really wild to see that, and pretty jarring, I would say. I've seen some of the videos coming out. In fact, just this morning, I was looking at a video that came out from the White House that interspersed clips of Braveheart and other movies with live targeting of, I think, Iranian mining vessels or other naval ships.
And so you are seeing this confluence of the artificial, the real, entertainment, gamification. And look, I think most of us, many of us, are really disturbed by that. If nothing else, it's undercutting the seriousness and the moral valence of war and turning it into entertainment, something meant to generate clicks and eyeballs. And ultimately, what we're talking about is death and destruction and all sorts of horrible things that emerge. I think it's important for us not to forget that, lest we continue engaging in more and more wars, which, unfortunately, we seem as a country to be doing with greater frequency. I mean, that's the whole issue, right?
Lizzie O'Leary
And that seems to be the thing about both AI and drones. I hear you saying this repeatedly, and I've heard other people say it: they make it easier to fight a longer war, one that just goes on and on.
Steve Feldstein
Yeah, especially when you're able to do it at a remove and where the cost to your own military, at least in terms of casualties, is comparatively low. I do think it's instructive to look at what happens. There are many different types of war. We're fighting essentially a remote air war right now, which is designed to insulate your army, at least the one engaging in it, not Iran's army but the US military and the Israeli military, from a large number of casualties. But if you flip across and look at the ravages of war as it relates to Ukraine and Russia, at the number of people who are wounded or dead in the hundreds of thousands, and at just how much of a hellscape the actual battle lines are, patrolled by drones, where you have small groups of soldiers trying to infiltrate from one side to another, who are hunted down by machines and shot, who have 20 seconds or so to find cover before an exploding munition heads towards them. It is some of the worst stuff I've ever encountered. And so I think air wars definitionally, inherently, insulate the public from seeing that.
Lizzie O'Leary
Though this war is very unpopular.
Steve Feldstein
It's very unpopular because no one can really see a clear argument or line of reasoning for why we're fighting it. And it's leading to many spillover economic effects as well, especially higher oil prices and other sorts of nasty things.
Lizzie O'Leary
I want to end on some philosophical questions. One of the comparisons that people use when talking about autonomous weaponry is autonomous cars: that they are safer, that people are a risk, and that you're taking people out of the decision chain. And I guess I just... I have trouble with that. But I'm also curious, from what you have studied, should we want a fully autonomous weapon?
Steve Feldstein
You know, I find that problematic. I think the more that we remove human decision-making from something so inherently critical, the more that leads to lots of hazards down the line. I think it's important that we retain human accountability at the end of the day, that we don't delegate or outsource one of the most fundamental aspects of what we do as humans to a machine. If you're going to engage in a lethal operation, at the very least you ought to know what you're doing and have a human say: I signed off on that, it follows the laws of armed conflict, and here's why. But to say, well, I sent it off to the machine, and I don't know if the data that it used for targeting is good or not, and I don't have a real way to scrutinize that black box, and in the fog of war the machines are going so quickly that I can't really monitor that anyway, so I'm going to trust that the data shows these weapons are generally pretty good? That, to me, is a nightmare scenario.
Lizzie O'Leary
Steve Feldstein, thank you so much for talking with me.
Steve Feldstein
Thanks for having me.
Lizzie O'Leary
Steve Feldstein is a senior fellow at the Carnegie Endowment for International Peace in the Democracy, Conflict, and Governance Program. All right, that is it for our show today. What Next: TBD is produced by Patrick Ford. Our show is edited by Evan Campbell. Paige Osborne is the senior supervising producer for What Next and What Next: TBD. Mia Lobel is the executive producer here at Slate. And TBD is part of the larger What Next family. We will be back on Sunday. I'm Lizzie O'Leary. Thanks for listening.
American Red Cross Representative
Blood donation is now more inclusive. More people are able to donate blood with the American Red Cross through FDA guidelines that eliminate eligibility questions based on sexual orientation. The Red Cross celebrates this historic change and welcomes those who may be newly eligible to donate blood. There's a place for everyone in the mission of the Red Cross. The Red Cross is committed to achieving an inclusive blood donation process that treats all potential donors with equality and respect while maintaining the safety of the blood supply. Join us and help save lives. To learn more and make your appointment to donate blood, visit redcrossblood.org/LGBTQ. That's redcrossblood.org/LGBTQ.
Bill Advertiser
Over 90 of the top 100 US accounting firms trust Bill to handle bill pay processes. Why? Because our tools are built on over a trillion dollars of secure payments. We're not just moving money, we're powering financial workflows for half a million customers. That's a level of expertise you just can't fake. Ready to talk with an expert? Visit bill.com/proven to get started and grab a $250 gift card as a thank-you. Terms and conditions apply. See offer page for details.
This episode dives deep into how artificial intelligence (AI) and rapidly evolving drone technology are transforming modern warfare, with a focus on the U.S. involvement in the Iran conflict. Host Lizzie O’Leary interviews national security and technology expert Steve Feldstein to explore recent events—including a tragic U.S. missile strike—and how new technological capabilities, accountability, and the laws of war are struggling to keep pace with the realities of “AI wars.”
On Outdated Intelligence and Missed Opportunities for AI:
“If anything, I would have guessed that AI could have helped in terms of vetting with updated satellite imagery … grossly outdated information, that this was still something used to conduct military operations by the Iranians.”
— Steve Feldstein ([03:14])
On Gamification of War:
“We are seeing this confluence of the artificial, the real, entertainment, gamification. … It’s sort of undercutting the seriousness and moral valence of war and turning it into entertainment.”
— Steve Feldstein ([25:54])
On the Dangers of Outsourcing Lethal Decisions:
“To say, well, I sent it off to the machine and I don’t know if the data that it used for targeting is good or not … That, to me, is a nightmare scenario.”
— Steve Feldstein ([29:37])
| Topic | Timestamp |
|-------|-----------|
| Overview of school strike & AI speculation | 01:43–03:14 |
| Are we in an “AI war?” | 04:10–05:07 |
| Drones: tech evolution and global proliferation | 08:14–10:49 |
| U.S. adoption of low-cost, “one-way” drones | 11:03–12:04 |
| Precision, civilian casualties, ethical divides | 12:13–14:37 |
| Pentagon accountability and civilian harm programs | 14:37–15:26 |
| AI/Palantir/Claude in targeting | 15:30–17:10 |
| Anthropic vs. Pentagon legal clash, regulation loopholes | 18:51–22:07 |
| Is a fully autonomous weapon inevitable? | 22:07–23:02 |
| Psychological effects on drone operators | 23:29–24:59 |
| AI-powered civilian dashboards & gamification | 25:01–26:22 |
| AI/drones making longer wars easier | 26:40–28:06 |
| Should we want autonomous weapons? | 28:26–30:14 |
The episode underscores the uneasy and rapidly shifting ground at the intersection of warfare, technology, and accountability. AI and drones are enabling new forms of warfare—more data-driven, distant, and potentially endless—that challenge old ethical assumptions and human oversight. While AI offers potentially life-saving precision and intelligence, the risks of dehumanization, legal ambiguity, and technological “distance” from the battlefield grow every year. As Feldstein warns, the drive toward full autonomy in lethal force brings us to a “nightmare scenario” unless we retain strong human accountability.