Uncanny Valley | WIRED
Episode Title: Iran Strikes in the AI Era; Prediction Markets Ethics; Paramount Beats Netflix
Release Date: March 5, 2026
Hosts: Zoe Schiffer, Brian Barrett, Leah Fikar
Episode Overview
This episode dives into three urgently relevant tech and culture stories: the rapid escalation of military conflict involving Iran and the unprecedented infusion of AI into defense, the ethical quagmire of modern prediction markets (from Middle East geopolitics to Silicon Valley personnel shuffles), and the media industry-shaking Paramount–Warner Bros. mega-merger. The WIRED team analyzes the interplay between tech, policy, business, and global events, with their signature blend of critical insight and newsroom banter.
Key Discussion Points & Insights
1. Iran Strikes in the AI Era
Escalation and Disinformation (04:05–10:03)
- Leah Fikar recaps a harrowing week: after coordinated US-Israeli strikes on Iran, the Supreme Leader is killed, Iran retaliates, and chaos spills across the Gulf.
- Leah and Brian detail a parallel war—disinformation—supercharged by degraded safeguards on X (Twitter). WIRED’s reporting found AI-generated images, video game footage, and blatant location mistakes circulating unchecked.
- Quote (Leah Fikar, 04:40): “I was sort of stunned how quickly disinformation became the center of this conflict… AI generated images to video game scenes being passed off as real footage to countries getting mistaken for each other.”
- Zoe Schiffer contextualizes: X’s hollowed-out moderation team, emphasis on virality, and reliance on “community notes” are no match for the velocity of wartime misinformation.
- Quote (Zoe Schiffer, 05:59): “This is the culmination of years of product and policy decisions. It’s what happens when you make the platform hostile to journalists… rely on community notes… pay people for traffic...”
- Brian flags the power of blue-check monetization and how even politicians fall for and spread falsehoods, affecting public opinion—especially in an unauthorized war that could easily escalate further (07:06).
- The economic fallout spans oil, agriculture, and data infrastructure: farmers brace for fertilizer shocks, while influencers are already joking about “World War III” on social media, showing how rapidly the crisis has infiltrated public consciousness.
AI Companies Entangle With the Pentagon (10:03–17:51)
- Zoe highlights the cascading consequences of rushed AI-defense deals: OpenAI strikes a Pentagon contract, while Anthropic negotiates for guardrails (no surveillance of Americans, no fully autonomous weapons) that the DoD resists.
- Quote (Zoe Schiffer, 10:03): “OpenAI struck a deal with the Department of Defense… Anthropic… wanted a couple of conditions, including a ban on surveillance of American citizens and a ban on using its technology to build fully autonomous weapons. The DoD was not a fan...”
- Sam Altman’s impromptu X AMA (“ask me anything”) addressing backlash is called out as tone-deaf, reinforcing perceptions that OpenAI’s ideals are at odds with its business moves.
- Quote (Zoe Schiffer, 11:28): “Anthropic continues to position itself as the good, the level-headed AI firm. And OpenAI continues to kind of blunder in these moments… it comes out looking a little sloppier and a little less like it has a firm set of values...”
- The panel notes that Anthropic, despite its “principled” stance, is still involved: its products remain in use for Pentagon contracts during a “six-month phase-out” (12:17–12:38).
- OpenAI’s shifting stance from blanket military-use bans to nuanced contract terms is causing internal strife and outward reputational challenges, fueling a tech talent war.
- On recruitment impact: “There are a ton of researchers… they do not want anything to do with military use… They just don’t want that. And I think that actually matters.” (Zoe Schiffer, 14:30)
- The Pentagon’s power dynamics and entitlement to American-made tech are discussed, with reference to the messy, public battles waged on social media (17:19).
2. Prediction Markets: Ethics, Insider Trading, and Political Ties
Gamification in Times of Crisis (18:34–21:56)
- Brian draws attention to how war and regime change in Iran have become not just a geopolitical crisis but a betting opportunity, with millions staked on Polymarket and Kalshi over events like regime collapse or leader deaths.
- Quote (Brian Barrett, 19:03): “Right now, one of the top bets on Polymarket is: will the Iran regime fall by June 30? Total bets around $7 million in that market alone.”
- The hosts viscerally push back against the commodification of tragedy.
- Quote (Leah Fikar, 19:20): “That’s so upsetting, Brian. These are people’s lives.”
- Brian Barrett (20:04): “They are invariably betting on whether people will die, just finding cute ways around it and then having a hard time resolving these markets. That’s a problem… it is outrageous.”
- OpenAI has reportedly fired an employee for using internal info in prediction markets; similar cases (like Google Whale) indicate a growing nexus of insider trading risks as market platforms explode in influence (21:56).
- Regulatory response is tepid and inconsistent; only minor enforcement actions have occurred (Kalshi suspensions, 22:26), with the government largely ceding the field to platforms themselves.
Uncomfortable Political Entanglements (22:26–23:48)
- The Trump family’s strong ties to prediction markets—investment, leadership advisory roles, and the impending launch of Truth Predict—are outlined. The risk of policy being swayed by financial stakes (or appearance thereof) is underscored.
- Quote (Leah Fikar, 22:41): “It really reminds me of their investments in crypto world too... They are, I wouldn’t go out on a limb and say like they’re personally gamifying everything and deaths in Iran, but like, they’re benefiting from it...”
- The possibility that people with influence over global events could cash in on their own policies is flagged as a new, shocking ethical threat.
3. Paramount Beats Netflix: Historic Media Merger and the Trump Effect
The Deal, Consolidation and Industry Fear (24:16–26:56)
- Paramount’s $110 billion acquisition of Warner Brothers, beating out Netflix, gives the Ellisons (with deep Trump ties) stewardship over CBS, CNN, HBO, DC Comics, Harry Potter, Star Trek, and a vast cable empire.
- Quote (Brian Barrett, 28:27): “CBS and CNN… HBO, DC Comics, Harry Potter, Star Trek, Looney Tunes, two dozen cable networks that your mom watches.”
- The hosts relay panic from newsrooms about editorial purges and job redundancies as overlapping outlets are consolidated.
- Quote (Leah Fikar, 25:10): “People are talking… are the Ellisons going to be canning anyone who’s ever spoken out against Trump on CNN? Jake Tapper, are you out of here, buddy?”
- Netflix faced bureaucratic and financial hurdles of its own, including Trump-friendly regulatory expectations and pressure on leadership; ultimately, it walked away to avoid a messy political crossfire.
- The team links this story to a broader pattern: Trump-aligned figures gaining ever more control over media and public narrative, with unpredictable effects on journalism and democracy.
4. FutureCast: Predictions for Tech & Society
New Segment Begins at 31:20
Zoe Schiffer: Open Models Threaten Big AI Labs (31:20–33:45)
- Predicts that open-source models—leveraging distilled knowledge from Anthropic/OpenAI systems—could disrupt or “existentially threaten” proprietary AI labs, especially if technical and legal safeguards fail.
- Quote (Zoe Schiffer, 32:41): “Their open models are getting really, really advanced really quickly. And, you know, if you can essentially access Claude without paying Anthropic…”
- Distinguishes between Meta’s failed open strategy (building from scratch) and what’s happening now (copying existing frontier models for a fraction of the cost).
Brian Barrett: A Red-Pilled, Unregulated Prediction Market (33:51–34:40)
- Predicts a notorious offshore prediction market will soon allow bets on violent crimes—including actions by the bettors themselves—with no regulatory recourse.
- Quote (Brian Barrett, 33:51): “I think there will be a well known, highly visible, red pilled prediction market... that will let you bet on violent crimes and let you bet on yourself doing violent crimes. And no one will regulate it because it will be in some random island somewhere...”
- The hosts agree: this is movie script material (34:31).
Leah Fikar: Wartime Politics Will Bolster Trump & the GOP (34:40–36:56)
- Argues that the ongoing Middle East crisis will provide political cover for Trump and Republicans, following a long tradition of incumbents benefiting during war—shifting election dynamics regardless of public opposition to war.
- Quote (Leah Fikar, 36:05): “My future cast is that this is going to be used to keep Trump in power, to keep Republicans in power for a little bit longer.”
- Warns it could serve as a pretext for voting restrictions via national emergency powers.
Memorable Quotes & Moments
- “The blog writes itself.” (Leah Fikar, 05:39) — On endless disinfo cycles.
- “Grotesque. That’s the word.” (Leah Fikar, 20:41) — On betting millions over the death of Iran’s Supreme Leader.
- “We’re not talking about people quitting and giving everything up. They’ve made generational wealth. They can now quit their current job... and then go to another job that will also pay them millions and millions of dollars.” (Zoe Schiffer, 15:44) — On the rarefied talent climate in AI.
- “They are, I wouldn’t go out on a limb and say like they’re personally gamifying everything and deaths in Iran, but like, they’re benefiting from it, they’re profiting off of it…” (Leah Fikar, 22:41) — On Trump family’s role in prediction markets.
Useful Timestamps for Key Segments
- 04:05 – Iran conflict escalation recap & the sudden flood of AI-fueled misinformation
- 10:03 – AI industry entangles with Pentagon; OpenAI, Anthropic, and the ethics of "autonomous weapons" deals
- 14:30 – Recruitment and retention crisis at AI labs over defense contracts
- 18:34 – Prediction markets: war as a betting opportunity and the ethics of wagering on real-world tragedies
- 21:56 – Silicon Valley insider trading scandals and corporate enforcement gaps
- 22:41 – Trump family's investments in prediction markets and looming regulatory problems
- 24:16 – Paramount’s takeover of Warner Brothers; media consolidation and newsroom anxiety
- 31:20 – FutureCast predictions: open AI models, prediction markets gone wrong, “wartime rally” for Trump
Tone and Style
The discussion is incisive but conversational, pairing newsroom snark (“the blog writes itself!”) with a sense of foreboding about tech’s real-world impacts. The hosts are skeptical, occasionally exasperated, and not afraid to take political stances—especially on issues of disinformation, war profiteering, and media consolidation.
For Further Reading
Detailed WIRED coverage of these topics, as referenced throughout the episode, is recommended for deeper context—especially reporting by David Gilbert, Molly Taft, and Kate Knibbs.
