The AI Policy Podcast – Episode Summary
Episode: “Is China Done with Nvidia’s AI Chips?”
Host: Center for Strategic and International Studies
Date: September 24, 2025
Featured Guest: Gregory C. Allen
Topics: AI as the new “Manhattan Project,” China’s Nvidia chip ban, Anthropic’s copyright settlement, massive investments by chipmakers
Episode Overview
This episode delves into the explosive growth of AI investment, explores China’s evolving stance on Nvidia AI chips, examines a landmark copyright settlement affecting AI companies, and discusses high-stakes strategic investments shaking up the global semiconductor industry. Gregory C. Allen, a senior adviser at CSIS, provides a deep-dive analysis into these issues, emphasizing their implications for AI policy, competition, and global geopolitics.
Key Discussion Points and Insights
1. AI: The Manhattan Project of Our Time
- Theme: AI development spending rivals and exceeds the scale of historical government megaprojects.
- Motivation: A recent tweet likened OpenAI’s planned $20 billion 2025 training spend to the WWII Manhattan Project.
- Breakdown (02:00–07:00):
- The real Manhattan Project (1940s) cost $2B (~$27B today).
- Quote [03:26]:
“OpenAI's expected training costs for next year are in the same ballpark as the inflation adjusted cost of the Manhattan Project. In dollar terms, that's crazy already, just as a starting point.” – Gregory C. Allen
- The combined annual capex for the top five US AI/cloud firms in 2025 will hit $350B—exceeding the Manhattan Project even as a share of GDP over a four-year horizon.
- Private sector investment dwarfs previous government efforts, driving both stock market returns and transformation of infrastructure.
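The comparison above can be checked with back-of-the-envelope arithmetic. A minimal sketch, using only the dollar figures cited in the episode:

```python
# Rough comparison of AI spending vs. the Manhattan Project,
# using the figures cited in the episode.

manhattan_1940s = 2e9          # ~$2B in 1940s dollars
manhattan_today = 27e9         # ~$27B inflation-adjusted
openai_2025_training = 20e9    # OpenAI's reported planned 2025 training spend
big5_capex_2025 = 350e9        # combined 2025 capex, top five US AI/cloud firms

# OpenAI's single-year training budget alone is roughly three quarters
# of the entire multi-year Manhattan Project, inflation-adjusted.
ratio = openai_2025_training / manhattan_today
print(f"OpenAI 2025 training vs. Manhattan Project: {ratio:.0%}")

# The combined big-five capex exceeds it many times over in one year.
print(f"Big-five 2025 capex vs. Manhattan Project: {big5_capex_2025 / manhattan_today:.1f}x")
```

This is why Allen calls the dollar comparison "crazy already, just as a starting point": the private-sector totals dwarf the benchmark before any GDP-share adjustment.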
2. The Race for AI Compute: Facility Scale and Superclusters
- Focus: Explosive build-out of massive data centers, typified by Elon Musk’s xAI and the “Colossus” series.
- Key Metrics (05:02–13:19):
- Historical perspective: OpenAI’s GPT-3 trained with 10,000 Nvidia V100 chips; GPT-3.5 used ~20,000 A100s.
- Colossus 1 (xAI, 2024–2025): Started at 100,000 H100s, upgraded to 200,000+ H100/B100/B200s.
- Each chip generation leapfrogs performance (e.g., H100 is 32x more powerful than V100).
- Quote [09:38]:
“We’re talking like hundreds and hundreds and hundreds of times more aggregate computing power coming up on like a thousand times more aggregate computing power...” – Gregory C. Allen
- Colossus 2 (2025):
- Built on 1 million sq ft, with 200 MW of cooling capacity supporting up to ~110,000 GB200 Blackwell chips.
- Will require gigawatt-scale power—approaching “Hoover Dam” levels.
- OpenAI expects to operate a million GPUs by year-end—a quantum leap from 10,000 just four years ago.
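Allen's "coming up on a thousand times" claim follows directly from the episode's own chip counts and per-chip performance figures. A sketch, expressing everything in V100-equivalent units for simplicity (the mixed H100/B100/B200 fleet is treated as all H100-class, an illustrative assumption):

```python
# Aggregate-compute growth from GPT-3's cluster to Colossus 1,
# using the chip counts and the ~32x H100-vs-V100 ratio from the episode.

gpt3_chips = 10_000        # V100s used to train GPT-3
colossus1_chips = 200_000  # H100-class chips after the Colossus 1 upgrade
h100_vs_v100 = 32          # per-chip performance multiple cited

gpt3_compute = gpt3_chips * 1                    # in V100-equivalents
colossus1_compute = colossus1_chips * h100_vs_v100

growth = colossus1_compute / gpt3_compute
print(f"Aggregate compute growth: {growth:.0f}x")  # 640x
```

The 640x result lands squarely in the "hundreds and hundreds of times, coming up on a thousand times" range Allen describes.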
3. China’s Nvidia Ban: What’s Really Happening?
- Issue: Reports claim China has “banned” Nvidia AI chips in private sector firms.
- Context and Nuance (13:48–28:57):
- Financial Times (September 2025): China’s Cyberspace Administration ordered major firms (ByteDance, Alibaba) to halt orders/testing of Nvidia’s RTX Pro 6000D—a special model tailored for China.
- The ban does not target Nvidia’s newest and best chips (e.g., Blackwells, which reach China only via black-market imports), but lower-end, export-compliant models.
- Two competing hypotheses:
- (1) China is supporting local chipmakers and believes its domestic designs have caught up.
- (2) This is negotiating brinkmanship—China wants access to newer, banned Nvidia hardware (e.g., B30 Blackwell chips), using the threat of switching to domestic chips as leverage in US trade negotiations.
- Quote [16:35]:
“There is still a chance this is part of a play to get the Trump administration to approve the B30, the modified Blackwell chip that is many times more powerful than the H20... Any US China trade deal would likely include large purchase commitments from the PRC side and a massive chip order promise could help sway President Trump.” – Quoting Bill Bishop via Greg Allen
- Deep divide within China:
- Chip makers/designers (Huawei, SMIC) benefit from the policy.
- Users (tech giants) strongly oppose, preferring top-shelf Nvidia chips.
- Despite progress, Huawei/SMIC chips remain stuck at 7nm—years behind TSMC/Samsung; domestic solutions are not close substitutes.
- Quote [27:10]:
“You heard it best from DeepSeek when he said we are not capital constrained, we are chip constrained.” – Gregory C. Allen
4. Insights from Other Analysts & Policy Impact
- Contrasting narratives (29:04–32:29):
- David Sacks (AI czar, Trump White House): “China is not desperate for our chips. It is producing its own and intends to compete globally...” [29:17]
- Reality check: Off-the-record, Chinese CEOs are desperate for Nvidia chips, but cannot say so publicly.
- SemiAnalysis: The drop in demand is policy-driven and reversible; high-level brinkmanship is possible.
- Chris McGuire (former NSC): “Huawei does not plan to produce a chip as powerful as the B30A... until Q4 2028. In 2027, Nvidia's best chips will have 27x the processing power of Huawei's best AI chips.”
- US Policy Implications (32:33–33:39):
- Domestic politics shifting: GOP pushback against the Trump admin’s limited export controls, especially around the H20.
- Trump now faces internal party obstacles to relaxing restrictions on exports of newer Nvidia chips.
5. Anthropic’s $1.5B Copyright Settlement: AI & Author Rights
- Background (34:10–44:34):
- Anthropic trained its Claude model on 7 million (mostly pirated) books; three authors sued as a class.
- The piracy itself was uncontested; the legal debate centered on “transformative use” and fair use.
- Judicial outcomes complex:
- Storing and using the pirated books was ruled not fair use.
- Purchasing books and training on them was ruled fair use (per this judge).
- Parallel suit (against Meta) produced the opposite ruling on transformation.
- Settlement: $1.5 billion (~$3,000 per infringed work); a fraction of the potentially bankrupting damages sought.
- Notable quote from Judge Alsup [39:12]:
“We'll see if I can hold my nose and approve it,” referring to the settlement.
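The per-work figure implies the size of the settlement class. A quick sketch (the implied count is arithmetic from the episode's numbers, not a figure stated on the show):

```python
# The $1.5B total at ~$3,000 per infringed work implies roughly
# 500,000 covered works, far fewer than the ~7 million books in the
# training corpus; the class covered only qualifying copyrighted works.

settlement_total = 1.5e9   # reported settlement amount
per_work = 3_000           # approximate payout per infringed work

implied_works = settlement_total / per_work
print(f"Implied works in settlement class: {implied_works:,.0f}")  # 500,000
```

The gap between the ~500,000 compensated works and the 7 million books used in training helps explain why the settlement is "a fraction of the potentially bankrupting damages sought."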
- Stakeholder Response (40:33–42:11):
- Plaintiff Kirk Johnson: “...the beginning of a fight on behalf of humans that don't believe we have to sacrifice everything on the altar of AI.”
- Authors Guild CEO Mary Rasenberger: Recognizes the high damages, but notes that real precedent on AI and fair use will come from pending appeals.
- Precedent (42:11–44:34):
- Legal impact is business/procedural, not binding.
- Quote from Luke Madonna, LSE: “...may set a business precedent for similar cases,” with real legal clarity likely only from appeals or Supreme Court.
6. Chipmaker Investments: Vertical Integration & Geopolitics
- Major Deals and their Logic (44:34–53:22):
- Nvidia’s $100B in OpenAI and $5B in Intel; ASML’s €1.3B (roughly $1.5B) in Mistral AI (France).
- These mark a move toward vertical integration, often suppliers investing in customers or vice versa.
- Nvidia’s historic partnership with OpenAI: Supplier-customer synergy drove both to tech leadership; deepened now amid rumors OpenAI might design its own chips.
- Concern for competitors: Will Google, Amazon, others lean further into proprietary chip development to avoid dependency on Nvidia?
- ASML–Mistral AI Deal:
- ASML (Dutch EUV equipment leader) invests in highly touted French AI company Mistral.
- Ostensibly not political, but seen by many as Europe’s play for national/continental AI sovereignty and competitive relevance.
- Quote [52:23]:
“We picked Mistral because they are, we thought, the best partner to execute on what we want to do.” – ASML CEO, downplaying the geopolitical aspect.
7. Nvidia’s Investment in Intel: The American Industrial Angle
- Analysis (53:22–57:06):
- US government is now a major Intel shareholder, aiming to keep high-end chip manufacturing onshore.
- Nvidia’s new Intel investment draws speculation: Is Nvidia about to make Intel its go-to US chip fab partner, thus securing both domestic manufacturing and government support?
- As of now, no official word; most of the synergy revolves around data center business and co-design of compatible systems.
- Geostrategic impact will depend on whether Nvidia commits chip production to Intel foundries.
Notable Quotes and Timestamps
- Gregory C. Allen [03:26]:
“OpenAI's expected training costs for next year are in the same ballpark as the inflation adjusted cost of the Manhattan Project. In dollar terms, that's crazy already...”
- Gregory C. Allen [09:38]:
“...hundreds and hundreds and hundreds of times more aggregate computing power coming up on like a thousand times more...”
- Bill Bishop, via Greg Allen [16:35]:
“There is still a chance this is part of a play to get the Trump administration to approve the B30, the modified Blackwell chip that is many times more powerful than the H20...”
- Gregory C. Allen [27:10]:
“...we are not capital constrained, we are chip constrained.”
- Judge Alsup [39:12]:
“We'll see if I can hold my nose and approve it.”
- Luke Madonna (LSE) [42:35]:
“...while the settlement doesn't set a legal precedent, it could serve to legitimize the author’s claims and may set a business precedent for similar cases.”
Conclusion
This episode underscores the unprecedented scale and complexity of the present AI arms race. From megaproject-level private investment and colossal data centers, to China’s nuanced navigation of chip access and sovereignty, and high-stakes legal and corporate maneuvering in the US and Europe, every development is reverberating through global policy and industry. Export controls remain paramount, while unresolved legal battles over copyright and ongoing vertical integration will shape the competitive and ethical landscape for years to come.
Stay tuned for follow-ups on these rapidly unfolding stories in future episodes.
