Podcast Summary: The Twenty Minute VC (20VC)
Episode: "OpenAI and Anthropic Will Build Their Own Chips | NVIDIA Will Be Worth $10TRN | How to Solve the Energy Required for AI... Nuclear | Why China is Behind the US in the Race for AGI"
Guest: Jonathan Ross, Founder & CEO of Groq
Host: Harry Stebbings
Date: September 29, 2025
Episode Overview
This wide-ranging conversation with Jonathan Ross, a leading architect of AI hardware (ex-Google TPU team, founder of Groq), explores the relentless growth in demand for AI compute, why the US and its allies currently lead China in the AI race, the economic and technical forces shaping the future of chips (including predictions for NVIDIA and custom silicon from OpenAI and Anthropic), and the foundational role of energy in AI advancement. The discussion also covers market bubbles, labor transformation via AI, infrastructure supply chain constraints, and Europe's struggle to remain competitive.
Key Themes & Insights
1. AI Compute: The New Oil
- Ross compares the current AI market to the "early days of oil drilling," emphasizing high "lumpiness" in returns but massive upside for early players.
- Major technology firms (Google, Microsoft, Amazon) and powerful nations are aggressively investing in AI, signaling "the smart money" is all-in:
"Every time they make an announcement on how much they're spending, it goes up the next time." (05:23)
- Market value and revenue are highly concentrated: 35–36 companies constitute 99% of AI token spend today.
2. The Insatiable Demand for Compute
- AI compute supply massively trails demand.
- The value of additional compute is immediate: for OpenAI or Anthropic, doubling inference compute would almost double revenue within a month (12:30, 41:59):
“If OpenAI were given twice the inference compute that they have today, if Anthropic was given twice the inference compute that they have today, within one month from now their revenue would almost double.” (12:22)
3. Speed, Supply Chains, and Value
- Speed is not merely a "nice-to-have"; it is core to user engagement, brand value, and winning deals:
“Every 100 milliseconds of speed up results in about an 8% conversion rate.” (13:02)
- Groq’s key differentiation is supply chain speed: it can deliver compute in six months, versus the typical two-year GPU cycle, which wins hyperscaler interest (24:05).
4. The Race to Custom Chips
- OpenAI, Anthropic, and other hyperscalers are expected to design their own chips, but not all will succeed:
“Building chips is hard… It’s like saying, ‘That Google search is pretty nice, let’s go replicate it.’ It’s insane, the level of optimization… You’re not going to replicate it easily.” (10:11)
- The real motivation for custom chips: control over their own destiny and negotiating power with Nvidia (14:47, 17:05).
- However, Nvidia’s effective monopsony as the dominant buyer of high-bandwidth memory (HBM) makes it very hard for new entrants (14:47–18:08).
5. The Power of Energy
- Compute is limited by available energy, and nuclear and renewables are essential to powering the AI revolution:
“The countries that control compute will control AI, and you cannot have compute without energy.” (32:26, 36:46)
- Ross highlights Norway’s wind and hydro capacity and Japan’s nuclear relaunch as models for rapid change (34:12–35:14).
6. US vs. China in the AI/AGI Race
- Despite headlines, US models (like GPT variants) are up to 10x more efficient to run than Chinese alternatives, supporting the US’s “away game” (29:35, 29:50).
"The US still has a training advantage... We have a massive compute advantage." (29:39)
- China can subsidize domestic compute, but the energy capacity and compute efficiency of the US and its allies are decisive for global influence.
7. Labor Transformation – More Jobs, Not Fewer
- Contrary to common fears about mass unemployment, Ross predicts AI will cause:
- Massive deflationary pressure, lowering costs (43:10)
- People opting out of traditional employment
- Creation of entirely new industries and job categories:
“We’re not going to have enough people… 100 years from now, jobs we can’t imagine today will exist.” (43:10–45:07)
8. Economic Perspectives & Bubbles
- AI is creating real value, not just speculative hype: PE firms measure bottom-line improvements from additional compute (51:46).
- Yet market value is highly concentrated in a few companies, raising risk if growth stalls (53:07).
9. Industry Predictions & Strategic Takeaways
- Nvidia is likely to hit a $10 trillion valuation within five years; it may represent a minority of chips sold but the majority of revenue, thanks to pricing power and brand (65:46, 64:14).
- OpenAI, Anthropic, and others will join the “Mag 7,” growing into “Mag 9, 11, or 20” (60:43).
- Switching costs for AI tools are low for technical users, but enterprise deals still lock in customers (59:34–59:50).
- Groq's sustainable advantage lies in rapid supply chain execution and cost per token, not direct model competition.
Notable Quotes & Moments
The Compute Arms Race
"There is no limit to the amount of compute that we can use." (42:01, Jonathan Ross)
The True AI ‘Moat’
"People look at TPU as a big success... only one of [three efforts at Google] ended up outperforming GPUs... Building chips is hard." (10:11, Jonathan Ross)
On Government Response and European Energy
"Norway itself could provide as much energy as the United States and could do it consistently. The entire United States! That's one country in Europe." (33:00, Jonathan Ross)
On the Nature of Economic Cycles
“The most valuable thing in the economy is labor. And now we're going to be able to add more labor to the economy by producing more compute and better AI. That has never happened in the history of the economy before.” (51:46, Jonathan Ross)
On the Future of Jobs
“We're not going to have enough people... 100 years from now, jobs we can’t imagine today will exist.” (43:10, Jonathan Ross)
On AI Platform Wars
"Enterprises make these long term deals and they stick with whatever deal they made a year ago." (59:43, Jonathan Ross)
On Focus vs. Optionality
"I used to think that the most important thing was preserving optionality. Now I think it's focus." (74:11, Jonathan Ross)
On the Mind-Expanding Power of LLMs
"LLMs are the telescope of the mind... In a hundred years, we’re going to realize that intelligence is more vast than we could have ever imagined." (76:53, Jonathan Ross)
Important Timestamps & Segments
- AI Investment Bubble? — 05:09–07:16
- Demand for Compute & Role of Nvidia — 10:11–14:47; 41:59
- Custom AI Chips & Supply Constraints — 14:47–20:16
- The Critical Role of Energy — 32:09–37:49
- US vs China & Open Models — 27:44–30:38
- Deflation & Labor Shift Predictions — 43:10–45:07
- Market Risk & Value Concentration — 51:22–54:10
- Nvidia Future & Chip Ecosystem — 64:14–65:46
- Quickfire Round (Nvidia, Groq, Silicon, Margins, Oracle, Moats) — 70:45–74:00
- LLMs as the “Telescope of the Mind” — 76:53
Conclusion
Jonathan Ross offers a compelling narrative that places compute, energy infrastructure, and supply chain agility at the heart of the next AI revolution. He argues the coming years will see custom chips proliferate, AI labs rise to "Mag 9/11/20" scale, and Nvidia grow even more dominant, while leaving room for new systems like Groq as efficiency and speed shape the race. The social consequences could be epochal, with AI lowering costs and creating labor demand rather than unemployment. Throughout, Ross's optimism about abundance, progress, and "the telescope of the mind" shines in a conversation as rapid as the industry itself.
For the full experience, key insights, and technical depth, listening to the episode is highly recommended.
