Podcast Summary
Episode Title
20VC: Cerebras CEO on Why Raise $1BN and Delay the IPO | NVIDIA Showing Signs They Are Worried About Growth | Concentration of Value in Mag7: Will the AI Train Come to a Halt | Can the US Supply the Energy for AI — with Andrew Feldman
Podcast: The Twenty Minute VC (20VC)
Host: Harry Stebbings
Guest: Andrew Feldman (Co-founder & CEO, Cerebras)
Date: October 6, 2025
Episode Overview
In this episode, Harry Stebbings hosts Andrew Feldman, CEO and co-founder of Cerebras, to discuss the company’s historic $1.1 billion Series G fundraising round, the dynamics and future of the AI hardware and infrastructure sector, NVIDIA’s dominant role, the insatiable and unpredictable demand for AI compute, data center and energy challenges, and the geopolitical and societal impacts of the rapid AI revolution.
Feldman delivers frank, in-depth insights on market realities, AI chip innovation, talent wars, concentration of value among the “Mag7” tech giants, and the necessity for society to see real returns from vast investments in AI. The conversation is fast-paced, intellectually rich, and filled with memorable moments and actionable takeaways.
Key Discussion Points and Insights
1. The $1.1BN Raise: Timing, Investors, and Strategy
[04:04–06:22]
- Historic Nature: Cerebras’ $1.1 billion Series G at $8.1B valuation is the largest ever in their space, done at the highest valuation, and includes premier investors like Fidelity, Tiger Global, and Valor.
- Why Now?
“We now have sort of the dry powder to really push…to build out our manufacturing to the scale and scope we want, to add new data centers…We have more big ideas."
– Andrew Feldman [04:12]
- Signal Value of Investors: Fidelity’s participation is highly strategic — "they are the Oxford or Cambridge of investing…when they choose to lead a round, it brings Wall Street a great deal of confidence." [04:12]
2. The Decision to Stay Private a Little Longer
[05:53–06:22]
- Pre-IPO Round Logic:
"It's very common in late stage to do a pre-IPO round if you can get it done very quickly, if it doesn't distract you, and keep moving."
– Andrew Feldman [06:00]
- Retaining flexibility, seizing near-term opportunities.
3. The State of the AI Market: Hype, Exponential Demand, and Uncertainty
[06:22–09:34, 09:42–10:10]
- Breathtaking Uncertainty:
“Things are moving at a rate that 6, 8, 12 months out, everybody's unsure. It's so fast, it's so big."
– Andrew Feldman [00:00, 06:40]
- Marketing vs. Reality: Feldman warns of inflated announcements (“up to $100B over 5 years”) and urges realism about timeframes and actual value.
- Options, Not Commitments: Customer deals signal “options on the future” as nobody can accurately predict AI demand; companies are hedging rather than guaranteeing usage.
- Demand Underestimation:
“100%. I've been wrong. If you would have said a year ago…that OpenAI would get the valuations they’re getting…it wouldn’t have been conceivable.”
– Andrew Feldman [09:42]
4. Is the AI Boom Sustainable?
[10:20–11:15]
- Most AI experiments fail, but betting on “permanent Goliath dominance” is a zero-alpha game. Feldman’s bet: the economic pie (via AI) will get larger and reorganize society.
- NVIDIA’s Growth:
“Of course, if Nvidia keeps growing at the rate they're currently growing, 11 years from now, everybody on earth works for them.”
– Andrew Feldman [10:20]
5. NVIDIA’s Dominance and The Big Company Playbook
[11:35–13:07]
- Sign of Concern:
“You use your balance sheet more and your technology less…you start buying business as opposed to winning business.”
– Andrew Feldman [12:11]
- Tactics like predatory pre-announcements (e.g., B300s ahead of B200s) are seen as evidence NVIDIA is preparing for a more competitive future.
6. Decoding the OpenAI–NVIDIA $100B Deal
[13:07–14:53]
- Feldman asserts the deal structure was “designed for nobody to understand it”—deliberately vague and not truly analyzable.
- Analogy: “My mother goes shopping…I ask her, it's a lovely dress, Jules, how much is it? Well, it doesn't matter. It doesn’t matter."
– Harry Stebbings [14:10]
7. Chip Depreciation, Performance, and the Physics of Progress
[14:53–17:57]
- Depreciation Reality:
“People are clearly still getting value from H100s…and from A100s…The question is: how much faster are future generations than the current generation? That's the actual question on depreciation.”
– Andrew Feldman [15:17]
- Performance Increments: Marketing overstates improvements; the real-world gain is typically “2-2.5x per meaningful generation move” [16:52].
8. SRAM, Memory Architecture, and Wafer Scale Innovation
[17:57–21:06]
- SRAM’s Limitations & Opportunity: SRAM is fast but limited in capacity – “your friend is exactly right…it's the reason we went to wafer scale." [18:32]
- Cerebras overcame the 75-year “impossible” challenge of building dinner-plate-size chips – "what we were missing is that for 75 years nobody could do it…After we did it, Elon tried Dojo and they failed…It was obvious, it was hard." [20:34]
9. Training vs. Inference: Market Split
[21:06–22:46]
- Cerebras Claims Leadership in Both:
“No, we're faster on both. But the software challenges in training are real…in inference, nobody cares about CUDA, nobody even cares about PyTorch…put up a side by side to show you’re faster than 1000 B200s.”
– Andrew Feldman [21:20]
- There is higher friction in disrupting training workflows; inference is easier to switch and is a larger market.
10. The Exponential Growth of Inference
[22:46–23:54]
- “It’s really hard for the mind to wrap itself around geometric growth or exponential growth. More people are using AI…they use it more frequently…and what they want to do…is bigger.”
– Andrew Feldman [22:52]
- All three factors are multiplying, driving staggering demand.
11. Productivity, AI Usage, and the Robert Solow Paradox
[24:08–26:13]
- Historical analogy to computers and electricity: real productivity comes only when society reorganizes around the new tech, not just replaces old processes.
- “If we use OpenAI the way we use Google, you’ll see a very modest jump in productivity. If you use them in a fundamentally different way…you’ll see a huge jump.”
– Andrew Feldman [24:08]
12. Energy Demands: Is the US Ready?
[26:13–27:38]
- “People often say we don’t have enough power in the US and this is strictly wrong. We have plenty of power. It's in the wrong places."
– Andrew Feldman [26:27]
- The challenge is a mismatch: the power is where the people and fiber aren’t.
- Responsibility: “To the extent we consume this extraordinary amount of power, we have an obligation to deliver amazing things…If we use it and don't do that, then it's not a gain for society."
– Andrew Feldman [27:38]
13. The Messiness and Market-Driven Exploration of AI Use
[27:51–29:06]
- Markets will fund wasteful, silly projects (“Ghibli images”), but sometimes what looks wasteful later seeds true advancement.
14. Geopolitics: US Policy, China, Immigration, and Energy Infrastructure
[29:06–31:03, 44:36–48:45]
- US Needs to do More:
“We have real work to do…Our decentralized form of government has left us with a patchwork of power infrastructure…We have starved our universities of compute…”
– Andrew Feldman [46:18, 48:17]
- Immigration: “We’re not making enough AI practitioners…Historically [we] sucked in the best and brightest on J1s and H1s…we need to do a better job.” [33:07, 47:23]
15. The “Mag7,” Market Concentration, S&P 500, and Risk
[31:03–32:44]
- “The risk is the mismatch in the mental model people have…they thought they were diversified and in fact they’re heavily dependent on a very narrow sector.”
– Andrew Feldman [31:03]
- On NVIDIA: “They've proven themselves to be [an] extraordinary company in the first quarter of the century. I don’t know if $4T is right, but maybe it’s too low.” [32:25]
16. Bottlenecks in AI Scale
[33:07–36:29]
- Expertise: Not enough AI/data talent produced by US universities; immigration policy makes it worse.
- TSMC & Manufacturing: “TSMC can’t build fabs fast enough…$30B, $50B factories…their ability to build them quickly enough is limited.”
– Andrew Feldman [35:44]
- Data Center Capacity: “Where are those gigawatt facilities that everyone’s talking about? Well, they’re not up yet.” [36:29]
17. Data Centers and Investment Frenzy
[36:45–38:18]
- Data centers are a Wall Street favorite because they “look like a bond” (rent, stable tenants).
- “Building data centers is not for everyone…the ways to lose money in property are large and many…when you go unbelievably quickly it’s harder and harder to be disciplined.”
– Andrew Feldman [37:43, 38:18]
18. Vertical vs. Horizontal Integration: Do AI Labs Need to Build Their Own Chips?
[38:18–41:50]
- Most successful AI labs (OpenAI, Anthropic) are not vertical – they use public cloud. Whether that remains true is an open question.
- On Chip-Building by Software Companies:
“There is a long history of software companies failing to build chips…the answer is this is really hard…mentality is very different—it’s not ‘move fast and break things’.”
– Andrew Feldman [39:26–41:50]
19. What Will the AI Chip Market Look Like in 10 Years?
[42:03–42:39]
- “Absolutely no one takes 90%…even at Intel’s strength, they had dominance in x86 and zero market share in cell phones…It will not all accrue to one or two companies.”
– Andrew Feldman [42:10]
20. Margins, Pricing Power, and the Coming Shakeouts
[42:43–43:50]
- Cerebras raised more due to positive margins; NVIDIA’s margins are “extraordinary”—as high as 78–85% for certain products.
- “When the giant stumbles, the number of people who come out of the woodwork to kick them when they're down is extraordinary…it was years of pent up frustration…”
– Andrew Feldman [43:12]
21. Sovereignty as an AI Business Model
[43:50–44:36]
- Feldman sees regional sovereignty (e.g., Mistral in Europe) as a viable way to gain traction when combined with real product advantage.
22. The US–China AI Race
[44:36–46:18]
- “I think it benefits neither…we would be much stronger if we can find ways to peacefully engage…their government has an extraordinarily aggressive policy in AI…we have real work to do.”
– Andrew Feldman [44:57]
23. The War for Talent
[34:14–35:41]
- “No company ever went bankrupt by paying extraordinary people too much. If you want to go bankrupt, pay mediocre people too much.”
– Andrew Feldman [35:13]
- The compensation for elite technical talent is rational given their “unreplicable” impact.
Notable Quotes & Memorable Moments
Memorable Quotes
- On Planning for Uncertainty: "You plan more frequently, you have a shorter view, you take options on the future and, if the future moves against you, you lose the premium on the option."
– Andrew Feldman [08:39]
- On Market Hype: "The great sort of CYA word in marketing history is ‘up to 100 billion over five years’…You could pick a lot of big numbers and it won't be bigger than."
– Andrew Feldman [06:40]
- On Software Companies Building Chips: "Modern software does not fit well in a chip making framework…Weekly sprints don't work well on two year long projects."
– Andrew Feldman [39:26]
- On AI’s Societal Obligation: "If we are going to consume this amount of power, the burden is on us to deliver value for it."
– Andrew Feldman [27:38]
- On Talent: "There are engineers who have skills that no number of other engineers working together can achieve…It's why the best and the brightest are getting such extraordinary compensation."
– Andrew Feldman [34:25]
- On Leadership Challenges: "Every day I go to battle with Goliath…Every dollar we sell is a dollar that if we didn’t work at it…would default to NVIDIA."
– Andrew Feldman [59:49]
Humorous, Relatable Moments
- On the Demand for AI Compute: "We have customers coming to us and saying we would like between 5 and 40 million queries per second. Well, how do you not know by a factor of 35 million queries per second?"
– Andrew Feldman [06:40]
- On Market Announcements: “My mother goes shopping and I ask her, it’s a lovely dress, Jules, how much is it? Well, it doesn’t matter. It doesn’t matter.”
– Harry Stebbings [14:10]
- On the First Cerebras Wafer-Scale Chip Succeeding: "...founders stood there together and stared at the box running, which is about as interesting as watching paint dry. And we stood there and we couldn't believe it."
– Andrew Feldman [53:00]
Timestamps for Important Segments
- 04:04 – Why raise $1BN? Why now?
- 06:22 – State of the AI market, demand uncertainty
- 08:39 – Planning amid exponential uncertainty
- 10:20 – Sustainability of the AI boom
- 11:35 – NVIDIA’s evolving competitive strategy
- 13:07 – Analyzing the OpenAI–NVIDIA partnership
- 15:17 – Chip depreciation and generational improvements
- 18:32 – SRAM, architecture, and reasons for Cerebras' wafer-scale innovation
- 21:20 – Training vs. inference, Cerebras' comparative performance
- 22:46 – The exponential growth of AI inference demand
- 24:08 – Productivity, computing, and historical analogy
- 26:27 – US power supply and energy for AI
- 31:03 – Concentration risk in the S&P/“Mag7”
- 33:07 – Key supply and talent bottlenecks
- 39:26 – Why software companies struggle (and often fail) in chip design
- 42:10 – The long-term landscape for silicon/AI chips
- 43:08 – NVIDIA's margins and impact on industry structure
- 44:57 – Geopolitics, China–US competition, and relevant policy
- 46:18 – US weaknesses: immigration and infrastructure
- 53:00 – “Moment of victory” for wafer-scale chip
- 54:52 – Where should investors look next? Data pipelines!
- 56:42 – Will AI create labor shortages? Feldman's skeptical take
Conclusion & Takeaways
Harry and Andrew pack this episode with gems for founders, investors, policymakers, and technologists:
- Understand the difference between hype and operational reality.
- Appreciate the nonlinear, option-laden nature of AI demand and the importance of agility in planning.
- NVIDIA’s dominance is real, but the playbook of giants contains the seeds for future disruption.
- Data center build-out, manufacturing, and talent remain the real bottlenecks for scaling AI.
- Societal returns must justify AI’s massive energy and economic footprint.
- The future belongs to those with patience and a fundamental understanding of both tech and markets—not just those following momentum.
- AI’s adoption curve will mimic that of other fundamental technologies: real transformation comes only when we reorganize around it, not when we simply replace old tools.
Further Listening & Resources
For more on this and similar topics:
- Visit 20VC.com
- Search for previous founder episodes with guests like Daniel Ek (Spotify), Reid Hoffman (LinkedIn), and Frank Slootman (Snowflake).
