Invest Like the Best with Patrick O’Shaughnessy
Episode 442 | Dylan Patel - Inside the Trillion-Dollar AI Buildout
Date: September 30, 2025
Overview
In this episode, Patrick O’Shaughnessy interviews Dylan Patel, Founder and CEO of SemiAnalysis. Dylan is renowned for tracking the semiconductor supply chain and the AI infrastructure buildout in extraordinary detail, down to satellite imagery of data centers and maps of billions in capital flows. Their expansive conversation explores the physical and economic revolution powering AI. They dig into the gigantic capital commitments from tech titans, the OpenAI–Nvidia–Oracle “infinite money glitch,” the evolving “tokenomics” of AI, reinforcement learning, power-grid bottlenecks, US–China competition, talent wars, software business models, and where real value might ultimately accrue in the AI stack.
Key Discussion Points & Insights
1. OpenAI, Nvidia, Oracle — Strategic Alliances & the AI Compute Arms Race
[05:40–10:43]
- Infinite Money Loop: The relationships between OpenAI, Nvidia, and Oracle were described as a "Spider-Man meme," where each company is both a supplier and a customer, producing a kind of “infinite money glitch.”
- The Stakes: The AI game is driven by colossal capex—billions to hundreds of billions—devoted to building compute clusters before clear business cases have materialized.
- Capital Flows and Risk: Companies like Oracle take on huge capex, betting that OpenAI’s deals will pay off. Nvidia participates both as a supplier and equity investor, capturing huge upfront profits.
- Scale is Everything: OpenAI’s unique advantage was a willingness to go bigger early (GPT-3 and 4), but now the giants—Meta, Google, Microsoft—threaten to outscale them fast. "There’s very much a risk of OpenAI being too small to matter, which is crazy to say because they've got 800 million users. But where is the revenue? Where’s the compute?" (Dylan, 07:18)
2. Scaling Laws, Diminishing Returns, and Model Economics
[10:43–14:52]
- No True Diminishing Returns—Yet: The log-log scaling of model performance means ever larger, more capable AIs are possible, but at exponentially larger cost.
- "You need 10x more compute to get to the next tier of performance... But what if that next tier is the equivalent of a 6-year-old versus a 16-year-old?" (Dylan, 08:07)
- Economic Anxiety: Financial types fear the moment ROI breaks down—if we make gigantic bets and the models stop yielding large jumps in value, there's huge downside.
- Inference Bottlenecks: Larger models often can’t be served quickly or cost-effectively enough for broad adoption.
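The “10x more compute per tier” intuition above can be sketched as a toy power-law cost model. This is purely illustrative: the `base` and `step` constants are assumptions for the sketch, not figures from the episode.

```python
def compute_for_tier(tier, base=1.0, step=10.0):
    """Toy power-law cost model: each performance 'tier' costs ~10x
    the compute of the one before it (illustrative constants only)."""
    return base * step ** tier

# Three tiers up means a thousandfold compute bill, for what may feel
# like an incremental capability gain (the "6-year-old vs 16-year-old" worry).
print(compute_for_tier(3) / compute_for_tier(0))  # 1000.0
```

The asymmetry is the whole economic anxiety in miniature: costs grow geometrically per tier while perceived value per tier may not.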
3. Tokenomics: The Economics of AI Tokens and Usage
[17:01–23:28]
- Defining Tokenomics: A play on crypto’s terminology, “tokenomics” here means the economic value created by each token processed or generated by AI.
- User Experience vs Model Size: There's a tradeoff: bigger/fancier models have higher value but slower response and higher cost; speed is vital for UX.
- Infrastructure Lag: Demand for inference tokens doubles every two months, but hardware isn’t doubling as fast—necessitating cost reductions and efficiency gains.
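To make the infrastructure lag concrete: a two-month doubling compounds to 64x annual growth, far faster than hardware supply expands. The two-month doubling cadence is the figure cited in the episode; the rest is arithmetic.

```python
# Demand doubling every 2 months => 6 doublings per year => 2**6 = 64x annual growth.
doublings_per_year = 12 / 2
annual_growth = 2 ** doublings_per_year
print(annual_growth)  # 64.0
```

With hardware capacity growing far more slowly, the gap has to be closed by cost reductions and efficiency gains per token, exactly as the episode argues.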
4. Major Bottlenecks: Latency, Capacity, and Reinforcement Learning Environments
[23:28–32:56]
- The Key Constraint is Capacity/Cost: Lowering latency is helpful, but being able to serve many more users efficiently is even more important.
- Why Bigger Isn’t Always Better:
- "The challenge is not necessarily to make the model bigger. The challenge is how do I generate data in useful domains so the model gets better at them?" (Dylan, 25:46)
- Environments for RL: Massive innovation is happening in simulated “environments”—giving models places to learn (like fake Amazon stores, spreadsheets, math puzzles)—to supplement sparse, real-world data.
5. How Far Along Are We? State of Pretraining and RL
[29:09–32:30]
- Pretraining: For text, we’re maybe in the “late innings”; for images, audio, and video, it’s much earlier.
- Reinforcement Learning: "I think we've like thrown the first ball." (Dylan, 31:04) The space is extremely early, akin to a baby learning by putting a hand in its mouth.
6. The Future: Utility, Reasoning, and Embodiment
[32:30–38:47]
- Automation of Actions: The leap from models organizing information to doing things autonomously is close (e.g. shopping agents, automating purchases).
- "More than 10% of Etsy’s traffic is straight from GPT... Amazon blocks GPT, but otherwise it would be really high." (Dylan, 33:10)
- Reasoning as Compute: Time spent reasoning (not just bigger models) is a distinct vector of improvement. RL and environments are closely linked.
- Embodiment & Memory: Advanced intelligence may require “embodiment,” i.e., linking AI to physical world experience; models struggle with having "long-term memory" akin to humans.
7. Bullishness vs Bearishness on AI's Trajectory & Economic Impact
[43:32–46:39]
- Enormous Upside Even Short of AGI: Even if models plateau far from digital “God,” massive economic value (example: mainframe migrations, software development, automation) will be created.
- Ultimate Bull Case: The "upper limit" is machines simply smarter than humans (not soon), but the path already promises transformative change.
8. Talent Wars and Research as a Bottleneck
[47:49–54:58]
- Value of Process Knowledge: AI and semiconductor fields hinge on rare process knowledge. Huge salaries—up to $100M+—are justified given the impact one researcher or engineer can have on capex running into the tens of billions.
- "It is infeasible. How could this person possibly be worth that much? Well, they're running the experiment on chips that cost $100 billion." (Dylan, 48:23)
- US–China Talent Dynamics: The US should “acqui-hire” elite global technical talent; China excels at amassing and training engineers, often without the salary inflation.
9. Where Will Value and Power Accrue?
[56:12–65:12]
- Stack Power Shifts: Today, Nvidia is king—harvesting the majority of upstream capital. However, power shifts over time as platforms, data, and IP become more important.
- Frenemies Everywhere: Firms like Anthropic, Cursor, OpenAI, Microsoft, and Nvidia are simultaneously partners and rivals—“the most fascinating soap opera ever” (Dylan, 59:16).
10. Capex Cycles, Bubbles, and Overbuild Risk
[65:12–69:05]
- Historic Infrastructure Cycles: As with past booms in railroads and fiber optic cables, there’s the risk that massive AI capex will lead to overbuild and a subsequent glut.
- "If the models don't improve, yes, we will overbuild... US economy will go into a recession straight up because of this." (Dylan, 65:39)
- But Demand May Be “Infinite”: If scaling laws hold, the demand—especially in high-value roles—could justify the buildout for a long time.
11. Middle Layer & Cloud Business Model Innovation
[69:05–72:53]
- “Neo-Cloud Models”: Firms like Nebius are making billions by signing long-term GPU rental contracts with tech giants; but they bear risk if upstream customers (e.g. OpenAI) cannot pay.
- Profits Funnel Upstream: Ultimately, Nvidia captures much of the economic rent.
12. Applications, Disruption of Deterministic Code, and Real-world Impact
[72:53–77:15]
- Beyond Skeuomorphism: We’re still mostly automating old software problems (e.g. making devs more productive), but new capabilities are emerging—like automated research, mainframe migration, and pattern detection—that were barely possible a few years ago.
- Dylan’s own business now leverages AI for regulatory analysis, satellite identification, and more, with a team of three—unthinkable without recent advances.
13. Power, Data Centers, and Physical Infrastructure
[77:15–83:47]
- Raw Power Needs: AI data centers are ramping up US power demand but still account for only a small share (~2%) of the total. Building new power infrastructure, amid labor and regional bottlenecks, is nonetheless an urgent challenge.
- "Electrician wages have, like, doubled for mobile electricians that can work on data center stuff." (Dylan, 78:53)
- Creative Solutions & Supply Chain Quirks: From diesel truck engines arranged in clusters to power equipment air-shipped from Poland, the sector is seeing atypical solutions emerge as traditional supply chains buckle.
14. US vs China: Competition at Every Stack Level
[83:47–94:57]
- Core Differences: China plays the long game (e.g., in EVs), focuses on supply chain sovereignty, and can build physical infrastructure much faster than the US.
- Why America Needs AI: Patel argues that without runaway AI, the US will lose its global dominance due to slowing growth, high debt, and social instability.
- "Without AI, we're definitely going to lose. Our supply chains are slower, they cost too much, our debt is unsustainable, our economy's not growing fast enough to maintain the level of debt." (Dylan, 84:14)
- Geopolitical Risk: TSMC in Taiwan is a massive potential chokepoint for both US and global tech.
15. Startups, Hardware, and New Bottlenecks
[96:42–104:34]
- Physical World Breakthroughs: Startups like Periodic Labs are focusing on applying RL to chemistry/material science, where small breakthroughs (like in battery chemistry) can have outsized impacts.
- Hardware Complexity: While Nvidia is king, much of the stack (e.g., transformers, networking, optics) is ripe for innovation. Some startups focus on “world models” that can simulate anything from molecules to robotic environments—possible accelerants to the AI buildout.
- Accelerator Chips Risks: Competing with Nvidia in hardware is difficult, given the capital intensity and lead-time risk.
16. Quick Takes on Major Companies
[104:44–111:58]
- OpenAI: Still “super awesome,” but slightly less focused than Anthropic on the lucrative software market.
- Anthropic: "Their revenue is accelerating way faster... more relevant to that $2 trillion software market..." (Dylan, 104:58)
- AMD: Dylan has a soft spot for the perennial underdog; innovative, but “pretty mid” in the AI race.
- xAI (Elon Musk): Faces fundraising and business-model risks, despite its advantage in assembling a mega data center.
- Oracle: Staggeringly levered to OpenAI’s success. "If you believe OpenAI is successful... Oracle’s gonna make so much fucking money." (Dylan, 107:37)
- Meta: "So close to being the only company that can do [dominate new interface paradigms]." (Dylan, 110:04)
- Google: Waking up fast, “super bullish.” Unlike Meta, better positioned to serve both consumers and professionals with integrated hardware and AI.
17. Software Business Models Must Change
[112:08–117:30]
- SaaS Is Breaking Down: AI drastically lowers the cost of software development, but dramatically raises COGS (cost of goods sold) for AI SaaS. Customer acquisition remains tough, so economics may never hit the same profit leverage.
- "The era of software-only businesses is really, really tough. In the age of AI now, already scaled businesses can do great..." (Dylan, 115:21)
Notable Quotes & Memorable Moments
On the AI Buildout Stakes
“This is about the highest stakes capitalism game of all time.”
—Patrick, [17:01]
On Diminishing Returns
"What if the next tier of performance is like a 6-year-old vs a 16-year-old?"
—Dylan, [08:07]
On Tokenomics
"Hopefully everyone uses ‘tokenomics’ 20 more times. It’s got to be in the title now."
—Dylan, [17:53]
On Power Infrastructure
"Electrician wages have, like, doubled for mobile electricians that can work on data center stuff. If you're down to move to West Texas, it's like 2015 again."
—Dylan, [78:53]
On the Talent War
"It is infeasible. How could this person possibly be worth that much? Well, they're running the experiment on chips that cost $100 billion."
—Dylan, [48:23]
On US–China AI Geopolitics
"Without AI, we're definitely just going to lose... The US would literally fall apart if we don’t do something—and by do something, I mean AI has to dramatically accelerate GDP growth."
—Dylan, [84:14]
On SaaS and AI
"I think the era of software-only businesses is really, really tough in the age of AI now... Many software businesses will have a reckoning."
—Dylan, [115:21]
Important Timestamps & Segments
- OpenAI–Nvidia–Oracle Logic & Capital Loops: [05:40–10:43]
- Scaling Laws & Model Economics: [10:43–14:52]
- Tokenomics & Inference Demand: [17:01–23:28]
- Model Bottlenecks & RL Environments: [23:28–32:56]
- Bullish vs Bearish AI Outlook: [43:32–46:39]
- Talent Wars, Process Knowledge: [47:49–54:58]
- Stack Power Dynamics: [56:12–65:12]
- Capex Risks & Bubble Concerns: [65:12–69:05]
- Application Layer Impact & AI-Enabled Research: [72:53–77:15]
- Power Grid & Infrastructure Challenges: [77:15–83:47]
- US vs China Stack Analysis: [83:47–94:57]
- Hardware Startups & Bottlenecks: [96:42–104:34]
- Rapid Fire on Major Players: [104:44–111:58]
- SaaS Model Transformation: [112:08–117:30]
Final Thoughts
Dylan Patel’s realism and granular market perspective expose the thrilling, high-wire act that is AI’s industrial buildout. While bullish long-term on the technology, he is sober about the capex risks, bottlenecks, and software model disruptions ahead. The US faces a “must-win” race against China, not just for global leadership, but for economic stability. In the meantime, the fortunes and failures of the world’s largest companies are being shaped by trillion-dollar bets on semiconductors, power, and AI’s rapidly mutating stack.
For more in-depth discussions and profiles of leaders shaping business and investing, visit joincolossus.com.