NVIDIA AI Podcast
Episode: GTC Live Washington, D.C. - Chapter 3: AI Infrastructure Ecosystem
Date: November 11, 2025
Host: Brad (NVIDIA)
Guests:
- Gio Albertazzi, CEO, Vertiv
- Olivier Blum, CEO, Schneider Electric
- Krishna Janalagogada, CTO, GE Vernova
- Chase Lockmiller, Co-founder & CEO, Crusoe
Episode Overview
This special GTC edition of the NVIDIA AI Podcast delves into the emerging "AI infrastructure ecosystem" — the foundational power, cooling, grid, and data center systems powering America’s (and the world’s) AI transformation. Host Brad moderates a lively panel with leading executives in energy, infrastructure, and AI data center construction. The conversation moves from urgent scaling challenges to innovation partnerships, highlighting how cross-sector collaboration is enabling the rapid buildout of infrastructure underpinning AI-driven economic and industrial change.
Key Discussion Points & Insights
The Scope of the AI Infrastructure Challenge
[02:23] Gio Albertazzi (Vertiv)
- The explosion of AI and data center investments is leading to an unprecedented need for power and cooling infrastructure.
- Building at industrial scale is necessary: “We will never be able to scale AI at the speed that we have seen…without scaling [power and thermal infrastructure] to absolutely unprecedented industrial proportions.”
- The industry is at an inflection point, requiring new, scalable, and less labor-intensive construction methods.
Memorable Quote:
“We are at a very important inflection moment in this juncture.”
— Gio Albertazzi [02:23]
Efficiency as a Core Lever, Not Just Expansion
[03:57] Olivier Blum (Schneider Electric)
- Efficiency is just as important as raw expansion: “AI depends on compute, compute depends on energy, and energy availability and efficiency depends on AI.”
- Newer AI-driven technologies (digital twins, simulation tools) help optimize every phase from design to operation, making the whole AI factory lifecycle more efficient.
Memorable Quote:
“The type of technology you have now through AI helps you to make energy more efficient…we are excited to leverage AI to make our overall industry more efficient.”
— Olivier Blum [04:09]
The Power Generation Bottleneck and Multi-Source Approach
[05:16] Krishna Janalagogada (GE Vernova)
- Demand for power (for AI/data centers) is soaring: for the past 20 years, US power demand was almost flat; in the next 20, it’s expected to grow 50%, a third of that from data centers.
- “Energy abundance is really an all of the above kind of answer right now.” Solutions include quadrupling gas turbine capacity, SMR nuclear, wind, solar, and future-proofing with hydrogen.
- The grid is now a bottleneck: transmitting newly generated power efficiently is as big a challenge as generating it.
Notable Statistic:
“Demand is massive. We are sold out through ‘28, maybe through ‘29…quadrupling our capacity of number of gas turbines delivered by 2028 compared to 2020.”
— Krishna Janalagogada [05:41]
The Rise of the “AI Factory” & Extreme Co-Design
[08:15] Chase Lockmiller (Crusoe)
- The data center is evolving into a single, complex AI factory integrating power, cooling, storage, and silicon.
- Extreme co-design with partners like NVIDIA rewrites the rules, e.g., shifting to 800V architectures for dramatic efficiency gains, or modular, system-level design versus piecemeal scaling.
Memorable Quote:
“The process of taking electrons and turning them into tokens is just one of humanity’s greatest opportunities and challenges over the next decade.”
— Chase Lockmiller [08:50]
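The efficiency argument behind the 800V shift can be sketched with basic circuit arithmetic: for a fixed delivered power, raising distribution voltage lowers current, and conduction losses fall with the square of the current. The resistance and power figures below are hypothetical illustrations, not NVIDIA or Crusoe specifications.

```python
# Illustrative sketch of why higher distribution voltage cuts resistive
# losses — one motivation for the 800V architectures discussed in the
# episode. All numeric values here are assumptions for illustration.

def conduction_loss_watts(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """I^2 * R loss for delivering power_w at voltage_v through resistance_ohm."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

RACK_POWER_W = 140_000          # ~140 kW rack, a figure mentioned in the episode
BUSBAR_RESISTANCE_OHM = 0.001   # hypothetical distribution-path resistance

loss_54v = conduction_loss_watts(RACK_POWER_W, 54, BUSBAR_RESISTANCE_OHM)
loss_800v = conduction_loss_watts(RACK_POWER_W, 800, BUSBAR_RESISTANCE_OHM)

print(f"54 V loss:  {loss_54v / 1000:.1f} kW")
print(f"800 V loss: {loss_800v / 1000:.3f} kW")
print(f"Reduction factor: {loss_54v / loss_800v:.0f}x")  # scales as (800/54)^2
```

For the same path resistance, moving from 54V to 800V cuts conduction loss by roughly (800/54)², about 220x, which is why voltage, not just cooling, is a co-design lever.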
Modularization & System-Level Thinking
[11:12] Gio Albertazzi / [12:39] Brad
- “Think systems, not just components.”
- Data centers are being reimagined from individual hardware pieces to modular, prefabricated, and integrally designed systems (e.g., VertivOne Core).
- The next few years will see faster, more radical change in data center construction and operation.
The Role of Policy & National Urgency
[12:39] Brad / [13:45] Olivier Blum
- The US is moving quickly, but China’s speed and scale set a high bar.
- Leadership and government are now treating AI infrastructure as a national security issue, aiming to cut red tape and accelerate buildout.
- There’s a shift to collaborative “ecosystem” models—gov’t, large incumbents, and startups working hand-in-hand.
Memorable Quote:
“I think more has happened in the past two years than in the past 32.”
— Olivier Blum (on the speed of current change) [13:45]
The Vital Role of Startups & Flexible Innovation
[16:05] Krishna Janalagogada / [18:22] Panel
- Startups are crucial for hardware and software innovation (grid management, power electronics, real-time software intelligence for grid balancing).
- Large companies like GE Vernova and Schneider Electric are opening up to startup collaboration, fostering a dynamic, open innovation ecosystem.
- The next generation of data centers involves not just hardware but also software and AI-driven operating frameworks.
Memorable Quote:
“There needs to be a base of startups, inventors, creative minds, engineers out there that define what the new technology will be.”
— Gio Albertazzi [19:27]
Data Center Innovation: Power, Cooling, and Software
[20:44] Chase Lockmiller
- Data center power densities are skyrocketing: from 2–4 kW/rack twenty years ago to 130–140 kW/rack today, soon up to 1 MW racks.
- Innovations are needed across cooling architecture, networking, and systems software.
- Operating the “AI factory” efficiently is as much a software challenge (data movement, memory optimization) as it is hardware.
Notable Statistic:
“These are 1 megawatt racks. That’s a thousand homes’ worth of power in a single rack.”
— Chase Lockmiller [20:56]
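The "thousand homes in a single rack" comparison follows from simple arithmetic, assuming roughly 1 kW of average continuous draw per US household (an approximation, not a figure from the episode):

```python
# Quick arithmetic behind the rack-vs-homes comparison. The per-home
# average draw is an assumption (~1 kW continuous for a US household);
# rack figures come from the power-density ranges cited in the episode.

AVG_HOME_DRAW_KW = 1.0

rack_generations_kw = {
    "circa 2005": 3,      # within the 2-4 kW/rack range cited
    "today": 140,         # within the 130-140 kW/rack range cited
    "next-gen": 1_000,    # 1 MW racks
}

for era, kw in rack_generations_kw.items():
    homes = kw / AVG_HOME_DRAW_KW
    print(f"{era:>10}: {kw:>5} kW/rack ~ {homes:,.0f} average homes")
```

Under that assumption, a 1 MW rack does indeed map to about a thousand average homes, and the jump from 2005-era racks is roughly 300x in density.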
The NVIDIA Ecosystem: Unique, Systemic Innovation
[23:02] Brad & Panel
- Panelists discuss deep partnerships with NVIDIA on:
- Panelists discuss deep partnerships with NVIDIA on:
- 800V DC architectures
- Liquid cooling transitions
- Reference design for gigawatt-scale AI factories
- Software stack innovations (e.g., Lepton, Dynamo for memory & inference efficiency)
- Powergen and grid management whitepapers/reference architectures
- Unlike other chip companies, NVIDIA unifies a vast ecosystem, leveraging both its own 35,000+ headcount and a global network of partners/startups.
Memorable Quote:
“NVIDIA…has helped unlock this collective intelligence of society, and the startup ecosystem to big enterprises.”
— Chase Lockmiller [26:19]
Systems-level Challenges: Power Swings and Grid Dynamics
[30:19] Chase Lockmiller
- Gigawatt-scale “AI factory” data centers create rapid, large power swings ("load oscillations")—a challenge for onsite generation and utilities.
- System-wide partnerships (utilities, battery storage, software for load smoothing) are key to solving these new problems.
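The load-smoothing idea can be sketched in a few lines: a battery absorbs deviations from the facility's baseline draw so the grid sees a much flatter load. The load shape, amplitude, and battery sizing below are invented for illustration, not figures from the episode.

```python
# Minimal sketch of battery-based load smoothing for an oscillating
# AI-factory load. All magnitudes are hypothetical illustrations.

import math

STEPS = 120                  # one simulated step per time unit
BASE_MW = 800.0              # steady-state facility draw (hypothetical)
SWING_MW = 150.0             # oscillation amplitude (hypothetical)
BATTERY_LIMIT_MW = 140.0     # max battery charge/discharge rate (hypothetical)

# Synthetic oscillating facility load, e.g. synchronized training steps.
facility = [BASE_MW + SWING_MW * math.sin(2 * math.pi * t / 10) for t in range(STEPS)]

grid = []
for load in facility:
    # Battery absorbs/injects the deviation from baseline, up to its power limit.
    correction = max(-BATTERY_LIMIT_MW, min(BATTERY_LIMIT_MW, load - BASE_MW))
    grid.append(load - correction)

facility_swing = max(abs(l - BASE_MW) for l in facility)
grid_swing = max(abs(g - BASE_MW) for g in grid)
print(f"Facility swing: +/-{facility_swing:.0f} MW")
print(f"Grid-side swing: +/-{grid_swing:.0f} MW")
```

Even this crude clamp cuts the swing seen by the utility by an order of magnitude; real deployments also have to manage the battery's state of charge and response latency, which this sketch ignores.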
The Scale of the Opportunity
[31:48] Brad
- The private sector is investing roughly $4 trillion over five years — about 10x the inflation-adjusted cost of the Manhattan Project.
- The decentralized, coordinated approach led by NVIDIA is unlike anything seen globally.
Memorable Quote:
“Jensen has said we’re going to spend $4 trillion over the next five years…the speed of that is…commensurate with the Manhattan Project.”
— Brad [31:48]
Timeline of Important Segments
- 00:48–01:11 — Framing AI’s transformation of the US economy, the data center as the “new factory”
- 02:23–03:32 — The power/cooling scaling inflection point (Gio Albertazzi)
- 03:57–05:16 — Leveraging AI for energy efficiency, automation, and digital twins (Olivier Blum)
- 05:16–08:15 — Multi-type power/gen + grid bottlenecks; emergence of energy “abundance” via all tech (Krishna Janalagogada)
- 08:15–12:39 — Extreme co-design, 800V architecture, modularization, “AI factory” evolution (Chase Lockmiller, Gio Albertazzi)
- 13:45–15:23 — Ecosystem collaboration with government to accelerate, national security stakes (Olivier Blum)
- 16:05–19:27 — Startups driving innovation in hardware/software, grid, and efficiency (Krishna Janalagogada, Olivier Blum, Gio Albertazzi)
- 20:44–23:02 — Data center tech leaps: 1MW racks, software-defined efficiency, new cooling/networking (Chase Lockmiller)
- 23:02–30:19 — NVIDIA’s flywheel: Partnership examples, unique ecosystem, software/hardware co-innovation (Panel)
- 30:19–32:50 — System integration challenges (power swings, utility interfacing); historic scale and coordination (Chase Lockmiller, Brad)
Notable Quotes by Timestamps
- “We are at a very important inflection moment in this juncture.”
  — Gio Albertazzi [02:23]
- “The type of technology you have now through AI helps you to make energy more efficient…”
  — Olivier Blum [04:09]
- “Energy abundance is really an all of the above kind of answer right now.”
  — Krishna Janalagogada [06:54]
- “The process of taking electrons and turning them into tokens is just one of humanity’s greatest opportunities and challenges over the next decade.”
  — Chase Lockmiller [08:50]
- “You have to be an innovator and scale. Innovation happens organically [and] inorganically. There needs to be a base of startups, inventors, creative minds, engineers out there that define what the new technology will be.”
  — Gio Albertazzi [19:27]
- “These are 1 megawatt racks. That’s a thousand homes’ worth of power in a single rack.”
  — Chase Lockmiller [20:56]
- “NVIDIA…has helped unlock this collective intelligence of society, and the startup ecosystem to big enterprises.”
  — Chase Lockmiller [26:19]
- “Jensen has said we’re going to spend $4 trillion over the next five years…Private industry is going to spend 10x what we spent on the Manhattan Project.”
  — Brad [31:48]
Closing Takeaways
- AI infrastructure is now central to economic and national security; the US is scaling rapidly but must coordinate (public/private) to keep pace globally.
- Efficiency and sustainability are as vital as expansion; AI is both a driver and enabler of this transformation.
- Innovation depends on collaboration—from startups to hyperscalers, across all layers (hardware, grid, software, cooling) and functions.
- NVIDIA’s ecosystem-centric approach is uniquely catalyzing progress, pushing both technological and organizational speed.
- We are at the dawn of a new industrial revolution, where the “AI Factory” reshapes not just computation but the entire chain of infrastructure — from electrons, to racks, to algorithms, to economic output.
For more on this topic and related episodes, visit NVIDIA AI Podcast.
