Bloomberg Tech — Special Episode: Here’s Why AI Costs Still Worry Investors
Date: November 22, 2025
Host: Stephen Carroll (Bloomberg)
Guest: Tom MacKenzie (Bloomberg Tech Europe anchor)
Episode Theme:
A deep dive into the mounting costs of Artificial Intelligence (AI) infrastructure—particularly in data centers and AI chips—and why investors remain cautious despite the sector’s massive growth and optimism.
Episode Overview
In this special crossover episode from Bloomberg’s “Here’s Why,” Stephen Carroll is joined by Tom MacKenzie to address a critical, often underplayed topic: the soaring and rapidly depreciating costs associated with building and maintaining AI infrastructure. While the AI boom has sparked bullish forecasts and massive investments from tech giants, beneath the surface, questions linger about sustainability, asset depreciation, and whether this rapid expansion echoes past bubbles in tech history.
Key Discussion Points & Insights
1. The AI Boom and Exponential Growth
- The episode opens with an analogy: “It’s 10:30pm in this AI party. It started at 9pm and that party goes to 4am…”—implying we’re still early, but the clock is ticking on current AI euphoria.
- AI is described as “a qualitative leap” (01:03), resulting in exponential adoption and use across sectors.
2. Investor Worries: The Hidden Costs
- Stephen Carroll sets up the central tension: despite booming results, investors worry about running costs—namely, the short lifespan and rapid depreciation of expensive AI chips used in data centers.
- Tom MacKenzie: “We are putting mostly chips, silicon into these data centers that have a lifespan of perhaps four years.” (01:36)
- The rapid cadence of innovation drives obsolescence: “Even Nvidia, there’s a new chip every 18 months and it’s 10 times as powerful as the earlier ones.” (01:44)
- The risk: Huge capital is poured into assets (chips, data center infrastructure) that may not hold value long enough to pay off—output and adoption may not keep pace.
3. Dot-Com Parallels and Depreciation Concerns
- Tom MacKenzie references Michael Burry (of “The Big Short” fame) who has taken large short positions against Nvidia and Palantir due to:
- The potential for rapid depreciation of high-value chips (“AI accelerators”).
- Not all tech giants “are properly accounting for how quickly these assets depreciate.” (02:47)
- Concerns draw a direct analogy to the late-1990s dot-com bubble:
- Telecom infrastructure companies “spent huge amounts of money… and ended up losing a lot of money because the gains didn’t come as quickly, the technology didn’t evolve as rapidly as they had expected.” (03:28)
- Some winners (like Amazon) did emerge—but for most, capital was burned, not paid back.
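The accounting concern raised here can be made concrete with straight-line depreciation. A minimal sketch follows, assuming a hypothetical $10B chip fleet (the purchase figure is invented for illustration; the 4-year and 6-year lifespans are the two estimates discussed in the episode):

```python
# Hypothetical illustration of the depreciation dispute discussed in the
# episode: a 4-year chip lifespan (the skeptics' view) versus the roughly
# 6-year lifespan Nvidia argues for. The $10B fleet cost is invented.

def annual_depreciation(cost: float, lifespan_years: int) -> float:
    """Straight-line depreciation: equal expense each year of useful life."""
    return cost / lifespan_years

fleet_cost = 10_000_000_000  # $10B of AI accelerators (hypothetical figure)

expense_4yr = annual_depreciation(fleet_cost, 4)  # $2.5B/year
expense_6yr = annual_depreciation(fleet_cost, 6)  # about $1.67B/year

# Stretching the schedule from 4 to 6 years reduces the annual expense
# charged against profits, which is why lifespan assumptions matter:
print(f"Annual expense difference: ${expense_4yr - expense_6yr:,.0f}")
```

On these assumed numbers, the longer schedule trims reported annual expense by over $800M per $10B spent, which is the mechanism behind the claim that some hyperscalers may be understating how quickly these assets lose value.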
4. Industry Pushback: Longer Chip Lifespans?
- Nvidia, led by CEO Jensen Huang, contends that older chips (e.g., Hopper) are “very versatile” and have “a lifespan of about six years…fully utilized by most of the companies that own those.” (04:41)
- Chips can be repurposed from training to inference, expanding their usefulness.
- Bain Capital’s analysis:
- By 2030, AI hyperscalers need to generate $2 trillion/year in revenue to justify current investment rates (05:40).
- There remains a “huge gap, hundreds of billions of dollars… between investments into AI infrastructure and the actual revenues” (05:56).
5. The “AI Agent” Bet and Sovereign AI Investments
- Tech giants predict the next phase: proliferation of “agentic AI”—AI-powered agents managing everything from holiday bookings to healthcare logistics, increasing integration into daily enterprise tasks (06:50).
- Major sovereign investments are underway, with regions such as the Middle East, Europe, and Japan seeking their own AI infrastructure. Early days, but “all those kind of things are going to come together.” (07:14)
- The crucial question: Investors’ patience—“Will they continue to invest in the hyperscalers if they’re not seeing real material returns?” (07:32)
- Bloomberg Intelligence projects real scrutiny by end of 2026, a potential inflection point for continued investment.
6. The Cash Pile Paradox: Are the Tech Giants Really at Risk?
- Carroll asks why there’s concern given the giants’ huge cash reserves: “Why is there concern at all about how they’re going to pay for this?” (07:40)
- MacKenzie: “When we talk about hyperscalers, these are companies with massive balance sheets and huge cash reserves. These are incredibly profitable businesses… These are not non-profitable, major punts and risky parts of the market.” (07:56)
- However, the risk falls on smaller players (“neoclouds” that lease out data-center capacity, such as CoreWeave and nClouds) and even loss-making AI startups like OpenAI and Anthropic.
- Some big names (even profitable ones) are now raising debt (“tapping the public markets”) to finance infrastructure, which raises questions about long-term leverage and risk tolerance (08:43).
- Circular deals (e.g., Nvidia invests in OpenAI, which buys chips from Nvidia) add complexity: “All of these companies become increasingly enmeshed and intertwined… on what is a bet on the future.” (09:26)
Notable Quotes & Memorable Moments
- On the tech cycle’s late hour:
  “It’s 10:30pm in this AI party…almost every investor knows it’s all going to turn into pumpkins and mice at midnight. Only as Buffett would say, no one in the room has a clock.”
  —Stephen Carroll, 01:51
- On the core depreciation issue:
  “We are putting mostly chips, silicon into these data centers that have a lifespan of perhaps four years.”
  —Tom MacKenzie, 01:36
- The bull-bear split:
  “Michael Burry making the argument that companies, the hyperscalers…are not properly accounting for how quickly these assets depreciate.”
  —Tom MacKenzie, 02:47
- Historical perspective:
  “Comparisons…with what happened in the late 1990s…the dot com bubble, when…telecom equipment makers…spent huge amounts of money on building infrastructure…ended up losing a lot of money because the gains didn’t come as quickly…”
  —Tom MacKenzie, 03:28
- On chip lifespans and repurposing:
  “Their older AI chips, one of their older versions is called Hopper, has a lifespan of about six years and is very versatile…”
  —Tom MacKenzie quoting Nvidia, 04:41–05:00
- The coming crunch:
  “By 2030, the hyperscalers and other AI giants would have to be turning around revenues of about $2 trillion. And…right now, there’s a huge gap…between investments into the AI infrastructure and the actual revenues…”
  —Tom MacKenzie, 05:45–06:00
Timestamps for Key Segments
- 00:42–01:24 — Introduction to AI’s rapid growth and the “party” metaphor
- 01:36–02:27 — AI chip depreciation, Burry’s concerns, and dot-com parallels
- 04:30–05:40 — AI industry’s response; repurposable chips and longer lifecycles
- 05:40–06:55 — The revenue vs. investment gap; necessity of finding “go-to-market” fit; future AI use cases
- 07:40–09:26 — Who really faces risk? Breakdown of company types and the circular nature of AI partnerships
Conclusion & Takeaways
- Despite AI’s breathtaking ascent and the confidence of major tech players, cracks are appearing in the investment story—centered on soaring infrastructure costs, asset depreciation, and uncertain near-term business models.
- For the “hyperscalers,” balance sheets are strong, but even they’re beginning to borrow heavily and engage in circular financing.
- The real crunch is projected for the end of 2026, when investors will expect hard proof that massive spending translates into sustained, significant revenue.
- For now, the AI party rages on, but “almost every investor knows it’s all going to turn into pumpkins and mice at midnight”—the only question is when.
For more explanations and deep dives, visit Bloomberg’s explainer archive at Bloomberg.com/explainers.
