Odd Lots Podcast Summary
Episode: Ray Wang on How AI Is Causing DRAM Prices to Surge
Date: February 16, 2026
Hosts: Joe Weisenthal & Tracy Alloway
Guest: Ray Wang, Analyst at SemiAnalysis
Episode Overview
This episode explores how the explosive growth of AI is fueling a massive surge in DRAM (Dynamic Random Access Memory) prices and triggering acute shortages in the memory chip market. Hosts Joe and Tracy are joined by Ray Wang, an analyst at SemiAnalysis, who delves into the supply-demand imbalance, the impact on consumer and enterprise technology, and the unique characteristics of the current ‘super cycle’ in the memory market. The discussion touches on cyclical trends, corporate strategies, the potential for demand destruction, and analogies to past commodity booms.
Key Discussion Points & Insights
The Immediate Crisis: Why Are DRAM Prices Surging?
- Supply Constraints and Demand Boom
- The supply response in DRAM has been slow due to conservative investment following the COVID-era upcycle and subsequent downcycle (05:04).
- Companies were cautious about adding capacity, anticipating a demand drop after COVID, but “the demand was accelerating so fast” due to AI that the industry was caught unprepared (05:04-06:35).
- Specialized AI accelerators, especially those using HBM (High Bandwidth Memory), are extremely wafer- and production-intensive, consuming capacity that would otherwise go to commodity DRAM (06:36-07:43).
- Shift to HBM and Its Impact
- HBM is a special kind of DRAM tailored for AI’s bandwidth-hungry workloads.
- "On the same wafer basis you can produce three more bits if you do commodity DRAM, but you can only produce one bit of HBM." — Ray Wang (06:07)
- Fabrication lines are dedicating more capacity to HBM due to its profitability, further limiting commodity DRAM supply.
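The wafer trade-off Ray describes can be sketched with a toy calculation: if a wafer yields roughly three commodity-DRAM bits for every one HBM bit, shifting fab share toward HBM shrinks total bit output even though wafer starts stay flat. The numbers below are illustrative assumptions, not real fab data.

```python
# Toy model of the 3:1 wafer trade-off between commodity DRAM and HBM.
# All yields and shares are illustrative assumptions.

def total_bits(wafers: int, hbm_share: float,
               dram_bits_per_wafer: float = 3.0,
               hbm_bits_per_wafer: float = 1.0) -> float:
    """Total memory bits produced when `hbm_share` of wafers go to HBM."""
    hbm_wafers = wafers * hbm_share
    dram_wafers = wafers - hbm_wafers
    return dram_wafers * dram_bits_per_wafer + hbm_wafers * hbm_bits_per_wafer

baseline = total_bits(1000, hbm_share=0.10)   # mostly commodity DRAM
shifted = total_bits(1000, hbm_share=0.40)    # fabs chase HBM margins
print(f"bit output falls {(1 - shifted / baseline):.0%}")  # ~21% fewer bits
```

This is why the shortage is "self-reinforcing": AI adds demand for bits while simultaneously reducing the number of bits each wafer produces.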
Memory as a Commodity: The Super Cycle
- Cyclicality and Standardization
- DRAM is highly commoditized due to industry standardization and minimal differentiation; the market operates on thin margins and intense price competition (08:21).
- HBM is less commoditized—its complexity gives producers more room to differentiate based on technology and yields, meaning higher margins (09:52-11:31).
- Commodity Super Cycle Analogy
- The current DRAM surge resembles commodity markets, with references to “super cycles” where demand and supply mismatches cause price spikes spanning several years (07:47-09:32).
- "It really does remind me of the oil industry...there’s a bust in the underlying commodity price...it takes a while to ramp up" — Tracy Alloway (27:04)
The Mechanisms: How Memory Is Purchased & Allocated
- Corporate Buying & Negotiation
- Device manufacturers enter forward contracts, often renegotiated quarterly. In today's market, "securing volume" is the top priority, even over price (13:01).
- Allocation in a Shortage
- Top-tier customers (e.g., hyperscalers and leading AI labs) get first priority for new memory supplies (31:06).
- Commodity consumer sectors like smartphones and PCs must manage with price hikes, de-specced (downgraded) products, or outright shortages.
AI’s Explosive Memory Needs—And Why
- Training vs. Inference
- AI training “needs tons of HBM...tons of CPU DRAM.” But even AI inference is “super memory bound and memory intensive” due to expanding context windows and growing user bases (14:43-16:08).
- "If you use ChatGPT...now people are doing, can you write a 20-page report...that's significantly longer response that you’re getting." — Ray Wang (17:04)
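Why inference is "memory bound" can be made concrete: a transformer's KV cache grows linearly with context length, so a "20-page report" request holds vastly more DRAM/HBM per user than a short chat. The model dimensions below are illustrative assumptions, not any specific model's real configuration.

```python
# Rough sketch of KV-cache growth with context length.
# Layer counts, head counts, and dimensions are illustrative assumptions.

def kv_cache_bytes(seq_len: int, n_layers: int = 32, n_kv_heads: int = 8,
                   head_dim: int = 128, bytes_per_elem: int = 2) -> int:
    """Bytes of KV cache for one sequence: 2 tensors (K and V) per layer."""
    return 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem * seq_len

short = kv_cache_bytes(2_000)      # a chat-sized prompt
long = kv_cache_bytes(100_000)     # a "20-page report" style context
print(f"{short / 2**30:.2f} GiB -> {long / 2**30:.2f} GiB per sequence")
```

Multiply the long-context figure by millions of concurrent users and the scale of AI's memory appetite becomes clear.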
- Wider Usage Scale
- AI applications have gone from small, fun use-cases to critical infrastructure (“token consumption has grown massively” — Joe, 18:08).
Demand Destruction: Who Loses Out?
- Consumer Electronics Hit First
- Rising prices and shortages prompt device makers to cut forecasts (e.g., MediaTek’s mobile outlook down 10-15% for 2026) and delay product launches (19:10-20:49).
- Ray notes that camera modules are also feeling the squeeze, with some makers delaying launches as a result (20:49).
- Impact on Gamers and Everyday Consumers
- Nintendo, Apple, and other high-profile brands are already taking margin hits or hiking prices.
- Tracy jokes: “Are we going to get people stripping old Nintendos for memory chips?” (21:19)
Can Efficiency or Recycling Help?
- Design Adjustments Limited
- Downgrading device specs is possible, but makes products less competitive (21:54).
- Real fix must come from supply-side expansions and process migrations to higher-yield memory tech, but these are bottlenecked by fab capacity and slow to ramp up (22:34-24:46).
The Supply Side: Economics and Strategic Choices
- Producer Incentives
- With only a handful of major manufacturers (Samsung, Hynix, Micron), current high prices are highly profitable, making them cautious about rapid expansion (“enjoy massive profits”—Joe, 24:46).
- CapEx Cycles
- Announced capacity expansions (Micron’s new fabs in Singapore and the US) won’t come online meaningfully until 2028 (25:44-27:04).
- Balance Dilemma: HBM vs. Commodity DRAM
- Despite historic high margins for commodity DRAM, suppliers continue to prioritize HBM as the long-term growth platform and point of product differentiation (27:44-29:19).
The China Factor
- Technological Gap
- Chinese DRAM makers lag behind Korean rivals by 3–4 years, with leading Chinese suppliers largely focused on domestic markets (29:27-30:42).
- State policy backs import substitution and HBM development, but high-end capability remains concentrated in Korea and the US.
Distribution Strategies During Shortages
- Sectoral and Customer Prioritization
- Growth sectors (server DRAM and HBM for AI training/inference) are prioritized in allocation—“top, top priority because...it’s more than half of the DRAM market” (31:06).
- Commoditized, low-growth sectors (e.g., mobile) receive lower priority.
The Cloud Future: Will Consumers Stop Owning Compute?
- Personal Devices vs. Cloud
- Joe muses whether we'll need powerful personal devices if cloud AIs do heavy lifting; Ray responds that requirements depend on use-case, and there’s no evidence of full structural shift yet (33:10-35:01).
Is This Time Different? The Memory Super Cycle
- Duration and Novelty
- Unlike past cycles, AI both accelerates demand and removes commodity supply (as fabs switch lines to HBM), creating a prolonged, self-reinforcing shortage (35:32-37:39).
- Previous memory cycles lasted 15–18 months; this one could stretch to 4 years (35:40).
- When Will It End?
- Substantial new capacity (due ~2027–2028) plus aggressive process upgrades might restore balance, but Ray predicts shortages will persist through 2027 (37:17-39:01).
Stockpiling and Panic Buying
- Bullwhip/Panic Behaviors
- Inventory data shows companies are aggressively drawing down stock and preemptively purchasing ahead of further expected shortages (39:06-40:18).
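The bullwhip dynamic behind that inventory data can be sketched with a toy model: if each buyer orders its real demand plus a safety buffer scaled to the recent demand increase, orders placed upstream swing much harder than end demand does. All parameters here are illustrative assumptions.

```python
# Toy bullwhip sketch: orders amplify demand growth because buyers add
# a precautionary buffer on top of each demand increase. Parameters are
# illustrative assumptions, not measured supply-chain data.

def orders(demand: list[float], buffer_factor: float = 0.5) -> list[float]:
    """Orders = demand + a buffer proportional to the *increase* in demand."""
    placed, prev = [], demand[0]
    for d in demand:
        placed.append(d + max(0.0, d - prev) * (1 + buffer_factor))
        prev = d
    return placed

end_demand = [100, 100, 110, 125, 140, 150]   # steadily growing AI demand
print(orders(end_demand))  # upstream orders overshoot each demand step
```

The overshoot is exactly the "preemptive purchasing" the inventory data shows: suppliers see demand spikes larger than the true consumption underneath.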
Effects on Hyperscalers and Tech Giants
- Price Sensitivity
- For hyperscalers, DRAM prices represent a small share of total capital expense; while price increases are a “meaningful driver,” they don’t fundamentally alter hyperscaler investment plans (40:18-42:00).
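The price-sensitivity point reduces to simple arithmetic: a large price move on a small capex share moves total spend only modestly. The DRAM share figure below is an illustrative assumption, not a number from the episode.

```python
# Back-of-the-envelope: how much does a DRAM price move shift total
# hyperscaler capex? The 8% share is an illustrative assumption.

def capex_impact(dram_share: float, dram_price_change: float) -> float:
    """Fractional change in total capex from a DRAM price move."""
    return dram_share * dram_price_change

# If DRAM were ~8% of capex and prices doubled (+100%):
print(f"total capex rises {capex_impact(0.08, 1.00):.0%}")
```

A "meaningful driver," as Ray puts it, but small next to spending on accelerators, land, and power, which is why investment plans hold.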
Notable Quotes & Memorable Moments
- On supply-demand imbalance:
- “The capacity couldn’t catch all the demand...your incremental wafer capacity...is actually quite limited.” — Ray Wang (05:04)
- On HBM’s fabrication intensity:
- “On the same wafer basis you can produce three more bits if you do commodity DRAM, but you can only produce one bit of HBM.” — Ray Wang (06:07)
- On AI’s surging memory needs:
- “For inference...right now the most important thing will be decode, right? And decode is super memory bound and memory intensive...the memory importance...will only increase.” — Ray Wang (14:43)
- On crowding out ordinary users:
- “It is like a fiscal boom...the pace of spending is so furious...when you start upsetting the gamer community...the visceral reality that various resources they thought they could get abundantly...nope, we’ve switched this line over to the data centers, over to AI.” — Joe Weisenthal (42:29-44:11)
- On the uniqueness of this super cycle:
- “We rarely see in a super cycle that there’s a new demand driver coming online that not only drives demand, but also constrains supply...that’s the key difference we’re seeing.” — Ray Wang (35:40)
Timestamps for Important Segments
- [04:44] – Introduction to the memory chip shortage and AI’s role
- [05:04] – Background on the current supply-demand crisis
- [06:07] – HBM vs. commodity DRAM & wafer constraints
- [08:21] – DRAM as a commodity: commoditization and cyclicality
- [09:52] – How HBM differs in complexity, margin, and strategy
- [13:01] – How manufacturers buy and secure memory chips
- [14:43] – Why AI is so memory-hungry: training vs. inference
- [19:10] – Evidence of demand destruction: consumer devices, price hikes
- [21:19] – Joking about stripping old devices for chips
- [22:34] – Technical and logistical barriers to increasing supply
- [24:46] – Producer incentives and capital expenditure cycles
- [27:44] – The balancing act: continuing HBM investment even as commodity margins grow
- [29:27] – How Chinese producers compare to the Koreans
- [31:06] – Supply allocation: priority sectors and customers
- [33:10] – The future: Could users do everything in the cloud?
- [35:32] – Is this a “super cycle” or a structural transformation?
- [37:17] – What could bring the super cycle to an end?
- [39:06] – Signs of panic buying and stockpiling
- [40:18] – Do memory prices move the dial for tech giants?
- [42:29] – Macro, market, and societal impacts of the shift
- [44:11] – The physical impact of AI’s “virtual” needs on real-world resources
Final Thoughts
The episode provides a comprehensive, accessible look at why DRAM prices are skyrocketing in the age of AI. Exponential demand from AI training and inference has collided with slow-growing supply, unique fabrication challenges, and entrenched industry cycles. These forces are fueling a memory super cycle reminiscent of classic commodity booms. Companies, consumers, and even governments are all feeling the “crowding out” as prized AI applications soak up ever more computational resources, with far-reaching effects across the global economy.
Recommended for:
- Investors, tech industry watchers, engineers, and anyone curious why “invisible” AI has such a tangible effect on their gadgets—and their wallet.
