Prof G Markets — "Why Oracle is Crashing Right Now"
Date: December 16, 2025
Hosts/Guests: Ed Elson (Host), Gil Luria (Head of Technology Research, DA Davidson), Helen Toner (Executive Director, Georgetown Center for Security and Emerging Technology)
Episode Overview
This episode explores the dramatic drop in Oracle’s stock price following an earnings miss and deepening controversy around its massive, highly publicized deal with OpenAI. Ed Elson digs into what triggered the selloff, Oracle’s debt situation, and what this reveals about the broader AI bubble, with insights from Gil Luria. In the second half, Helen Toner joins to unpack President Trump’s new executive order blocking state-level AI laws and to assess the political and regulatory landscape for AI in America.
Oracle's Huge Stock Drop: Dissecting the Crisis
Segment Start: 02:07
The Setup: Market Volatility & Oracle’s Tumble
- Ed Elson recaps a jittery market: “Bitcoin dropped back below $86,000. Broadcom suffered its worst three day selloff since 2020. ... Oracle stock has taken a serious tumble since last week's earnings miss.” (02:07)
- Oracle’s share price is off 40% from its September peak, following a missed earnings report and delays in key data center builds linked to OpenAI.
Main Concerns Behind the Selloff
Guest: Gil Luria, Head of Technology Research, DA Davidson
- Luria points to the enormous risk Oracle took with its OpenAI deal:
- “OpenAI promised them, we’re going to spend $300 billion with you over the next five years. …But we found out…OpenAI was making a lot of commitments to a lot of companies, $1.4 trillion in total, which…they didn’t actually intend to live up to.” (03:45)
- Oracle must borrow heavily to build data centers for contracts that may never materialize, pushing its leverage to the limit.
- The company insists on keeping an investment-grade rating and a debt-to-EBITDA ratio below 3.5x, but “we’re going to have to find creative solutions to do that.” (04:55) (A rough arithmetic sketch of this target follows this list.)
- Luria highlights the “uncomfortable position where it’s unclear how they can proceed moving forward, building for a customer that may or may not materialize.” (05:31)
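To make the leverage target concrete, here is a back-of-envelope sketch in Python. It uses only two figures mentioned in the episode (the roughly $116 billion of long-term debt cited later in the discussion and the sub-3.5x debt/EBITDA target); the actual covenant definition (gross vs. net debt, lease treatment, trailing vs. forward EBITDA) is not specified in the episode, so treat this purely as an illustration.

```python
# Back-of-envelope illustration of the leverage target discussed in the episode.
# The exact covenant definition (net vs. gross debt, lease treatment, trailing
# vs. forward EBITDA) is assumed; the figures are the ones quoted on the show.

long_term_debt_bn = 116.0   # ~$116B long-term debt, per the episode
max_leverage = 3.5          # stated target: debt / EBITDA below 3.5x

# Minimum annual EBITDA needed to stay under the target at the current debt load
ebitda_floor_bn = long_term_debt_bn / max_leverage
print(f"EBITDA floor for debt/EBITDA < {max_leverage}x: ~${ebitda_floor_bn:.0f}B per year")
# -> roughly $33B per year, before any new borrowing for the data center buildout
```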
Deeper Dive: Why the Panic Now?
- Elson observes: “All of that stuff sounds like stuff we kind of already knew... Is there something that made people believe all that? Maybe more strongly or some number that particularly stood out?” (05:42)
- Luria explains:
- Investors punished Oracle not just for the revenue miss, but because it failed to keep pace with its own data center buildout promises, undercutting its value proposition against Amazon, Google, and Microsoft (06:24).
- Crucially, Oracle doubled down on optimism in its earnings call, rather than acknowledging the risk of the OpenAI contract. “Instead of…own[ing] up…they got entrenched in this notion that everything’s fine... I think that in itself caused unease.” (07:19)
The Debt Spiral & Credit Default Swaps
- Elson points out: “Long term debt increased to $116 billion, up 44% from a year ago. ...Credit default swap spreads…have hit an all time high.” (07:45)
- Luria: Oracle’s leases plus debt represent “a colossal amount”: $246 billion in operating leases on top of $108 billion of debt. (A rough arithmetic sketch of these figures follows this list.)
- “That tells us that their debt is now worth less than when they issued it… next time they have to raise debt, it’s going to be harder and more expensive.” (08:16)
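A similarly rough sketch of the figures in this segment: backing out last year’s debt from the “up 44%” comparison and adding the lease and debt figures Luria cites. Treating the 44% as a simple year-over-year increase and summing leases with debt are both simplifications, not the company’s own accounting.

```python
# Rough arithmetic on the debt figures quoted in this segment (illustrative only).

long_term_debt_bn = 116.0    # current long-term debt, per Elson
yoy_increase = 0.44          # "up 44% from a year ago"
prior_year_debt_bn = long_term_debt_bn / (1 + yoy_increase)
print(f"Implied long-term debt a year ago: ~${prior_year_debt_bn:.0f}B")  # ~$81B

operating_leases_bn = 246.0  # operating lease commitments cited by Luria
debt_bn = 108.0              # debt figure cited by Luria
print(f"Leases plus debt: ~${operating_leases_bn + debt_bn:.0f}B")        # ~$354B
```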
Oracle as an AI Bubble Proxy
- “It really seems as though Oracle has become kind of the proxy for the whole AI bubble...” (09:10)
- Luria: Until Oracle “gets out of this predicament…renegotiating with OpenAI to something more reasonable, investors are going to be very cautious…” (10:09)
- “Oracle was misled by OpenAI and then Oracle misled investors. There’s always a discount when you mislead investors, and that tends to linger for more than a couple of months.” (10:48)
Spillover to OpenAI’s Reputation
- Elson: “Does this mean OpenAI, if it were trading, would be down like 50% at this point?”
- Luria: “Most likely. ...Within the next couple of weeks, we’re going to find out how much OpenAI was able to raise. If they raise $5 billion or $10 billion, that's not nearly enough… But if they show up…with $100 billion…there’s going to be a big relief in the entire sector. I suspect…we may have the first scenario, not the second...” (11:28)
Notable Quote
- Gil Luria, on the AI contract mania:
“OpenAI was making a lot of commitments to a lot of companies, $1.4 trillion in total, which…they didn’t actually intend to live up to.” (03:49)
Federal vs. State AI Regulation: Trump’s New Executive Order
Segment Start: 15:33
The Move: Trump’s Executive Order
- President Trump signs an EO aimed at blocking state AI laws to create a unified federal framework (15:33).
- Ed Elson brings in Helen Toner for analysis.
How the Executive Order Works (or Doesn’t)
Guest: Helen Toner, Georgetown CSET
- “This is not the first move…the Trump administration did not do this out of the blue…” An earlier attempt at federal preemption failed in the Senate by a near-unanimous vote.
- The EO is “really trying to do an end run around Congress…trying to intimidate states, it’s trying to threaten legal action against them. But it’s all going to come down to messy court fights…” (16:26–18:24)
Should AI Be Regulated at the State or Federal Level?
- Toner: “There is no one approach…because AI is so many different things.” (18:56)
- State regulations often make sense for how state governments use AI.
- For national security, federal oversight may be better—but states may need to act when Congress won’t.
- “I tend to think that the laboratory of democracy idea is valuable and is a sort of foundational principle of how the United States works.” (19:52)
A Unified Approach? Lessons from China
- Trump: “China has one vote because they have one vote, and that’s President Xi. He says, do it. And that’s the end of that.” (20:32)
- Toner rebuts: “In China I think it’s not quite that simple...But of course it is an authoritarian system and Xi Jinping does wield an enormous amount of power. … fascinating to see this from the leader of the party that for so long has protected states rights…” (21:23)
Political and Public Attitudes on AI
- Elson: “Americans do not like AI, increasingly so. And so it does seem striking that Trump…is seemingly taking the other side of this. He seems to be siding with Silicon Valley in a lot of ways...” (22:29)
- Toner:
- “It’s really striking…how negative public sentiment about AI is…People are very negative on self driving cars, for example, which I think are an incredible technology when you look at the safety data and how many lives…could be saved.” (23:30)
- “I think banning state legislatures from creating any rules…is not how you create that confidence. So I definitely think there’s a chance that we’ll see significant backlash to AI over the next few years. And this isn’t going to help prevent that.” (23:50)
Notable Quotes
- Helen Toner, on regulating fast-moving AI:
“Do you then say we should tie the states’ hands and they shouldn’t be allowed to do anything while D.C. goes ahead and does nothing? And that’s…I think, where you have some legitimate disagreement.” (19:38)
- On Trump’s stance:
“It’s fascinating to see this from the leader of the party that…has protected states rights…But of course the EO has now gone through and the federal government is going to move to execute on it.” (21:42)
IPO Update: Fermi’s Collapse
Segment Start: 24:32
- Ed Elson recaps Fermi, an AI data center company that IPO’d in October at a roughly $19 billion valuation, and revisits his earlier prediction: “We said that it was a, quote, shit show of a company. ...We said that ultimately the stock would come crashing down.” (24:32)
- Fermi stock is now down nearly 75% after customer contracts fell apart and no real revenue materialized. (A rough valuation sketch follows below.)
- “They haven’t executed anything. They haven’t built anything...have no profits...they still have no revenue. So it appears that Wall Street is beginning to come around on these issues. The stock’s down nearly 75%. We predicted this would be something of a disaster.” (24:50)
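For scale, a quick sketch of what “down nearly 75%” implies, assuming the decline is measured against the roughly $19 billion IPO valuation (the episode does not specify the reference point, so this is an approximation):

```python
# Rough implied valuation for Fermi after the drop described in the episode.
# Assumes the ~75% decline is measured from the ~$19B IPO valuation.

ipo_valuation_bn = 19.0
decline = 0.75  # "down nearly 75%"
implied_valuation_bn = ipo_valuation_bn * (1 - decline)
print(f"Implied valuation after the drop: ~${implied_valuation_bn:.1f}B")  # ~$4.8B
```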
Notable Moment
- Elson: “The only wrinkle, it happened a lot sooner than we thought.” (25:52)
Key Timestamps
| Timestamp | Segment | Key Content |
|-----------|---------|-------------|
| 02:07–05:35 | Oracle’s Tumble – The OpenAI Context | Oracle’s risky OpenAI deal, leverage, investor anxiety |
| 05:42–08:55 | What Triggered the Panic | Data center delays, entrenched leadership, debts on the rise |
| 09:10–11:28 | Oracle as AI Bubble Proxy, OpenAI Spillover | Oracle as symbol for AI sector risk, OpenAI’s funding woes |
| 15:33–18:24 | Trump’s AI Executive Order | National vs. state regulation, political motivations, EO mechanics |
| 18:54–21:23 | How Should AI Be Regulated? | Value of state regulation, national security, realpolitik of federal/centralized rules |
| 21:23–24:27 | Politics & Public Sentiment on AI | Trump’s China comparison, party politics, risks of backlash |
| 24:32–25:52 | Fermi IPO Update | Prof G’s prescient call on an AI data center blowup |
Memorable and Notable Moments
- On Oracle’s Leadership
“Oracle was misled by OpenAI and then Oracle misled investors. There’s always a discount when you mislead investors, and that tends to linger…”
— Gil Luria (10:48)
- On Federal Overreach
“It’s fascinating to see this from the leader of the party that for so long has protected states rights and has understood the importance of that…”
— Helen Toner (21:42)
- On Fermi’s Rapid Crash
“The only wrinkle, it happened a lot sooner than we thought.”
— Ed Elson (25:52)
Conclusion
This episode of Prof G Markets provides a clear-eyed look at how sky-high expectations and questionable contracts have left Oracle facing investor skepticism, and how the cascade of AI dealmaking is impacting the entire sector. In parallel, Trump’s sweeping executive order on AI regulation is raising both constitutional and practical questions, potentially setting the stage for continued political battles and public unease about the pace and direction of artificial intelligence in America.
The tone is incisive, skeptical, and rich in direct insight—indispensable listening for anyone tracking markets, tech leadership, or AI policy.
