The AI Money Trap, Part One

Better Offline

Published: Wed Aug 20 2025

Summary

Podcast: Better Offline (Cool Zone Media & iHeartPodcasts)
Host: Ed Zitron


Episode Overview

In this incisive, energetic episode, Ed Zitron launches the two-part "AI Money Trap" series—a sprawling, deeply skeptical look at the economics underpinning the generative AI industry. Through extended, unfiltered monologues, he dissects how a handful of influential startups have attracted mind-boggling venture capital and ballooned to massive valuations—despite a dearth of profits, unclear business models, and a conspicuous lack of meaningful acquisitions or exits.

Zitron positions the current "AI bubble" as Silicon Valley’s own subprime crisis: overvalued, dependent on capital inflows, and—critically—failing to answer the fundamental question, "where is the money going?" He zeroes in on case studies like OpenAI, Anthropic, and Cursor to expose paradoxical economics and systemic fragility, asking whether any of these companies can survive, let alone thrive, without endless investor largess.


Key Discussion Points & Insights

The Looming End of the AI Bubble

Timestamp: 01:54–04:00

  • Ed frames the AI bubble as structurally unsound, predicting that it will not deflate gracefully. Instead, collapse will occur through "burps and farts," with both people and innovation as collateral damage.
    • Quote: "The only way the AI bubble ends is, well, badly… there are going to be abrupt little farts, little burps as the bubble begins letting out air…" (Ed Zitron, 02:09)
  • Unlike the 2008 financial crisis, AI firms are non-essential businesses with no government safety net.

Data Center Mania & Questionable Funding

Timestamp: 04:02–06:40

  • Reputable financial outlets (WSJ, FT) have begun questioning whether the explosion in data centers is itself a bubble.
  • OpenAI’s rumored $500 billion valuation, despite "secured" funding that is not fully deployed, is called "fucking stupid."
    • Quote: "That's around the price of Netflix. This is fucking stupid. It's so stupid and every time I read about it I feel a little insane." (Ed Zitron, 05:30)
  • Huge sums are allegedly being incinerated on compute and data centers; actual destinations remain murky.

Artificially Inflated Revenues & ARR Trickery

Timestamp: 06:41–10:40

  • OpenAI and Anthropic leak "annualized recurring revenue" (ARR) figures based on non-standard calculations (apparently trailing 30-day windows rather than calendar months), creating confusion.
    • Ed’s Searing Analysis: "It's very clear OpenAI is not talking in actual calendar months, at which point we can assume they're using like a trailing 30 day window... They wouldn't have given two vastly different goddamn numbers in the same two day period." (08:15)
  • These figures are reported in ways that line up conveniently with fundraising cycles, triggering skepticism over their reality and reliability.
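The window-choice complaint is easy to see with arithmetic. Below is an illustrative sketch with entirely hypothetical daily revenue figures (not OpenAI's actual numbers): the same steadily growing revenue stream produces materially different "annualized" headline figures depending on which 30-day window gets extrapolated.

```python
# Hypothetical daily revenue, not real company data: shows why a trailing
# 30-day window and a calendar month can yield very different "ARR" claims.

def annualize(window_revenue_usd: float, window_days: int) -> float:
    """Extrapolate a window's total revenue to a 365-day run rate."""
    return window_revenue_usd / window_days * 365

# Made-up daily revenue growing ~1% per day over two months.
daily = [1_000_000 * 1.01**d for d in range(61)]

# A 30-day calendar month: days 0-29.
calendar_month = sum(daily[0:30])

# A trailing 30-day window measured a month later: days 31-60.
trailing_30 = sum(daily[31:61])

print(f"calendar-month ARR: ${annualize(calendar_month, 30)/1e9:.2f}B")
print(f"trailing-30d   ARR: ${annualize(trailing_30, 30)/1e9:.2f}B")
# Under growth, the later trailing window always annualizes higher, so which
# window a company quotes changes the headline number without any new revenue.
```

This is only a toy model of the mechanism Ed describes, but it shows why quoting two different window conventions within days of each other produces "two vastly different goddamn numbers."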

Cursor and Structural Fragility in Generative AI

Timestamp: 13:15–22:40

  • Cursor, an AI-powered coding environment, is presented as a microcosm of the whole sector's instability.

    • Cursor brings in substantial revenue, yet it is entirely dependent on Anthropic and OpenAI’s models—and is hit hard by abrupt price increases for access.
    • "Cursor is the weak point of the entire bubble," says Ed (13:28).
  • Upfront contracts and escalating compute costs have rendered Cursor’s business model unprofitable and fragile.

  • If Cursor fails, upstream providers like Anthropic and OpenAI lose a material chunk of revenue, exposing systemic risk and circular dependencies.

    • Quote: "If Cursor is allowed to die, it will be unable to pay a chunk of Anthropic and OpenAI's revenue and yes, the revenue of people like xAI and Google as well... It also brings into question whether it's possible to build… a business of any kind offering services built on top of generative AI models…" (19:17)
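The squeeze Ed describes is, at bottom, a unit-economics problem: a reseller charges a flat subscription while its marginal cost is set by someone else. A minimal sketch, using entirely made-up numbers (these are not Cursor's actual prices or costs), shows how an upstream price hike flips a seat from profitable to loss-making:

```python
# Hypothetical unit economics for a coding assistant reselling upstream model
# inference. All numbers are invented for illustration only.

def monthly_margin(seat_price_usd: float, tokens_millions: float,
                   cost_per_million_usd: float) -> float:
    """Gross margin per seat: flat subscription minus inference cost."""
    return seat_price_usd - tokens_millions * cost_per_million_usd

SEAT_PRICE = 20.0       # flat monthly subscription (hypothetical)
USAGE_M_TOKENS = 3.0    # millions of tokens a heavy user burns per month

before = monthly_margin(SEAT_PRICE, USAGE_M_TOKENS, cost_per_million_usd=5.0)
after = monthly_margin(SEAT_PRICE, USAGE_M_TOKENS, cost_per_million_usd=8.0)

print(f"margin before upstream hike: {before:+.2f} USD/seat")  # +5.00
print(f"margin after upstream hike:  {after:+.2f} USD/seat")   # -4.00
# A repricing decision the reseller doesn't control turns every heavy seat
# into a loss, while its own subscription price is locked in by contract.
```

The design point is that the reseller holds neither lever: usage is set by customers and cost per token by the model provider, which is why upfront fixed-price contracts make the position so fragile.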

The AI Exit Crisis—No Buyers, No Profits, No IPOs

Timestamp: 28:26–41:40

  • Examines the near-total lack of significant acquisitions or public offerings in generative AI.
    • Only major "real" acquisition: AMD buying Silo AI ($665M, 2024).
    • Most headline-grabbing deals are crafty licensing and talent buyouts (Inflection->Microsoft, Windsurf->Google/Cognition) that leave staff and IP in limbo.
    • Quote: "There just doesn't seem to be an investor with the hunger to buy a company like Cursor valued at $9.9 billion or more if they raise another round. And you have to ask why?" (29:00)
  • Dissects insider deals in which company founders and VCs profit while employees get left behind.
    • E.g., Windsurf’s $2.4B "acquisition" hugely benefited founders and investors, while "a large portion of approximately 250 employees" did not benefit from the deal. (31:55)
  • Highlights the farcical revenue numbers of highly touted AI startups—often making less than regional sports franchises.
    • Quote on Perplexity AI: "A company with 15 million users and around $550 million annualized revenue is still making less than half of the revenue… of the Cincinnati Reds baseball team. They're going to go public." (42:34)

Why Generative AI Startups Can’t Be Acquired

Timestamp: 44:30–49:10

  • Products built on LLM providers have very little unique IP and are entirely at the mercy of upstream providers (pricing, capabilities, competing products).
    • Therefore, there’s no sustainable basis for exit or profit—only talent/licensing arbitrage.
    • Quote: "[A] generative AI company owns very few unique things beyond their talent and will forever be at the mercy of any and all decisions that their model provider makes…" (45:36)
  • Compares AI start-up valuations to the subprime mortgage crisis: overleveraged, inflated, and with no rational exit.
    • "These startups are [to] their VC firm[s] subprime mortgages, overstuffed valuations with no exit route and no clear example of how to sell them or who to sell them to." (46:57)
  • We are years in—and there isn’t a single generative AI company with consistent profits or a conventional exit.

Broader Economic and Social Implications

Timestamp: 49:00–51:00

  • The "AI money trap" may crowd out innovation elsewhere, with vast pools of capital sucked into fundamentally unsustainable business models, choking off support for more viable startups.
  • Zitron asks pointedly: "How long should we give them? Three years? Four years? An eternity?... If any of these were good businesses, they would be either profitable or be acquired in actual deals, and they would also be good businesses by now." (48:01)

Notable Quotes / Memorable Moments

  • On OpenAI’s valuation:

    "That's around the price of Netflix. This is fucking stupid. It's so stupid and every time I read about it I feel a little insane."
    (Ed Zitron, 05:30)

  • On ARR manipulation:

    "It's very clear OpenAI is not talking in actual calendar months, at which point we can assume they're using like a trailing 30-day window... They wouldn't have given two vastly different goddamn numbers in the same two-day period. It doesn't make any sense."
    (Ed Zitron, 08:15)

  • On structural risk:

    "If Cursor is allowed to die, it will be unable to pay a chunk of Anthropic and OpenAI's revenue... It also brings into question whether it's possible to build… a business of any kind offering services built on top of generative AI models…"
    (Ed Zitron, 19:17)

  • On talent/takeover deals:

    "A large portion of Windsurf's approximately 250 employees [were] not benefiting from the deal. I... I'll have links to all of this in the notes. This whole deal is real sickly... the investors got paid out, the founders got paid out, the employees got fucked."
    (Ed Zitron, 31:55)

  • On pseudo-exits:

    "The literal only liquidity mechanism outside of Cognition that generative AI has had so far is selling AI talent to big tech at a premium. Nobody has gone or is going public, and if they're not going public, the only route for these companies is to either become profitable, which they haven't, or sell to somebody, which they're not."
    (Ed Zitron, 47:10)

  • On existential dependency:

    "[A] generative AI company owns very few unique things beyond their talent and will forever be at the mercy of any and all decisions that their model provider makes, such as increasing prices or creating competing products."
    (Ed Zitron, 45:36)

  • On time horizon for profitability:

    "If any of these were good businesses, they would be either profitable or be acquired in actual deals, and they would also be good businesses by now. It is not early. That argument is stupid."
    (Ed Zitron, 48:01)


Segment Timestamps

  • Opening/Theme Setting: 01:54–04:00
  • Data Center Bubble & Funding: 04:02–06:40
  • Revenue Reporting Shenanigans: 06:41–10:40
  • Cursor and Generative AI Economics: 13:15–22:40
  • Acquisitions & The AI Exit Crisis: 28:26–41:40
  • Why Generative AI Can’t Be Acquired: 44:30–49:10
  • Broader Implications & Host Closing: 49:00–51:00

Language, Tone, and Style

The episode is shot through with Ed Zitron’s signature irreverence, profanity, and cutting analogies. He is unimpressed, at times furious, and unwaveringly skeptical—blending technical acumen with brash humor. Analyses are exhaustive but presented with relentless clarity, brutal asides, and emphatic repetition. Zitron’s tone is alarmed, exasperated, and at times mocking—especially when comparing a generative AI startup’s finances to those of an Ohio baseball team ("The Reds should be interviewed by Bloomberg."). The approach is unfiltered and combative but deeply sourced.


Summary Takeaway

Ed Zitron’s "The AI Money Trap, Part One" is a rigorous, unflinching demolition of the prevailing narratives around generative AI startups' financial viability. Instead of inevitable golden futures, he sees a valley of overfunded, underperforming companies, propped up by creative accounting and misplaced investor faith, headed for a reckoning. As he looks ahead to how the AI bubble might burst, Ed challenges listeners to scrutinize tech’s future not through hype, but through numbers and exit reality—raising the urgent question: when the music stops, who will be left holding the bag?


Listen for Part Two, where Zitron promises to focus directly on OpenAI and Anthropic.
