Better Offline: “Exclusive: Here's How Much OpenAI Spends On Inference And Its MSFT Revenue Share”
Host: Ed Zitron
Date: November 12, 2025
Podcast: Better Offline by Cool Zone Media and iHeartPodcasts
Episode Overview
In this special episode, tech journalist and host Ed Zitron dives deep into OpenAI’s actual inference spending with Microsoft Azure and the implications of its revenue share deal with Microsoft. Drawing from exclusive documents, Zitron reveals previously unreported figures that paint a starkly different picture of OpenAI’s financial health and the costs underlying the AI boom. This episode aims to cut through tech industry hype and surface critical questions about sustainability, profitability, and transparency in the AI space.
Key Discussion Points & Insights
1. The Mystery of OpenAI’s Costs and Revenues
- Ed Zitron sets the stage: For years, public reporting on OpenAI’s financials has been piecemeal and often internally inconsistent. Zitron's reporting is based on documents reviewed for his newsletter and corroborated (as far as possible) with sources, with Microsoft and OpenAI declining to comment.
- Clarification of terms:
- Inference: The process by which AI models generate output in response to prompts, distinct from the massive initial training cost.
2. OpenAI’s Inference Spending on Microsoft Azure
- Jaw-dropping spend:
- “According to documents viewed by my newsletter, OpenAI spent $5.02 billion on inference alone with Microsoft Azure in the first half of calendar year 2025.” (03:10)
- By September 2025: “OpenAI had spent $8.67 billion just on inference.” (03:23)
- Steep cost acceleration:
- 2024 inference spend: $3.76 billion.
- First nine months of 2025: $8.67 billion, more than double the full-year 2024 figure.
- Comparison with previous reporting:
- “According to The Information, OpenAI’s compute to run models — which I understand to mean inference — was $2 billion in 2024.” (04:04)
- Previous media estimates and projections were significantly lower than the costs evidenced in these documents.
3. OpenAI’s Revenue Share with Microsoft
- The deal: Microsoft receives 20% of OpenAI’s revenue, on top of whatever is spent on compute/inference via Azure.
- “In simple terms ... Microsoft receives 20% of OpenAI’s revenue in addition to whatever it spends on GPUs and servers.” (01:39)
- Numbers from the documents (back-of-the-envelope math sketched after this list):
- 2024: Microsoft got $493.8 million in revenue share from OpenAI — implying OpenAI’s revenues at minimum $2.469 billion.
- This is “around $1.23 billion less than the $3.7 billion number that's been previously reported in multiple outlets.” (05:23)
- First half of 2025: $454.7 million to Microsoft (implies $2.273 billion for OpenAI) — about $2 billion less than widely reported ($4.3B).
- “Through September, Microsoft’s revenue share payments totaled $865.8 million, implying OpenAI’s revenues are at least $4.329 billion through the end of Q3.” (06:00)
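A quick sketch of the arithmetic above, assuming the flat 20% revenue share described in the episode and ignoring any Bing or Azure-model carve-outs (which, per the episode, remain murky). The payment figures and the "previously reported" revenue figures are the ones cited in the episode; the rest is simple division.

```python
# Minimal sketch: if Microsoft's revenue share is a flat 20%, then
# implied OpenAI revenue = revenue-share payment / 0.20.
# Figures (in millions of USD) are the ones cited in the episode; the real
# agreement may include carve-outs this simple division ignores.

REV_SHARE_RATE = 0.20

# (period, payment to Microsoft, previously reported revenue or None)
rows = [
    ("Full-year 2024",   493.8, 3_700.0),
    ("H1 2025",          454.7, 4_300.0),
    ("Through Sep 2025", 865.8, None),
]

for period, payment, reported in rows:
    implied = payment / REV_SHARE_RATE
    line = f"{period}: implied revenue ≈ ${implied:,.1f}M"
    if reported is not None:
        line += f" (≈ ${reported - implied:,.1f}M below the reported ${reported:,.0f}M)"
    print(line)

# Expected output (approximately):
# Full-year 2024: implied revenue ≈ $2,469.0M (≈ $1,231.0M below the reported $3,700M)
# H1 2025: implied revenue ≈ $2,273.5M (≈ $2,026.5M below the reported $4,300M)
# Through Sep 2025: implied revenue ≈ $4,329.0M
```

These lines reproduce the figures quoted above: the $2.469 billion implied for 2024 (about $1.23 billion below the reported $3.7 billion), the roughly $2 billion gap for the first half of 2025, and the $4.329 billion implied through Q3.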
4. Why Do the Numbers Not Add Up?
- Possible other revenue streams:
- Microsoft gives OpenAI a cut of Bing’s revenues under certain conditions, and 20% from all OpenAI models sold through Azure — but the details are murky.
- Zitron: “I’m skeptical that they can account for the massive difference ... I do not know, nor will I speculate on why these differences are so distinct.” (06:30)
- Implication: There’s a gulf between the numbers provided by the documents and previous leaks/media coverage.
5. Ramifications for OpenAI and the AI Industry
- Financial sustainability in question:
- “OpenAI's costs are dramatically higher than previously reported and thought. And based on the extrapolations from Microsoft’s revenue share, its implied revenues are also seemingly dramatically lower than we knew.” (07:01)
- Profitless growth at scale:
- As OpenAI’s revenue increases, so does its inference spend — a “hamster-wheel” situation that could make true profitability as elusive as ever.
- As ChatGPT’s user numbers grow, inference costs grow along with them. (07:17)
- The “AI bubble” feels increasingly precarious:
- “Inference, the process of creating outputs for a model, appears to be an incredibly burdensome cost. And if these implied revenues are any indicator, the actual business of selling generative AI services and models doesn’t really seem to be as good a business as we thought.” (07:33)
- Zitron refrains from his usual sarcasm and critiques, noting the gravity of the situation:
- “These numbers are serious and seriously different to those reported. … It's all looking a little bleak out there.” (07:13)
- Expresses uncertainty about OpenAI’s future and the sector as a whole.
Notable Quotes & Memorable Moments
- On the shocking inference spend:
- “OpenAI's inference costs have risen consistently over the past 18 months too. … OpenAI has already more than doubled its inference costs in just the first nine months of 2025.” (03:36)
- On flawed public perceptions:
- “The ramifications of these numbers are severe. OpenAI’s inference costs are incredibly high, absorbing any and all revenues and seemingly scaling with every increase in ChatGPT’s user numbers.” (07:17)
- On the sustainability of AI as a business:
- “The reality of the AI bubble is becoming clearer. … The actual business of selling generative AI services and models doesn’t really seem to be as good a business as we thought either.” (07:33)
- On the ethical responsibility of reporting:
- “I don't want to editorialize too much because I want this information to sit on its own self, but it's ... It's strange being here. It's strange getting these numbers and seeing them myself and I have to wonder how things work out from here.” (07:48)
- Closing thoughts on transparency and the industry:
- “These numbers allow us to kind of see the real picture of the AI bubble. And I have to wonder what other companies look like now that I've seen these numbers. Email me, contact me ezitron76 on Signal if you ever want to tell me anything.” (08:00)
Timestamps for Key Segments
- [01:09] – Ed Zitron introduces the episode and sets expectations
- [03:10-03:40] – Breakdown of OpenAI’s actual inference costs
- [04:00-04:27] – Comparisons with previous industry estimates
- [05:10-06:15] – OpenAI’s revenue share numbers and implications for total revenue
- [07:01-07:45] – Discussion of the emerging “AI bubble” and existential questions for OpenAI
- [08:00] – Call for industry whistleblowers and wrap-up
Tone & Style
Ed Zitron adopts a notably serious, almost urgent tone, in contrast with his usual sardonic style. He makes a conscious choice to report the facts plainly, letting the numbers speak for themselves, while still acknowledging the gravity and the wider ramifications for the tech industry.
Summary Takeaways
- OpenAI’s inference costs appear to be much higher, and its implied revenues significantly lower, than previously reported.
- The current financial trajectory may not be sustainable, raising tough questions about the broader viability of today’s AI business models.
- If these numbers are indicative, the “AI bubble” could burst under the weight of its operating costs.
- Zitron underscores the need for transparency and invites further whistleblowing to help the public understand what’s really happening inside the world’s most powerful AI labs.
