Transcript
A (0:00)
This is an iHeart podcast.
B (0:04)
There's nothing like sinking into luxury. Anabe sofas combine ultimate comfort and design at an affordable price. Anabe has designed the only fully machine-washable sofa from top to bottom. The stain-resistant performance fabric slipcovers and cloud-like frame duvet can go straight into your wash. Perfect for anyone with kids, pets, or anyone who loves an easy-to-clean, spotless sofa. With a modular design and changeable slipcovers, you can customize your sofa to fit any space and style. Whether you need a single chair, loveseat, or a luxuriously large sectional, Anabe has you covered. Visit washablesofas.com to upgrade your home. Sofas start at just $699, and right now get early access to Black Friday savings up to 60% off storewide with a 30-day money-back guarantee. Shop now at washablesofas.com. Add a little to your life. Offers are subject to change and certain restrictions may apply.
C (1:09)
Hello and welcome to a very special episode of Better Offline. I'm, of course, your host, Ed Zitron. For years I've been hunting down the core details behind OpenAI's costs and revenues, and today I'm going to bring you some of them. A lot of what I say today is going to be reflected in my newsletter, which I'll link to in the notes. Based on documents viewed by my newsletter, I'm able to report OpenAI's inference spend on Microsoft Azure, in addition to its payments to Microsoft as part of its 20% revenue share agreement, which was reported in October 2024 by The Information. In simple terms, that last bit means that Microsoft receives 20% of OpenAI's revenue in addition to whatever OpenAI spends on GPUs and servers. As a reminder, inference is the process through which a model creates an output, which I'll be reminding you of a few times because it's actually kind of important. Now, a few notes. I don't have OpenAI's training spend, nor do I have information on the entire extent of OpenAI's revenues, as it appears that Microsoft shares some percentage of its revenue from Bing, as well as 20% of the revenue Microsoft receives from selling OpenAI's models on Azure. What I do have, as I've mentioned, is its inference spend. And if you're new to this, like I said, this means all the computations OpenAI does when processing requests sent to its services like ChatGPT and Sora. Now, before publishing, I asked a Financial Times reporter to help corroborate some of the data in the documents. They reached out to Microsoft and OpenAI, who both declined to comment. Now, the following will be a lot of numbers, and it might be easier for you to read them. However, I'm going to try and make things as easy and clear as possible, because the documents I've seen call into question what we actually knew about OpenAI's business and the sustainability of said business. To keep things simple, all the years in this piece are calendar years. Microsoft has fiscal years. 
I'm not going to play that game. It's impossible to follow along with, and nobody thinks this way anyway. Now we've done that, let's get to it. According to the documents viewed by my newsletter, OpenAI spent $5.02 billion on inference alone with Microsoft Azure in the first half of calendar year 2025. This is a pattern that has continued through the end of September 2025, by which point OpenAI had spent $8.67 billion just on inference. OpenAI's inference costs have risen consistently over the past 18 months, too. For example, OpenAI spent $3.76 billion on inference in 2024, meaning that OpenAI has already more than doubled its inference costs in just the first nine months of 2025. These costs are dramatic and significantly higher than has been previously reported. According to The Information, OpenAI's compute to run models, which I understand to mean inference, was $2 billion in 2024. An additional piece from The Information stated that OpenAI's inference costs for 2025 would be around $6 billion, or roughly $2 billion less than OpenAI appears to have spent through the end of September. I want to be clear as well: I'm just reporting what these documents have said. This is not a statement about The Information. They do great reporting. But then there's the issue of the revenue share. As I've previously stated, the following numbers are based on the revenue share paid to Microsoft as part of its deal with OpenAI, where it gives Microsoft 20% of its revenues. According to the documents, Microsoft received $493.8 million in revenue share payments in 2024 from OpenAI, implying revenues for 2024 for OpenAI of at least $2.469 billion, or around $1.23 billion less than the $3.7 billion number that's been previously reported in multiple outlets. 
Similarly, for the first half of 2025, Microsoft received $454.7 million as part of its revenue share agreement, implying OpenAI's revenues for that six-month period were at least $2.273 billion, or around $2 billion less than the $4.3 billion previously reported for that period. Through September, Microsoft's revenue share payments totaled $865.8 million, implying OpenAI's revenues are at least $4.329 billion through the end of Q3 2025. To be clear, and I'm going to say this, Microsoft also pays OpenAI a cut of Bing's revenues under certain circumstances I could not confirm, as well as a cut of about 20% of all OpenAI models sold through Azure. Just to be clear, Microsoft is the only party that can sell OpenAI's models other than OpenAI. I don't have the details on those payments, like I said, but I'm skeptical that they can account for the massive difference between those numbers that have been leaked and the ones in the documents in question. I do not know, nor will I speculate on, why these differences are so distinct. What was important about today was getting you these numbers and shedding light on the differences I see between the story told about OpenAI and the reality of its spend and potential revenues. You've also probably noticed that this podcast has a bit of a different tone to the usual: no insults, no jokes, haven't called anyone clammy, haven't even said a swear word for the first time in maybe 100 episodes. The reason's simple. These numbers are serious and seriously different to those reported. OpenAI's costs are dramatically higher than previously reported and thought. And based on the extrapolations from Microsoft's revenue share, its implied revenues are also seemingly dramatically lower than we knew. The ramifications of these numbers are severe. OpenAI's inference costs are incredibly high, absorbing any and all revenues and seemingly scaling with every increase in ChatGPT's user numbers. As revenue goes up, so do its inference costs. 
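The implied-revenue figures above all follow from the same arithmetic: divide Microsoft's reported revenue-share payment by its 20% cut. Here is a minimal sketch of that calculation; the 20% rate and the payment figures are those reported from the documents, and the function and variable names are my own for illustration.

```python
# Back out implied OpenAI revenue from Microsoft's 20% revenue-share payments.
REV_SHARE_RATE = 0.20  # Microsoft's reported cut of OpenAI revenue


def implied_revenue(payment_to_microsoft: float, rate: float = REV_SHARE_RATE) -> float:
    """Revenue implied by a revenue-share payment: payment / rate."""
    return payment_to_microsoft / rate


# Payments (in millions of dollars) from the documents described above.
payments = {
    "CY2024": 493.8,
    "H1 2025": 454.7,
    "Through Q3 2025": 865.8,
}

for period, paid in payments.items():
    # e.g. CY2024: $493.8M / 0.20 = $2,469M, i.e. roughly $2.469 billion
    print(f"{period}: ~${implied_revenue(paid):,.0f}M implied revenue")
```

Note these are floors, not totals: the payments only capture revenue subject to the 20% share, so actual revenue could be somewhat higher.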
Conversely, if these implied revenues are indicative of the larger financial picture, OpenAI is not as successful a company as we previously believed. In any case, the reality of the AI bubble is becoming clearer. Inference, the process of creating outputs from a model, appears to be an incredibly burdensome cost. And if these implied revenues are any indicator, the actual business of selling generative AI services and models doesn't really seem to be as good a business as we thought either. It's all looking a little bleak out there. I don't want to editorialize too much because I want this information to stand on its own, but it's strange being here. It's strange getting these numbers and seeing them myself, and I have to wonder how things work out from here. I truthfully have no idea. But I do know I'll be happy to do this every week, and I will tell you what happens. Now, these numbers allow us to kind of see the real picture of the AI bubble. And I have to wonder what other companies look like now that I've seen these numbers. Email me, or contact me at ezitron76 on Signal, if you ever want to tell me anything. If you ever want to show me anything, I'm always interested to hear. And I'm honored to do this.
