A
In a world of economic uncertainty and workplace transformation, learn to lead by example from visionary C-suite executives like Shannon Schuyler of PwC and Will Pearson of iHeartMedia. The good teacher explains; the great teacher inspires.
B
Don't always leave your team to do the work. That's been the most important part of how to lead by example.
A
Listen to Leading by Example executives making an impact on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
B
Cool Zone Media. Try spinning. That's a good trick. Better. I'm Ed Zitron. This is Better Offline, and this is the second episode of my two-part series where I explain how OpenAI has become a systemic risk to the tech industry, even with its massive $40 billion funding round and birdbrained benefactor in the form of SoftBank, the world's foremost authority in losing money. Now, before I continue, shameless request: Better Offline has been nominated for a Webby and I want to win this thing. I've linked to it in my Twitter and my Bluesky, and if you could vote for me, well, it'll be in the episode notes too. Moving on, back at it. Okay, all right. OpenAI now has $40 billion somehow, right? Great. Right. Well, hold your horses. As part of its deal with SoftBank, OpenAI must also convert its bizarre nonprofit structure into a for-profit entity by December 2025, or it will lose $10 billion from that $40 billion round of funding. And just to be clear, by the way, they've only really got $10 billion of that so far. The rest is at the end of the year. Furthermore, in the event that OpenAI fails to convert into a for-profit company by October 2026, investors in its previous $6.6 billion funding round can claw back their investment, converting it to debt with an attached interest rate. Naturally, this represents a nightmare scenario for the company, as it will increase both its costs and its outgoings. This is a complex situation that almost warrants its own podcast, but the long and short of it is that OpenAI would have to effectively dissolve itself, start the process of reforming an entirely new entity, and distribute its assets to other nonprofits, or sell or license them to a for-profit company at fair market rates which they would not set. It would also require valuing OpenAI's assets, which in and of itself would be a difficult task, as well as getting past the necessary state regulators, the IRS, state revenue agencies, and the upcoming trial with Elon Musk.
Well, that only adds further problems. I've simplified things here, and that's because, as I've said, this stuff is a little complex and pretty boring. Suffice to say, this isn't as simple as liquidating a company and starting afresh, or submitting a couple of legal filings. It's a long, fraught process, and one that will be, as it has been, subject to legal challenges both from OpenAI's business rivals as well as from civil society organizations in California. You may have heard the last monologue. Based on discussions with experts in the field and my own research, I simply do not know how OpenAI pulls this off by October 2026. And honestly, I'm not sure how they do it by the end of this year. It's insane. It's a, it's a really. I just, every time I read this stuff and I write it, I'm like, how is nobody else reading this and going, what the fuck is going on? You see, this is a big problem, this nonprofit thing, because OpenAI really has become a systemic risk to the tech industry. And anything that increases that risk is bad news for everybody. OpenAI, they've become a kind of load-bearing company for this industry, both as a narrative, as I've discussed multiple times, as ChatGPT is the only large language model company with any meaningful user base, and also as a financial entity: its ability to meet its obligations and its future expansion plans are critical to the future health, or in some cases survival, of multiple large companies. And that's before the after-effects that will hit its customers as a result of any kind of financial collapse. The parallels to the 2007 and 2008 financial crisis are starting to become a little worrying. Lehman Brothers wasn't the largest investment bank in the world, although it was pretty big, just like OpenAI isn't the largest tech company, though again, it's certainly large in terms of alleged valuation and expenditures.
Lehman Brothers' collapse sparked a contagion that would later spread throughout the entire global financial services industry and, consequently, the global economy. Now, I can see OpenAI's failure not having as big an effect, but I can imagine a systemic effect still. You have to realize that the whole AI trade, the narrative, the bubble, it's holding up the economy. I think like 30, 35% of the US stock market is in the Magnificent Seven, and all of their bullshit numbers right now are held up by this nonsense. And like the financial crisis, the impact in this case won't be limited to just bankers and insurers. It will bleed into everything else. This episode is going to be a bit grim, I'm not going to lie. I want to lay out the direct result of any kind of financial crisis at OpenAI, because I don't think anybody is taking this seriously. Let's start with Oracle, who will lose at least a billion dollars if OpenAI doesn't fulfill its obligations, per The Information. Oracle, which has taken responsibility for organizing the construction of the Stargate data centers with unproven data center builder Crusoe, and I quote The Information here, "may need to raise more capital to fund its data center ambitions." Oracle has signed a 15-year lease with Crusoe and, to quote The Information, is on the hook for $1 billion in payments to that firm. To further quote The Information: "while that's the standard deal length, the unprecedented size of the facility Oracle is building for just one customer makes it riskier than a standard cloud data center used by lots of interchangeable customers with much more predictable needs," according to half a dozen people familiar with these types of deals. In simpler terms, Oracle is building a giant data center for one customer, OpenAI, and has taken on the financial burden associated with it. If OpenAI fails to expand, or lacks the capital to actually pay for its share of the Stargate data center, Oracle is on the hook for at least a billion dollars.
And based on The Information's reporting, it's also on the hook to buy the GPUs for the site. This is me quoting them again: "even before the Stargate announcement, Oracle and OpenAI had agreed to expand their Abilene deal from two to eight data center buildings, which can hold 400,000 Nvidia Blackwell GPUs, adding tens of billions of dollars to the cost of the facility." In reality, this development will likely cost tens of billions of dollars, $19 billion of which is due from OpenAI, which does not have the money until it receives its second tranche of funding in December 2025 from SoftBank. And this is contingent partially on their ability to convert into a for-profit entity, which, as mentioned, is extremely difficult and extremely unlikely. It's unclear how many of the Blackwell GPUs Oracle has had to purchase in advance, but in the event of any kind of financial collapse at OpenAI, Oracle will likely have to toss at least a billion dollars, if not several billion dollars. And then we get to CoreWeave, a company whose expansion is likely driven entirely by OpenAI now, and which cannot survive without OpenAI fulfilling its obligations. Anyway, now, I've written and spoken a lot about publicly traded AI compute firm CoreWeave, and it would give me the greatest pleasure in my life to never think or talk about them ever again. Nevertheless, I have to. This is my curse. This is my curse. CoreWeave has become my curse. Every time I think about this. Fuck. Okay. The Financial Times revealed a few weeks ago that CoreWeave's debt payments could balloon to over $2.4 billion a year by the end of 2025, far outstripping its cash reserves, and The Information reported that its cash burn would increase to $15 billion. As per its IPO filing, 62% of CoreWeave's 2024 revenue, a little under $2 billion, with losses amounting to $863 million, was Microsoft compute.
And based on the conversations I've had with sources, a good amount of this was Microsoft running compute for OpenAI. Starting October 2025, OpenAI will start paying CoreWeave as part of its five-year-long $12 billion contract, picking up the option that Microsoft declined. This is not great timing, or maybe it's perfect timing, because this is also when CoreWeave will have to start making payments on their massive multibillion-dollar DDTL 2.0 loan, mentioned in previous episodes. But really, there's a newsletter if you want to hear me go mad. You want to read me go mad, you read my CoreWeave piece, because it really drove me insane. Nevertheless, these CoreWeave payments, the ones from OpenAI to CoreWeave that October, they're pretty much critical to CoreWeave's future. This deal also suggests that OpenAI will become CoreWeave's largest customer. Microsoft had previously committed to spending $10 billion on CoreWeave services by the end of the decade, but CEO Satya Nadella added a few months later on a podcast that its relationship with CoreWeave was a one-time thing. Man, he's really fucking around there. Satya. Satya don't love CoreWeave. Assuming Microsoft keeps spending at its previous rate, so about 66% of $2 billion, whatever that is, something that isn't guaranteed, by the way, it would still only be half of OpenAI's potential revenue to CoreWeave. CoreWeave's expansion at this point is entirely driven by OpenAI: 77% of its 2024 revenue came from two customers, Microsoft being the largest (and yes, I just fucked up a number, it's 62%), and using CoreWeave as auxiliary compute for OpenAI. As a result, the future expansion efforts, the theoretical 1.3 gigawatts of contracted (and by the way, that means it doesn't exist yet) compute at CoreWeave, are largely, if not entirely, for the benefit of OpenAI. In the event that OpenAI cannot fulfill its obligations, CoreWeave will collapse. It's that fucking simple.
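For anyone doing the math along with me, here's a rough back-of-the-envelope sketch of that "only half" claim, using just the figures quoted above (the roughly $2 billion in 2024 CoreWeave revenue, the 62% Microsoft share, and OpenAI's $12 billion five-year contract):

```python
# Rough arithmetic behind the "Microsoft's spend would only be about half
# of OpenAI's" point, using the figures quoted in the episode (a sketch,
# not anything pulled from filings beyond those numbers).
coreweave_2024_revenue = 2.0e9     # "a little under $2 billion"
microsoft_share = 0.62             # 62% of 2024 revenue was Microsoft compute
microsoft_annual_spend = microsoft_share * coreweave_2024_revenue

openai_contract = 12.0e9           # OpenAI's $12 billion contract...
contract_years = 5                 # ...spread over five years
openai_annual_revenue = openai_contract / contract_years

print(f"Microsoft: ~${microsoft_annual_spend / 1e9:.2f}B per year")   # ~$1.24B
print(f"OpenAI:    ${openai_annual_revenue / 1e9:.2f}B per year")     # $2.40B
print(f"Microsoft as a share of OpenAI: "
      f"{microsoft_annual_spend / openai_annual_revenue:.0%}")        # ~52%
```

So even if Microsoft kept spending at its 2024 rate, it would bring in roughly $1.24 billion a year versus the $2.4 billion a year OpenAI's contract implies, which is where "only be half" comes from.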
And then the shock waves will ripple further. Nvidia relies on CoreWeave for more than 6% of its revenue, and CoreWeave's future creditworthiness to continue receiving said revenue, well, much of that is dependent on OpenAI continuing to buy services from CoreWeave. Now, I'm basing this on a comment I received from the legendary Gil Luria, managing director and head of technology research at analyst D.A. Davidson & Company. I quote him: "since CoreWeave bought 200,000 GPUs last year and those systems are around $40,000, we believe CoreWeave spent $8 billion on Nvidia, which represents more than 6% of Nvidia's revenue in 2024." He said "last year," but I just wanted to make it sound better. CoreWeave receives preferential access to Nvidia's GPUs, though Nvidia kind of denies that, and makes up billions of dollars of Nvidia's revenue. CoreWeave takes those GPUs, and then they raise debt using the GPUs as collateral, as well as customer contracts. Then they use the money they've raised to buy more GPUs from Nvidia. You may think that doesn't sound right. I am being completely... like, this is factual information at this point. Nvidia was the anchor for CoreWeave's IPO, and CEO Michael Intrator said that the IPO would not have closed without Nvidia buying $250 million worth of their shares. Nvidia also invested $100 million in the early days of CoreWeave and, for reasons I cannot understand, also agreed to spend $1.3 billion over four years to, and I quote The Information, "rent its own chips from CoreWeave." Fun fact: I can't find a single fucking mention of CoreWeave in any of Nvidia's filings. Now, buried in CoreWeave's S-1, the document every company publishes before going public, was a warning about counterparty credit risk, which is when one party provides services or goods to another with specific repayment terms and the other party doesn't meet their side of the deal.
While this was written as a theoretical, as it could, theoretically speaking, come from any company to which CoreWeave acts as a creditor, it only named one: OpenAI. Now, as discussed previously, CoreWeave is saying that should a customer (any customer, but really they mean OpenAI) fail to pay its bills for infrastructure built on their behalf or services rendered, it could pose a material risk to the company. Now, as an aside, The Information reported, and someone's going to email me this, so I just want to get ahead of it, that CoreWeave is apparently in advanced talks with Google to rent GPUs. It also added another thing in this story, just so I don't have to hear from any of you: Google's potential deal with CoreWeave is significantly smaller than their commitments with Microsoft, according to one of the people briefed on it, but could potentially expand in future years. Do not come to me and claim that Google is going to save CoreWeave. I'll be so mad. Anyway, even with Google and OpenAI's money, CoreWeave's continued ability to do business hinges heavily on its ability to raise further debt, which I have previously called into question in a newsletter that gave me madness. And its ability to raise future debt is, to quote the Financial Times, "secured against its more than 250,000 Nvidia GPUs and its contracts with customers such as Microsoft." Now, any future debt that CoreWeave raises will be based off of its contract with OpenAI (you know, the counterparty credit risk threat that represents a disproportionate share of its revenue I just mentioned) and also whatever GPUs they still have left that they can get debt on. As a result, a chunk of Nvidia's future revenue is dependent on OpenAI's ability to fulfill its obligations to CoreWeave, both in its ability to pay them and their timeliness in doing so. If OpenAI fails, then CoreWeave fails, then that hurts Nvidia. Jensen's gonna have to go.
He's gonna have to go to a cheaper leather jacketarium. And it gets worse. OpenAI's expansion is dependent on two unproven startups, one of them I just mentioned, who are also dependent on OpenAI to live. With Microsoft's data center pullback and OpenAI's intent to become independent from Redmond, future data center expansion is based on two partners supporting CoreWeave (I know, we'll get there) and Oracle. Now, I'm referring, of course, to Core Scientific, which is the data center developer for CoreWeave, and of course Crusoe, who's the data center developer for Oracle. Now, if you were wondering, and I kind of hinted about this earlier: how many data centers, how many AI data centers, do you think Crusoe's ever built? And the answer is none. And Core Scientific, how many do you think they've built? And the answer is also none. These are the fucking companies underpinning the AI boom. I also really must explain how difficult it is to build a data center, and how said difficulty increases when you're building an AI-focused one. For example, Nvidia had to delay the launch of its Blackwell GPU because of how finicky the associated infrastructure (so, the servers and the cooling and such) is for customers. This was for customers that had already been using GPUs and therefore likely knew how to manage the temperatures created by them. Also, as another reminder, OpenAI is on the hook for $19 billion of funding behind Stargate, and neither OpenAI nor SoftBank has that money. I just want to remind you of that, because it costs so much money to build a fucking data center. And imagine if you didn't have any experience and effectively had to learn from scratch. How do you think it would be building these data centers? Let's find out. So let's start in Abilene, Texas, with Crusoe and the Stargate data center project.
Now, Crusoe is a former cryptocurrency mining company that has now raised hundreds of millions of dollars to build data centers for AI companies, starting with a $3.4 billion data center financing deal with asset manager Blue Owl Capital. This yet-to-be-completed data center has now been leased by Oracle, which will allegedly fill it full of GPUs for OpenAI. Despite calling itself, and I quote, "the industry's first vertically integrated infrastructure provider," with the company using flared gas (a waste byproduct of oil production) to power IT infrastructure, Crusoe does not appear to have built a single AI data center, and is now being tasked with building 1.2 gigawatts of data center capacity for OpenAI. It's just so fucking... Crusoe is the sole developer and operator of the Abilene site, meaning, according to The Information, that it is in charge of contracting with construction contractors and data center customers, as well as running the data center after it is built. Oracle, it seems, will be responsible for filling said data center with GPUs, as mentioned. Nevertheless, the project also appears to be behind schedule. The Information reported in October 2024 that Abilene was meant to have 50,000 of Nvidia's Blackwell AI chips in the first quarter of 2025, and also suggested that the site was projected to have a whopping 100,000 of them by the end of 2025. Now you can join me back here in reality, because a report from Bloomberg in March 2025 said that OpenAI and Oracle were expected to have 16,000 available by the summer of 2025, with, and I quote, "OpenAI and Oracle expecting to deploy 64,000 Nvidia GB200s at the Stargate data center by the end of 2026." That's, that's, that's very delayed. That's really delayed. Again, how? I run a PR firm, I record a podcast, I write a newsletter, I have a book I'm writing. I got all this shit on, and I'm the asshole who notices this.
Anyway, as discussed previously, OpenAI needs this capacity very badly. According to The Information, OpenAI expects Stargate to handle three quarters of its compute by 2030, and these delays call into question, at the very least, whether this schedule is reasonable or logical or even possible. And I actually really question whether Stargate itself is possible at this point. But it can get dumber, because we're about to talk about Core Scientific, and they are CoreWeave's friends. They're the people building data centers for CoreWeave in Denton, Texas. Now, as you can probably tell, I've written a great deal about CoreWeave in the past. I got a monologue, got a newsletter, and I got a therapy bill for it. And specifically, I've written about their build-out partner, Core Scientific, a cryptocurrency mining company (yes, another one) that has exactly one customer for its AI data centers. And you'll never guess who it is. It's CoreWeave. Now, here's a few fun facts about Core Scientific. Core Scientific was bankrupt last year. Core Scientific has never built an AI data center, and its cryptocurrency mining operations were built around ASICs, specialist computers for mining Bitcoin, which led an analyst to tell CNBC that said data centers would, and I quote, "need to be bulldozed and built from the ground up" to accommodate AI compute. That's the stuff. Core Scientific also does not appear to have any meaningful AI compute of any kind. Its AI HPC (that's high-performance computing) revenue represents a teeny, tiny, teeny little percentage of overall revenue, which mostly comes from mining crypto, both for itself and other parties. Now, hearing all of this, would you give this company your compute? Would you think, these are the people that I am going to call to build my data centers? If you said no, you are smarter than CoreWeave, who has given their entire 1.3 gigawatt build-out to Core Scientific.
Core Scientific also, it seems, they seem to be taking on like $1.14 billion of capital expenditures to build these data centers, which, by the way, is not enough money, but nevertheless, CoreWeave has promised to reimburse them $899.3 million of these costs. This is all from public filings, by the way. It's also unclear how Core Scientific actually intends to do any of this shit. While they've taken on a good amount of debt in the past ($550 million in a convertible note toward the end of last year), this would be more debt than they've ever taken on. It also, as with Crusoe, does not appear to have any experience building AI data centers, a point I keep repeating because it's very important. These are the companies behind the growth for OpenAI. Except, unlike Crusoe, Core Scientific is a barely functioning, recently bankrupted bitcoin miner pretending to be a data center company. Crusoe, on the other hand, is possibly also doing the same thing, but they're less egregious about it. Now, how important do you think CoreWeave is to OpenAI, exactly? Well, per Semafor: "CoreWeave has been one of our earliest and largest compute partners," OpenAI chief Sam Altman said in CoreWeave's roadshow video, adding that CoreWeave's compute power led to the creation of "some of the models that we're best known for," and that "CoreWeave figured out how to innovate on hardware, to innovate on data center construction, and to deliver results very, very quickly." Did it? But even if it did, will it survive long-term? Going back to the point of the contagion: if OpenAI fails and CoreWeave fails, so too does Core Scientific. And I don't really fancy Crusoe's chances either. But let's take a step back for a moment. We've been going so hard, haven't we? I've got a genuine question, just for the fact finders out there: does Microsoft book OpenAI's compute as revenue?
Up until fairly recently, Microsoft has been the entire infrastructure backing OpenAI, but recently, to free OpenAI up to work with Oracle and see other people, it released it from its exclusive cloud compute deal. Nevertheless, per The Information, OpenAI still intends to spend $13 billion on compute on Microsoft Azure this year. What's confusing, however, is whether any of this is booked as revenue for Microsoft. Microsoft claimed earlier in the year that it surpassed $13 billion in annual recurring revenue (by which it means its last month multiplied by 12, by the way), and they said it was from AI. OpenAI's compute costs in 2024 were $5 billion, and that's at a discounted Azure rate, which works out to about $416 million in revenue a month for Microsoft. It isn't, however, clear whether Microsoft counts OpenAI's compute money, which is really fucking weird. You'd think, with all this money they're making from this company, they'd be saying there was money coming in. It's peculiar. I've yet to find a real answer. Now, Microsoft's earnings do not include an artificial intelligence section. No, they're made up of three separate segments: Productivity and Business Processes, which includes things like LinkedIn, Microsoft 365 and so on; More Personal Computing, which includes Windows and gaming products; and then Intelligent Cloud, including server products and cloud services like Azure, which is likely where OpenAI's compute is included, and where Microsoft booked the revenue from selling access to OpenAI's models, but not OpenAI's compute. Question mark. As a result, it's hard to say specifically where OpenAI's revenue might sit. Even guessing Intelligent Cloud might not be right.
But based on an analysis of Microsoft's Intelligent Cloud segment from financial year 2023 Q1 through its most recent earnings, there was a spike in revenue from FY23 Q1 to FY24 Q1. In financial year 2023 Q1, which ended on September 30, 2022, a month before ChatGPT's launch, the segment made $20.3 billion. The following year, in FY24 Q1, it made $24.3 billion, a 19.7% year-over-year growth, or roughly $4 billion. This could represent the massive increase in training and inference costs associated with hosting ChatGPT. Revenue peaked at $28.5 billion in FY24 Q4, before dropping dramatically to $24.1 billion in FY25 Q1 and rising a little to $25.5 billion in FY25 Q2. I'm so sorry, none of this is easy to read. This is a plausible explanation: OpenAI spent 2023 training its GPT-4o model before transitioning to its massive, expensive Orion model, which would eventually become GPT-4.5, as well as its video generation model Sora. According to the Wall Street Journal, training GPT-4.5 involved at least one training run costing around half a billion dollars in compute costs alone. These are huge sums, but it's worth noting a couple things. First, Microsoft licenses OpenAI's models to third parties, so some of this revenue could be from other companies using GPT on Azure. We've seen lots of companies launch AI products, and not all of them are based on LLMs. Muddying things further, Microsoft provides OpenAI access to Azure cloud services at a discounted rate, as I've mentioned, and so there's a giant question mark over OpenAI's actual contribution to the various spikes in revenue for Microsoft's Intelligent Cloud segment, or whether other third parties played a significant role. Furthermore, Microsoft's investment in OpenAI isn't entirely in cold hard cash. Rather, it's provided the company with credits to be redeemed on Azure services. Kind of like Chuck E. Cheese tokens.
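If you want to check my work on those segment numbers, here's a quick sketch using only the figures quoted above (the $20.3 billion and $24.3 billion quarters, and OpenAI's $5 billion in 2024 compute costs):

```python
# Quick check on the Intelligent Cloud growth figures and the OpenAI
# compute annualization quoted above (figures as cited in the episode,
# not independently sourced here).
fy23_q1 = 20.3e9    # Intelligent Cloud, quarter ending September 30, 2022
fy24_q1 = 24.3e9    # same quarter, one year later

growth = (fy24_q1 - fy23_q1) / fy23_q1
print(f"YoY growth: {growth:.1%}")                     # 19.7%
print(f"Increase: ${(fy24_q1 - fy23_q1) / 1e9:.0f}B")  # $4B

openai_2024_compute = 5.0e9   # OpenAI's 2024 compute costs on Azure
# $5B a year works out to the "about $416 million" a month quoted above
print(f"Per month: ~${openai_2024_compute / 12 / 1e6:.1f}M")
```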
I'm not entirely sure how this would be represented in accounting terms, and if anyone can shed any light on this, please get in touch. Would it be noted as revenue, or something else? OpenAI isn't paying Microsoft, or are they? Are they doing the tech equivalent of redeeming air miles? Or have they spent a gift card of Azure? It really isn't obvious, and Microsoft is doing some accounting bullshit here. I'm not suggesting impropriety, not suggesting anything illegal. I'm just saying it's insane that they have this company spending billions of dollars, theoretically, on their services, and it's just nowhere. Additionally, while equity is often treated as income for tax purposes, as is the case when an employee receives RSUs as part of their compensation package, under the existing OpenAI structure Microsoft isn't actually a shareholder, but rather the owner of profit-sharing units. This is a distinction worth noting. These profit-sharing units are treated as analogous to equity, at least in terms of OpenAI's ability to raise capital, but in practice they aren't the same thing. They don't represent ownership in a company as directly as, for example, a normal share would. They lack the liquidity of a share, and the upside they provide, namely dividends, is purely theoretical. Another key difference: when a company goes bankrupt and enters liquidation, shareholders can potentially receive a share of the proceeds after creditors, employees, and so on are paid. While that often doesn't happen, as the liabilities generally can exceed the assets of the company, in many cases it's at least theoretically possible. Given that profit-sharing units aren't actual salaries or shares, where does that leave Microsoft? This stuff is confusing, and I'm not ashamed to say that I just fucked up a word, and that complicated accounting questions like these are far beyond my understanding.
If anyone can shed some light, drop me an email, buzz me on Twitter or Bluesky, hit me up on Plurk or Gaup, or post on the Better Offline subreddit. Someone might take your wallet though. Anyway, back on track. I think it's worth understanding the scale of the OpenAI vortex, how it's distorting the tech investment market, and why, even without having failed, it represents systemic risk. Without OpenAI, American startup investment is flat, and even with it, fewer startups are receiving investment. Crunchbase News reported in early April that North American startup investment spiked in Q1 due to OpenAI, hitting $82 billion. Great, right? Sounds great. This statement sadly has a darker undertone. American startup investment was actually like $42 billion in Q1 2025 when you removed the deal, which is appropriate, because none of the money is actually received by OpenAI yet, and at best only $10 billion of it will be received before December 2025. This quarter included a $3.5 billion investment in worm-like competitor Anthropic, run by Wario Amodei, making the appropriate number a paltry $39.5 billion. Now, this is still an improvement, though a marginal one, over the $37.5 billion raised in Q1 2024. Nevertheless, Crunchbase News also has a far, far darker story: deal volume in American startups has begun to collapse, trending downward almost every quarter. Now, deal volume isn't a direct result of OpenAI's financial condition. The so-called revolution created by OpenAI and other generative AI companies' technology appears to be petering out, and the contagion is starting to impact the wider tech sector. It's important to understand how bleak things are. The future of generative AI rests on OpenAI, and OpenAI's future rests on near-impossible financial requirements. I've done my best to make this argument in as objective a tone as possible, regardless of my feelings about the bubble and its associated boosters.
OpenAI, as I've said before, and arguably countless times in interviews and podcasts and newsletters, is effectively the entire generative AI industry, with its nearest competitor being less than 5% of its 500 million weekly active users. Anthropic, Google, Microsoft, xAI: they're all rounding errors in the grand scheme of things. But OpenAI's future is dependent, and this is not an opinion, this is an objective fact, on effectively infinite resources in many forms. Let's start with the financial resources. If OpenAI required $40 billion to continue operations this year, it's reasonable to believe it will need at least another $40 billion next year, and based on its internal projections, will need at least $40 billion every single year until 2030, when it claims, somehow, it will be profitable, and I quote The Information, "with the completion of the Stargate data center project." You may be wondering: how's that possible, Ed? How? You think The Information wrote that down? Fuck no. Jessica Lessin's too busy humiliating people she let go by name on Twitter. Jessica Lessin, I like The Information. I think you're a fucking asshole for how you treated your people. Say it on my podcast and say it on Twitter. Anyway, let's keep talking about some of these resources that OpenAI is dealing with, specifically the compute resources and expansion. OpenAI requires more compute resources than anyone has ever needed, and will continue to do so in perpetuity. Building these resources is now dependent on two partners, Core Scientific and Crusoe, neither of which has ever built a data center, as Microsoft has materially pulled back on data center development and has, as aforementioned, walked away from 2 gigawatts of data centers, slowed or paused.
Of course, some of its early-stage data center projects too, with TD Cowen's recent analyst report saying that data center pullbacks were, and I quote their March 26, 2025 "Data Center Channel Checks" letter, because it's so good, "driven by the decision to not support incremental OpenAI training workloads." That's the start. In simpler terms, OpenAI needs more compute at a time when its lead backer, which has the most GPUs in the world, has specifically walked away from building it. Even in my most optimistic frame of mind, it isn't realistic to believe that Crusoe or Core Scientific can build the data centers necessary for OpenAI's expansion. Even if SoftBank and OpenAI had the money to invest in Stargate today, which they do not, dollars do not change the fabric of reality. Data centers take time to build, requiring concrete, wood, steel and other materials to be manufactured and placed, and that's after the permitting required to get these deals done. Even if that succeeds, getting the power necessary is a challenge unto itself, to the point that even Oracle, an established and storied cloud compute company run by a very evil man, at one point, to quote The Information, "has less experience than its larger rivals in dealing with the utilities to secure power and working with powerful and demanding cloud customers whose plans change frequently." A partner like Crusoe or Core Scientific simply doesn't have the muscle memory or domain expertise that Microsoft has when it comes to building and operating data centers. As a result, it's hard to imagine, even in the best-case scenario, that they're able to match the hunger for compute that OpenAI has. Now, I want to be clear: I believe OpenAI will still continue to use Microsoft's compute, and even expand further into whatever remaining compute Microsoft may have.
However, there is now a hard limit on how much of that there's going to be, both literally, in what's physically available, and in what Microsoft itself will actually allow OpenAI to use, especially given how unprofitable GPU compute seems to be, based on how every single company that isn't Nvidia loses money running them. But really, and we're coming to the end of this, all of which leads to a question: how does all of this end? Last week, a truly offensive piece of fanfiction framed as a report called AI 2027 went viral, garnering press with the Dwarkesh Podcast and gormless, childlike wonder from dope New York Times reporter Kevin Roose, and "reporter," I think, is a fucking stretch. Its predictions vaguely suggest a theoretical company called OpenBrain will invent a self-teaching agent of some sort. It's total bullshit, but it captured the hearts and minds of AI boosters and other people without object permanence, because it vaguely suggests that somehow large language models and their associated technology will become something entirely different. I don't like making predictions like these, because the future, especially in our current political climate, is utter chaos. But I will say that I do not see, and I say this with complete objectivity, how any of this bullshit continues. I want to be extremely blunt with the following points, as I feel like both members of the media and tech analysts have categorically failed to express how ridiculous things have become. I will be repeating myself, but it's fucking necessary, as I need you to understand how untenable things are. SoftBank is putting itself in dire straits simply to fund OpenAI, to the point that this deal threatens its credit rating, with SoftBank having to take on what will be multiple loans to fund this $40 billion round, and OpenAI will need at least another $40 billion a year later.
This is before you consider the other $19 billion that SoftBank has agreed to contribute to the Stargate data center project, money it does not currently have available. OpenAI has promised $19 billion to the Stargate data center project too, and again, they do not have it; they need SoftBank to give it to them. And again, I've said it and I'll say it again: neither of these companies have the money. The money is not there, and OpenAI needs Stargate to get built to grow much further. I see no way in which OpenAI can continue to raise money at this rate, even if OpenAI somehow actually receives the $40 billion it's been promised, which will require it to become a for-profit entity, which I don't think it can fucking do. While it could theoretically stretch that $40 billion to last multiple years, projections say it will burn $320 billion in the next five years, if not more. I can't see a realistic way in which OpenAI gets the resources it needs to survive. It'll need an insane streak of good fortune, the kind of which you only really hear about in Greek epic poems or JoJo's Bizarre Adventure, you know, the more cultured choice. But let's go through them. Somehow, SoftBank gets the resources and loses the constraints required to bankroll this company forever. The world's wealthiest entities, those sovereign wealth funds mentioned in the last episode, the Saudis and so on, pick up the slack until OpenAI reaches profitability, which is a huge assumption. It's also assuming that OpenAI will have enough of these mega-wealthy benefactors to provide it with the $320 billion it needs to reach profitability, which it won't. They'll also need Crusoe and Core Scientific to turn out to be really good at building AI infrastructure, which they've never done before. That's very possible, I'm sure.
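A back-of-the-envelope check of the figures above makes the point concrete. This is a sketch using only the numbers stated in the episode, the $40 billion round and the projected $320 billion five-year burn; everything else is plain arithmetic.

```python
# Sanity check of the burn-rate figures quoted above.
# Source figures: a $40 billion funding round, and projections of
# $320 billion burned over the next five years.

round_size_bn = 40        # the SoftBank-led round, in billions of dollars
projected_burn_bn = 320   # projected five-year burn, in billions of dollars
years = 5

# Average annual burn implied by the five-year projection.
avg_annual_burn_bn = projected_burn_bn / years  # 64.0

# How long the full $40 billion would last at that average burn rate.
runway_years = round_size_bn / avg_annual_burn_bn  # 0.625 years

print(f"Implied average burn: ${avg_annual_burn_bn:.0f}B/year")
print(f"Runway on $40B at that rate: {runway_years * 12:.1f} months")
```

In other words, the five-year projection implies an average burn well above $40 billion a year, which is why "stretching" the round across multiple years doesn't add up.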
And then Microsoft will walk back its walk-back on building new AI infrastructure and recommit to tens of billions of dollars of capex, specifically on AI data centers, and give it to OpenAI. And then, of course, Stargate's construction happens faster than expected, and there are no supply chain issues in terms of labor, building materials, GPUs and so on. Now, I don't know, I haven't checked the news in the last three weeks, but is there anything going on that might increase the cost of materials? Probably not. Anyway, if those things happen, I'll eat crow. I'm not particularly worried. In the present conditions, OpenAI is on course to run out of money or run out of compute capacity, and it's unclear which will happen first. But what is clear is it's time to wake up. Even in a hysterical bubble where everybody is agreeing that this is the future, OpenAI is currently requiring more money and more compute than is reasonable to acquire. Nobody, nowhere, ever, anywhere, has ever raised as much money as OpenAI needs to. And based on the sheer amount of difficulty that SoftBank is having raising the funds to meet the lower tranche, the $10 billion one, of its commitment, it may not actually be possible for this company to continue, even with the extremely preferential payment terms, months-long deferred payments, for example, that OpenAI probably has. At some point, someone will need to be paid. I'll give Sam Altman some fucking credit: he's found many partners to shoulder the burden of the rotten economics of OpenAI, with Microsoft, Oracle, Crusoe and CoreWeave handling the upfront costs of building the infrastructure, SoftBank finding the investors for its monstrously stupid round, and the tech media mostly handling marketing for him, which is really nice. Great job, everybody. He is, however, overleveraged. OpenAI has never been forced to stand on its own two feet or focus on efficiency.
And I believe the constant enabling of this ugly, nonsensical burn rate has doomed this company. OpenAI has acted like it'll always have more money and compute, and that's kind of because everyone's acted as if that would be the case. No one's really called Sam Altman out on his bullshit. There are some people, but really, no one in the mainstream media has bothered. Sam Altman has been enabled. OpenAI, by the way, cannot just make things cheaper at this point, because the money has always been there to make things more expensive, as has the compute to make larger and larger language models that burn billions of dollars a year. This company is not built to reduce its footprint in any way, nor is it built for a future in which it wouldn't have access to infinite resources. Worse still, investors and the media have run cover for the fact that these models don't really do much more than they did a year ago, and for the overall diminishing returns of large language models writ large. Now, I've had many people attack my work about OpenAI, but none of them, not one of them, nobody, has provided me any real counterpoint to the underlying economic argument I've made since July of last year: that OpenAI is unsustainable. This is likely because there really isn't one, other than that OpenAI will continue to raise more money than anybody has ever raised in history, in perpetuity, and will somehow turn the least profitable company of all time into a profitable one. This is not a rational argument, it's a religious one. It's a call for faith. And it's disgusting to see well-paid reporters with, I don't know, 150,000 subscribers to their newsletters and a really shitty podcast with a major news outlet constantly just ignore this shit. And I see no greater pale horse of the apocalypse than Microsoft's material pullback on data centers.
While the argument might be that Microsoft wants OpenAI to have an independent future, that's fucking laughable when you consider Microsoft's deeply monopolistic tendencies. And for that matter, it owns a massive proportion of OpenAI's pseudo-equity. At one point, Microsoft's portion was valued at 49%, and while additional fundraising has likely diluted Microsoft's stake, it still owns a massive, massive portion of what is, at the very least, if you believe any of this nonsense, the most valuable private startup of all time. And Microsoft's pullback limits OpenAI's access to the infrastructure it needs to train and run its models, and thus, as mentioned, represents an existential threat to the company. Yet you're meant to believe that this is because of some sort of paternal desire to see OpenAI leave childhood behind, to spread its wings and enter the real world. Are you fucking stupid? Sorry, I shouldn't be calling people stupid. I shouldn't. I really shouldn't. But I am. More likely, Microsoft got what it needed out of OpenAI, which has reached the limit of the models it can develop, models which Microsoft, by the way, already owns the IP of due to their 2019 funding round. There's probably no reason for Microsoft to make any further significant investments, other than just kind of throwing a little cash in there, and then, I imagine, some sort of tax dodge. I'm just guessing. It's also important to note that absolutely nobody other than Nvidia is making any money from generative AI. CoreWeave loses billions of dollars, OpenAI loses billions of dollars, Anthropic loses billions of dollars, and I can't find a single fucking company providing generative AI-powered software that's actually making a profit. The only companies even close to doing so are consultancies providing services to train and create data for models, like Turing and Scale AI, and Scale isn't even fucking profitable.
Now, the knock-on effects of OpenAI's collapse will be wide-ranging. Neither CoreWeave nor Crusoe will have tenants for their massive, unsustainable operations, and Oracle will have nobody to sell compute to, because they've leased that thing for 15 fucking years for one customer. Who else is going to take that? Anyway, CoreWeave will likely collapse under the weight of its abominable debt, which will lead to a 6 to 7% or greater revenue drop for Nvidia at a time when revenue growth has already begun to slow. On a philosophical level, too, OpenAI's health is what keeps this industry alive. OpenAI has truly the only meaningful user base in generative AI, and this entire hype cycle has been driven by its success, meaning any deterioration or collapse of OpenAI will tell the market what I've been saying for over a year: that generative AI is not the next hyper-growth market, and its underlying economics do not make sense. But look, I'm not saying this to be a hater. I'm not saying this to be right. This stuff has driven me insane. But I'm not doing it to be a pundit, to be a skeptic, to be a cynic, to be someone that hates because I want to hate. And I hate them, not because I think it makes people like me; I hate them because I have brain worms. I have something wrong with me inside my brain that tells me I have to be like this, and I have to look at these things and try to find out what's going on. Otherwise I will be driven mad. Which is why I'll say: if something changes, if I'm wrong somehow, I promise you I will tell you exactly how, exactly why, and what mistakes I made to come to the conclusions I have in this episode and the episodes before. But I don't believe that my peers in the media will do the same when this collapses. I promise you, though, that they will be held accountable, because all of this abominable waste could have been avoided. Large language models are not, on their own, the problem.
They're tools capable of some outcomes, of doing some things. But the problem, ultimately, is the extrapolations made about their abilities and the unnecessary drive to make them larger, even if said largeness never really amounted to much. Everything that I'm describing is the result of a tech industry, including media and analysts, that refuses to do business with reality, trafficking in ideas and ideology, celebrating victories that have yet to take place, applauding those who have yet to create the things that they're talking about, cheering on men lying about what's possible so that they can continue to earn billions of dollars and increase their wealth and influence for barely any fucking reason. I understand why others might not have said what I've said. What I am describing is a systemic failure, one at a scale heretofore unseen, one that has involved so many rich and powerful and influential people agreeing to ignore reality. And it'll have crushing impacts for the wider tech ecosystem when it happens. Don't say I didn't warn you. Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski.com. You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and, of course, my newsletter. I also really recommend you go to chat.wheresyoured.at to visit the Discord, and go to r/betteroffline to check out our Reddit. Thank you so much for listening. Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Better Offline: Episode Summary
Title: OpenAI Is A Systemic Risk To The Tech Industry
Host: Ed Zitron (Cool Zone Media and iHeartPodcasts)
Release Date: April 18, 2025
In the second episode of his two-part series, Ed Zitron delves deep into the precarious financial and operational state of OpenAI, arguing that the organization's current trajectory poses a systemic risk to the entire tech industry. Zitron meticulously dissects the intricate web of funding, partnerships, and dependencies that make OpenAI a linchpin in the AI ecosystem, drawing alarming parallels to the 2007-2008 financial crisis.
Zitron begins by outlining OpenAI's massive $40 billion funding round, predominantly backed by SoftBank. However, he highlights critical stipulations tied to this investment:
Nonprofit to For-Profit Conversion:
"As part of its deal with SoftBank, OpenAI must also convert its bizarre nonprofit structure into a for-profit entity by December 2025 or it will lose $10 billion from that $40 billion round of funding." ([00:36])
Clawback Clauses:
Failure to transition by October 2026 allows investors from the previous $6.6 billion round to claw back their investments with interest, exponentially increasing OpenAI's financial burdens.
These conditions create a "nightmare scenario," potentially forcing OpenAI to dissolve, restructure, and sell or license its assets under unfavorable terms. Zitron underscores the complexity of this process, noting the numerous legal and regulatory hurdles involved.
Drawing an analogy to Lehman Brothers' collapse during the financial crisis, Zitron posits that OpenAI, while not the largest tech company, serves a similarly critical role due to its unique position:
Dominant Large Language Model (LLM) Provider:
"ChatGPT is the only large language model company with any meaningful user base." ([Various timestamps])
Financial Dependencies:
Major tech companies rely on OpenAI's capabilities for their own operations and expansion plans. The potential failure of OpenAI could ripple through the tech sector, affecting customers and partners alike.
Zitron discusses Oracle's $1 billion financial commitment to the Stargate data centers, built exclusively for OpenAI. Any failure by OpenAI to meet its obligations would leave Oracle grappling with massive debts.
The relationship between OpenAI and CoreWeave highlights vulnerabilities within the AI infrastructure:
CoreWeave's Dependency:
"Core Weave has become one of our earliest and largest compute partners... this deal... are pretty much critical to Core Weave's future." ([20:30])
Core Scientific's Lack of Expertise:
Core Scientific, tasked with building data centers for CoreWeave, has a dubious track record with no prior experience in AI data center construction, exacerbating the risk of failure.
Zitron draws a bleak picture of the financial sustainability of OpenAI and its ecosystem:
Debt and Capital Expenditures:
CoreWeave faces ballooning debt payments exceeding $2.4 billion annually, while OpenAI's relentless pursuit of compute resources fuels an unsustainable burn rate.
Nvidia's Exposure:
"Core Weave spends $8 billion on Nvidia, representing more than 6% of Nvidia's 2024 revenue." ([30:15])
The interconnectedness means that OpenAI's potential collapse could trigger a domino effect, severely impacting Nvidia, Oracle, CoreWeave, and beyond.
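The Nvidia exposure figure above can be sanity-checked with trivial arithmetic. This is a sketch: the $8 billion spend and the "more than 6%" share are from the summary, while reading the percentage as a share of Nvidia's annual revenue is an assumption.

```python
# Sanity check of the quoted claim: CoreWeave's $8B of Nvidia spending
# represents "more than 6%" of Nvidia's 2024 revenue. For that to hold,
# Nvidia's 2024 revenue must fall below $8B / 0.06.

coreweave_spend_bn = 8.0  # CoreWeave's Nvidia spend, in billions of dollars
share = 0.06              # the "more than 6%" figure, as a fraction

# Revenue ceiling implied by the claim.
implied_revenue_ceiling_bn = coreweave_spend_bn / share  # ~133.3

print(f"Claim holds if Nvidia 2024 revenue < ${implied_revenue_ceiling_bn:.1f}B")
```

A single customer's hardware spend amounting to several percent of the supplier's revenue is what makes the domino-effect argument in this section more than rhetorical.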
Zitron critically examines the competence of companies like Crusoe and Core Scientific in building AI-focused data centers:
Lack of Proven Capability:
"Crusoe does not appear to have built a single AI data center... Core Scientific, like Crusoe, does not have the expertise required." ([35:00])
Delays and Underperformance:
Projects like the Stargate data center are significantly behind schedule, casting doubt on their feasibility and raising questions about the overall strategy.
Microsoft's involvement with OpenAI is scrutinized, particularly regarding revenue recognition:
Compute Revenue Uncertainty:
Zitron questions whether Microsoft's Intelligent Cloud segment accurately reflects revenue from OpenAI's compute usage, pointing out inconsistencies and opaque accounting practices.
Strategic Pullback:
Microsoft's reduction in data center investments heightens the existential threat to OpenAI, limiting its access to essential infrastructure.
Zitron connects OpenAI's struggles to the wider tech investment landscape, highlighting declining deal volumes and investment stagnation:
This downturn is attributed to the overinflated hype around generative AI, primarily driven by OpenAI's dominant yet unstable position.
Zitron presents a grim forecast for OpenAI and the AI sector at large:
Unsustainable Growth Model:
OpenAI's need for continuous, massive fundraising is untenable, with projections indicating a requirement of at least $40 billion annually until 2030.
Systemic Collapse Risks:
The intertwined dependencies on underperforming partners and financial overextension make a collapse not just likely but catastrophic for the tech ecosystem.
"OpenAI is on course to run out of money or run out of compute capacity, and it's unclear which will happen first." ([42:10])
Ed Zitron concludes with a fervent warning to listeners and stakeholders:
Call to Awareness:
"It's time to wake up. Even in a hysterical bubble where everybody is agreeing that this is the future, OpenAI is currently requiring more money and more compute than is reasonable to acquire." ([43:50])
Accountability Demanded:
Zitron criticizes the media and industry for ignoring these red flags, asserting that the impending collapse should be a wake-up call to prevent widespread fallout.
Personal Commitment:
He vows to update his analysis should new information emerge, emphasizing the importance of recognizing and addressing these systemic risks.
In this episode, Ed Zitron provides a sobering analysis of OpenAI's structural and financial vulnerabilities, arguing that its potential downfall could have cascading effects across the tech industry. By meticulously detailing the dependencies and financial obligations tethering OpenAI, Zitron urges stakeholders to recognize and address these systemic risks before they culminate in a broader economic fallout.
For more insights and detailed analyses, visit Better Offline or follow Ed Zitron on his social platforms.