Ed Zitron (11:17)
The numbers in the TD Cowen report, which I'll link in the episode spreadsheet, of course, heavily suggest that Microsoft, the biggest purchaser of Nvidia GPUs and, according to TD Cowen, and I quote, "the most active data center lessee of capacity in 2023 and the first half of 2024," does not believe that there's future growth in generative AI. Nor does it have faith in, nor does it want responsibility for, the future of OpenAI. Data center build-outs take three to six years to complete, and the largest hyperscaler facilities can easily cost several billion dollars, meaning that these moves are extremely forward-looking. You don't just build a data center for the demand you have now, but for the demand you expect further down the line. This suggests that Microsoft believes its current infrastructure, and its likely scaled-back plans for expansion, will be sufficient for a movement that Satya Nadella once called a "golden age for systems." He said that less than a year ago. To quote TD Cowen again: "the magnitude of both potential data center capacity Microsoft walked away from and the decision to pull back on land acquisition, which supports core long term capacity growth, in our view indicates the loss of a major demand signal that Microsoft was originally responding to," and that they believe "the shift in their appetite for capacity is tied to OpenAI." To explain here, TD Cowen is effectively saying that Microsoft was responding to a major demand signal, and said major demand signal is now saying you do not need more data centers. Said demand signal that Microsoft was responding to, in TD Cowen's words, is its appetite for capacity to provide servers to OpenAI. And it seems that said appetite is waning, and Microsoft no longer wants to build out data centers for America's most swagged-out AI guy. Now, I say that kind of as a joke, and I do think that he's more swagged out than Mark Zuckerberg. Mark Zuckerberg's trying too hard. 
However, Sam Altman is more damp than Mark Zuckerberg, so ultimately Zuckerberg wins. Now, I want to make it clear that Microsoft is effectively cutting its data center expansion by over a gigawatt of capacity, if not more. And it's impossible to reconcile these cuts with the expectation that generative AI will be this massive, transformative technological phenomenon. I believe that the reason Microsoft is cutting back is that it does not have the appetite to provide further data center expansion for OpenAI, and that it's having doubts about the future of generative AI as a whole. If Microsoft believed that there was a massive opportunity in supporting OpenAI's further growth, or that it had massive demand for generative AI services, there'd be no reason to cancel capacity, let alone cancel such a significant amount. These moves also suggest that Microsoft is walking away from building and training further large frontier models, like OpenAI's GPT-4.5, and from supporting others doing so. Remember, Microsoft has significantly more insight into the current health and growth of generative AI than any other company. They have full access to all of OpenAI's tech, probably the future stuff too, and they know all their research as well. Microsoft knows something we don't. As OpenAI's largest backer and infrastructural partner, the owner of the server architecture where OpenAI trains those ultra-expensive models, and, not to mention, the largest shareholder in OpenAI, Microsoft can see exactly what is or isn't coming down the pike. That's on top of having a view into both the sales of its own generative AI-powered software, such as Microsoft 365 Copilot, and the sales of both model services and cloud compute for other models run on Microsoft Azure, its cloud platform. 
In plain English, Microsoft, which arguably has more data than anybody else about the health of the generative AI industry and its potential for growth, has decided that it needs to dramatically slow down its expansion. Now, to be clear, I really am hammering this home a lot, but I need you to understand that this expansion is absolutely necessary for generative AI to continue evolving and expanding, even if it only does so in ways that kind of do the same thing again and again. Before we move on, I want to make it clear that I'm not saying Microsoft has stopped building data centers. I've said it a few times: these projects take years, three to six years, to complete, and are planned far, far in advance. And Microsoft does have a few big projects in the works. One planned 324 megawatt Microsoft data center in Atlanta is expected to cost $1.8 billion, and as far as I know, this deal is still in flight. However, and this was cited separately by TD Cowen, Microsoft has recently paused construction on parts of its $3.3 billion data center campus in Mount Pleasant, Wisconsin. While Microsoft had tried to reassure locals that the first phase of the project was on course to be completed on time, its justification for delaying the rest of it was, well, not brilliant: it was to give Microsoft an opportunity to evaluate, and I quote, "the project's scope and recent changes in technology and consider how this might impact the design of its facilities." Oh, Bob. Oh, buddy, that's not good. You don't want it. No one evaluates a scope and then goes, oh, actually the scope is great, I love it. Nor do they think about impacts and go, oh, let's do more. No, no, no. Anyway, the same Register article I'm citing here adds that, and I quote, "the review process may include the need to negotiate some building permits, potentially placing another hurdle in the way of the project." 
The Register did add that Microsoft said it expected to complete one hyperscaler data center in Mount Pleasant as originally planned, though a capacity figure wasn't available. Arguably, Microsoft would expand its data center infrastructure anyway. As more stuff moves to the cloud and our dependence on it grows, Microsoft and other providers need to build capacity. That organic growth is natural and sadly inevitable. However, Microsoft's plans for data center expansion were far, far in excess of that natural growth, and perhaps we're seeing a pullback from those stated extravagances into something a little more reasonable. Now, I've talked a lot about megawatts and gigawatts, and if you're not in the data center business (and you should be, the parties are an absolute laugh), this can all seem a bit abstract, so let's put it into context. Without context, it's hard to understand how big a 100 megawatt data center is. These are some of the biggest. According to the International Energy Agency, small data centers can consume anywhere between 1 and 5 megawatts. These are, for the most part, average-sized facilities: perhaps not for cloud compute giants, but for other companies it's kind of par for the course. 100 megawatts, by comparison, is huge. It's the equivalent of the annual energy consumption of between 350,000 and 400,000 electric cars. And I know some sort of pedant is going to say it's not the same thing. Shut the fuck up. Go outside. Stop. Listen. Go outside. Go outside now. Go do something. Anyway. There are others that will likely dwarf what we today consider to be a large facility: Meta, for example, is in the process of constructing a $10 billion data center campus in Louisiana with a proposed 2 gigawatt capacity. Still, whatever way you cut it, a 100 megawatt facility is big, and it's a big long-term investment. 
Cushman & Wakefield's 2024 Global Data Center Market Comparison gives some chilling context into how significant Microsoft's pullback is. A gigawatt of data center capacity is roughly the entire operational IT load of Tokyo, which has a 1.028 gigawatt capacity, or London, which has 996 megawatts, or the Bay Area, which only has 842 megawatts. These are actually very large; it's just that Microsoft got rid of so much more. And again, the total figure of cancelled or abandoned capacity is likely far higher than a gigawatt. That number only accounts for the letters of intent that Microsoft allowed to expire. It doesn't include everything else, like the two data centers it already killed, or the land parcels it abandoned, or the deals that were in early to mid stages of negotiation. Imagine walking away from two Londons or two Tokyos of capacity and it not being a massive deal. This is a huge flipping deal. Microsoft is not simply walking back some future plans. It's effectively cancelling what it loudly insisted was the future. If you think this sounds hyperbolic, consider that London and Tokyo are, respectively, the biggest data center markets in Europe and Asia, according to the same Cushman & Wakefield report. Canceling cities' worth of capacity at a time when artificial intelligence is supposedly revolutionizing everything certainly suggests that artificial intelligence isn't really revolutionizing anything. Now, one other detail in TD Cowen's report really stood out to me. While there's pullback in Microsoft's data center leasing, it's also seen a commensurate rise in demand from Oracle related to the Stargate project, a relatively new partnership worth up to $500 billion. 
And stop saying it's $500 billion; it's up to $500 billion, to build massive new data centers for AI, specifically for one company, led by SoftBank and of course OpenAI, with investment from Oracle and MGX, a $100 billion investment fund backed by the United Arab Emirates. OpenAI has committed $18 to $19 billion to the Stargate project, money it doesn't have, meaning that part of the $25 to $40 billion that they're raising at the moment will be committed to funding these data centers, unless OpenAI raises more in debt. Leading the round is SoftBank, which is also committing $18 to $19 billion, as well as creating a joint venture fund called SB OpenAI Japan to offer OpenAI services to the Japanese market, something that I thought was already happening, as well as spending $3 billion annually to use OpenAI's technology across its group businesses, according to the Wall Street Journal. In simpler terms, SoftBank is investing as much as, I think it's going to be, like $30 billion in OpenAI, then spending another $3 billion a year on software that hallucinates and shows no sign of getting meaningfully better or more reliable. Whether SoftBank actually sees value in OpenAI's tech, or whether this purchase deal is a subsidy by the back door, is open to debate. Given that $3 billion is equivalent to OpenAI's entire revenue from selling premium access to ChatGPT in 2024, which included some major deals with the likes of PricewaterhouseCoopers, I'm inclined to believe the latter. Even then, how is it feasible that SoftBank can continue paying? To get the deal done, Microsoft changed the terms of its exclusive relationship with OpenAI to allow it to work with Oracle to build out further data centers full of GPUs necessary to power OpenAI's big, shitty, unprofitable and unsustainable models. 
The OpenAI-Oracle Stargate situation was a direct result, according to reporting from The Information, of OpenAI becoming frustrated with Microsoft for not providing it with servers fast enough, including an allotment of 300,000 of Nvidia's GB200 chips by the end of 2025. For what it's worth, the Wall Street Journal reports that Microsoft was getting increasingly frustrated with OpenAI's constant demands for more compute. The relationship between the two entities had started to fray, with both sides feeling kind of aggrieved. This, combined with Microsoft's data center pullback, heavily suggests that Microsoft is no longer interested in being OpenAI's infrastructure paypig long term. After all, if it was, it'd fund and support OpenAI's expansion rather than doing the literal opposite. And you have to wonder, when that whole non-exclusive thing came along, whether Microsoft was kind of like, no, no, you couldn't possibly, the paper's already at the pen, they've already got a stamp with Sam Altman's signature, oh, don't sign it, it'd be so bad. No, I really don't. I don't think that Microsoft's too caught up about that.