
Stargate gets off the ground, or at least announces it is. What if OpenAI leases the chips from Nvidia instead of buying them? I go a bit deep in this episode on the incentives behind all these AI moves. And what happens if AI makes translation as ubiquitous as spell check?
Welcome to the Tech Brew Ride Home for Wednesday, September 24, 2025. I'm Brian McCullough. Today, Stargate gets off the ground, or at least announces it is. What if OpenAI leases the chips from Nvidia instead of buying them? I go a bit deep in this episode on the incentives behind all these recent AI moves. And what happens if AI makes translation as ubiquitous as spell check? Here's what you missed today in the world of tech. OpenAI, Oracle and SoftBank have announced five new data center locations in the US as part of the Stargate initiative, boosting Stargate's planned capacity to nearly 7 gigawatts. Quoting Wired: "AI is different from the Internet in a lot of ways, but one of them is just how much infrastructure it takes," OpenAI CEO Sam Altman said during a press briefing in Abilene, Texas on Tuesday. He argued that the US cannot fall behind on this and that the innovative spirit of Texas provides a model for how to scale bigger, faster, cheaper, better. Three of the new sites, in Shackelford County, Texas; Doña Ana County, New Mexico; and a yet-to-be-disclosed location in the Midwest, are being developed in partnership with Oracle. The move follows an agreement Oracle and OpenAI announced in July to develop up to 4.5 gigawatts of US data center capacity on top of what the two companies are already building at the first Stargate facility in Abilene. OpenAI claims the new data centers, along with a planned 600 megawatt expansion of the Abilene site, will create more than 25,000 on-site jobs, though the number of workers required to build data centers typically dwarfs the number needed to maintain them afterwards. The two remaining sites are being helmed by OpenAI and SB Energy, a SoftBank subsidiary that develops solar and battery projects. These are located in Lordstown, Ohio and Milam County, Texas. Stargate is one of several major U.S. 
technology infrastructure projects that have been announced since President Donald Trump took office at the start of the year. OpenAI said in January that the $500 billion, 10 gigawatt commitment between the ChatGPT maker, SoftBank, Oracle and MGX would, quote, secure American leadership in AI and create hundreds of thousands of American jobs. OpenAI initially framed Stargate as a new company that would be chaired by SoftBank CEO Masayoshi Son. Now, however, executives close to the project say it's an umbrella brand name used to refer to all of OpenAI's data center projects except those developed in partnership with Microsoft. The flagship site in Abilene is primarily owned and operated by Oracle, with OpenAI acting as the primary tenant, according to executives close to the project. The build-out, which is being managed by the data center startup Crusoe, is on track to be completed by mid-2026, sources close to the project say. It is already running on Oracle cloud infrastructure and supporting OpenAI training and inference workloads, those sources add. Oracle is in the process of standing up eight data center halls in Abilene, which will each support roughly 100 megawatts of power. One of those buildings is currently complete, and another one is close to being finalized. When it is done, the facility will house more than 400,000 GPUs and support more than 1.4 gigawatts of power, sources close to the project say. The initial Stargate announcement called for companies with land, equipment and other relevant resources to get involved. After it was published, OpenAI executives say they were inundated with messages from firms that wanted to participate. One executive estimates that OpenAI has surveyed around 700 sites in the US where data centers could potentially be built for the five projects announced Tuesday. OpenAI claims it reviewed more than 300 proposals together with its partners from roughly 30 states. 
In a policy white paper released Tuesday, OpenAI framed AI infrastructure as a crucial tool the United States will need to beat China and become a manufacturing powerhouse. Today, Communist Party-led China is developing electricity resources at unmatched velocity while the US falls behind, the document argues. China's immense electricity consumption underscores not only its industrial dominance, but its ability to rapidly deploy AI infrastructure, including new data centers, semiconductor fabs and manufacturing hubs. Again, to underline this: Oracle will pay for and oversee the construction of three of these new Stargate data centers, and OpenAI will then purchase computing power from Oracle. And on top of that news, news that Oracle is looking to borrow $15 billion through the sale of US corporate bonds as it begins to fulfill those massive cloud infrastructure deals with OpenAI. Meanwhile, sources say OpenAI and Nvidia are discussing structuring their new AI data center partnership so that OpenAI would lease Nvidia's AI chips instead of buying them. Quoting The Information: Leasing server chips from Nvidia could ease the financial burden on OpenAI, which is already burning billions of dollars in cash a year due to high computing costs. OpenAI has estimated a leasing arrangement would lower the cost of the server chips by 10 to 15% compared to buying them, according to one of the people, though it isn't clear how that was calculated. And by renting the hardware, OpenAI wouldn't need to raise tens of billions of dollars to purchase it outright. It also frees OpenAI from the risk that the chips could become outdated sooner than expected. OpenAI already rents Nvidia chip servers from Microsoft and Oracle. It isn't clear how the cost of a leasing arrangement with Nvidia would compare to what OpenAI is already spending on server rentals. The leasing deal could be structured to minimize risks for Nvidia as well. 
Nvidia could set up an entity that borrows money to buy the servers, using the chips as collateral. OpenAI's lease payments could go toward paying back the loan. That type of deal is the only viable path to building enough data centers for AI, said Aaron Ginn, the CEO of Hydra Host, whose software helps data center firms generate revenue from Nvidia chips. He isn't involved in the deal, end quote. Actually, there are plenty of uncertainties surrounding Nvidia's $100 billion investment in OpenAI, as a big piece in the Financial Times notes. Again, it raises questions about a seemingly circular arrangement here: Nvidia finances OpenAI, which then spends heavily on Nvidia hardware. Critics are flagging this structure, but analysts say Nvidia's surging cash flow can support it. And if fully realized, the buildout could yield hundreds of billions of dollars in revenue for them. Nvidia's stock has risen about 10x since ChatGPT's late 2022 launch, and the partnership would further entrench Nvidia as indispensable AI infrastructure. OpenAI, meanwhile, is hedging, striking a custom chip pact with Broadcom, yet remains capacity constrained despite claiming 700 million weekly ChatGPT users. The plan, remember, targets at least 10 gigawatts of compute, which the IEA says would consume as much electricity annually as 10 million US homes. Jensen Huang has estimated in the past that each gigawatt requires roughly $50 billion in hardware, spanning Nvidia processors, networking and servers from partners like Foxconn, HP, Dell and Supermicro. OpenAI says this is separate from Stargate, which we just discussed. The wager intensifies the AI arms race, even as Bain is warning the sector may undershoot the revenues needed to support the projected $500 billion in annual AI capex they expect by 2030. In other words, there's still no guarantee that the revenue will be there to pay for all of this. 
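To put those two figures together, here's some purely back-of-the-envelope arithmetic on the numbers quoted above (the 10 gigawatt target and Huang's rough $50 billion-per-gigawatt estimate, taken at face value — these are not company-confirmed totals):

```python
# Napkin math on the figures cited in this segment.
# Assumptions: OpenAI's stated 10 GW compute target and Jensen Huang's
# rough $50 billion-per-gigawatt all-in hardware estimate.

gigawatts_planned = 10          # stated compute target
cost_per_gw_usd = 50e9          # rough hardware estimate per gigawatt

total_hardware_usd = gigawatts_planned * cost_per_gw_usd
print(f"Implied hardware spend: ${total_hardware_usd / 1e9:,.0f} billion")  # $500 billion

# Electricity: 10 GW running around the clock for a year
hours_per_year = 24 * 365
annual_twh = gigawatts_planned * hours_per_year / 1_000  # GWh -> TWh
print(f"Annual consumption at full load: ~{annual_twh:.0f} TWh")  # ~88 TWh
```

That implied $500 billion in hardware happens to land right at the annual AI capex figure Bain projects for 2030, which is part of why the revenue question looms so large.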
Still, Nvidia's CUDA software ecosystem, akin to Microsoft's and Apple's platform lock-in, makes switching costly, bolstering Nvidia's bet that access to infrastructure, not ideas, will be the ultimate gate to AI progress. So sure, look, all of this AI stuff might just be a big bubble. There's no guarantee yet that even if you build all this out, you can get enough revenue from actual, you know, AI usage to cover the costs. But what you have to keep in mind about all of this is the incentives here for the big players, the folks making these decisions. Like: announce you're spending big on AI, your stock goes up. If you announce nothing, if you sit on your hands, your stock will get killed because the market will fear you're falling behind in AI. Here's just the latest example of that. Alibaba's Hong Kong-listed shares hit a nearly four-year high after CEO Eddie Wu announced plans to increase AI spending beyond the $53 billion target already announced over three years. The market rewards him for saying, no, no, actually we're going to spend even more on AI than we thought. Quoting Bloomberg: Chief Executive Officer Eddie Wu anticipates overall investment in artificial intelligence accelerating to some $4 trillion worldwide over the next five years. And Alibaba needs to keep up. The company will soon add to a plan laid out in February to spend more than 380 billion yuan, or $53 billion, developing AI infrastructure over three years, he said. His cloud division, which already operates services from the US to Australia, intends to launch its first data centers in Brazil, France and the Netherlands in the coming year. Wu made his projections while outlining plans to roll out Qwen models and full-stack AI technology, reflecting Alibaba's growing ambitions to both develop services and the infrastructure, such as chips, that underpin the technology. Its shares rose as much as 9.7% in Hong Kong, helping lift Chinese chip makers ACM Research and Naura Technology. 
In the most recent quarter, Alibaba reported triple-digit growth in its AI-related products. Its cloud division also posted a better-than-expected 26% jump in sales, making it the group's fastest-growing unit. The company's stock has more than doubled this year. Companies only gain confidence to invest more when the visibility of the returns improves, said Vey-Sern Ling, managing director at Union Bancaire Privée. So when they say they are raising investments in AI, it indicates good demand from customers and good ROI. So either everybody is spending because they are seeing tangible signs of AI juicing business, or else it's just a prisoner's dilemma thing: you can't be seen to be falling behind. Maybe this isn't just another tech fad. Maybe this is a fundamental revolution in the entire economy, and the foundations of that are being laid now. Like a new industrial revolution. But also remember what everybody learned from the last 25 years of the tech revolution. Basically, only five big players won. I mean, a lot of people won. But if you look at the overall market, five big players basically control, what, 80% of it? And those five players basically had insane margins, printed money, had insurmountable moats, and were thus able to spread into adjacent markets and business lines. Again, incentives. Fear is an incentive. The fear here would be: if this plays out like the tech revolution of the last 25 years did, and if you're not one of the biggest five winners, or even two or three winners, you'll be roadkill.
Eczema isn't always obvious, but it's real. And so is the relief from Ebglyss. After an initial dosing phase, about 4 in 10 people taking Ebglyss achieved itch relief and clear or almost-clear skin at 16 weeks. And most of those people maintained clearer skin at one year with monthly dosing.
Ebglyss (lebrikizumab-lbkz), a 250 mg/2 mL injection, is a prescription medicine used to treat adults and children 12 years of age and older who weigh at least 88 pounds, or 40 kilograms, with moderate to severe eczema, also called atopic dermatitis, that is not well controlled with prescription therapies used on the skin, or topicals, or who cannot use topical therapies. Ebglyss can be used with or without topical corticosteroids. Don't use if you are allergic to Ebglyss. Allergic reactions can occur that can be severe. Eye problems can occur. Tell your doctor if you have new or worsening eye problems. You should not receive a live vaccine when treated with Ebglyss. Before starting Ebglyss, tell your doctor if you have a parasitic infection. Searching for real relief?
Ask your doctor about Ebglyss, and visit Ebglyss.com or call 1-800-LillyRx, 1-800-545-5979.
Curious about the digital infrastructure that powers today's biggest tech? Then check out the Interconnected podcast. It's a new series from Equinix that explores emerging tech megatrends and the infrastructure making them possible. Through candid conversations with industry experts, the show breaks down highly complex topics in a way that's easy to understand. In the very first episode, covering AI in the medical field, the hosts do a great job helping listeners understand the computational models and interconnected data systems powering today's most promising medical developments. From there, they reveal the connection between this infrastructure and the groundbreaking healthcare breakthroughs we're seeing lately, like rapid diagnostics, personalized medicine, and supercharged drug discovery. Stay tuned into the digital infrastructure powering today's biggest tech trends with Interconnected. Give it a listen and follow the show on YouTube, Apple, Spotify, or wherever you get your podcasts.
I'm Christian McCaffrey, pro running back, and Abercrombie is an official fashion partner of the NFL. I'm not kidding when I say NFL by Abercrombie broke the Internet last year, and I think this season's lineup is even cooler. And so does my wife, who keeps stealing all my hoodies. Stay fit for the season in Abercrombie's newest arrivals. Shop NFL by Abercrombie in the app, online and in store.
Meanwhile, listen to this from Bloomberg about how frothy things are getting if you're the founder of the hottest of the hot AI startups. You know, things like: how about you can use our own private jet if you sign on the dotted line? Quote: Raising venture capital money has been a breeze for Decagon AI, a two-year-old startup developing artificial intelligence tools for customer service. All four of Decagon's funding rounds, totaling more than $230 million, were preempted, meaning firms like Andreessen Horowitz offered to invest before the company started fundraising. Now, just three months after a round that valued it at $1.5 billion, Decagon is fielding unsolicited offers at valuations as high as $5 billion, according to people familiar with the matter. For Decagon and a few dozen other top AI startups, the fundraising script has been flipped. Instead of pitching venture capitalists up and down Sand Hill Road, VCs are pitching them, hovering with gifts and favors in hopes of leading their next round. Jesse Zhang, the 28-year-old co-founder and chief executive officer of Decagon, says investors hoping to back him have offered him everything from tickets to Golden State Warriors games to an autographed poster of an MMA legend. One investor even folded origami cranes into a mosaic of Decagon's logo and hand-delivered it to the company's San Francisco office with a term sheet hidden inside. Decagon took the deal. Investors are emailing term sheets, they're giving verbal offers, they're inviting founders to sports games, they're inviting founders to race Ferraris, and they're inviting them on private jets, said Bennett Siegel, a co-founder of investment firm Asterix and an early investor in Decagon. What you tend to see is the best companies are getting preempted every round, and the time between rounds is shrinking. 
The lavish VC overtures are part of a larger Silicon Valley frenzy for AI, driven by startups' astonishing revenue growth and investors' belief that these companies can dethrone tech giants. US startups have raised around $200 billion this year so far, according to PitchBook data, but 41% of that went to just 10 companies, highlighting VCs' relentless focus on a small group of AI frontrunners like OpenAI and Anthropic. Last year, the share of funding that went to the top 10 companies was less than 25%, end quote. Purely from my Consumer Reports-slash-household-budgeting file, quoting TheWrap: Disney+, Hulu and ESPN Select plans will get more expensive starting in October, with notifications rolling out to subscribers on Tuesday, TheWrap has learned. The changes are set to go into effect on October 21st for new subscribers. Current subscribers will see the change on their first billing date on or after October 21st. The pricing changes will be as follows: Disney+ with Ads will increase by $2 to $11.99 per month. Disney+ Premium will increase by $3 to $18.99 per month and by $30 to $189.00 per year. Hulu with Ads will increase by $2 to $11.99 per month and by $20 to $119.99 per year. ESPN Select will increase by $1 to $12.99 per month and by $10 to $129.99 per year. Actually, I'm not going to read the whole thing, because basically everything you might have subscribed to over there is going up. Though, things like the big daddies: the Hulu with Ads, Live TV, Disney+ with Ads and ESPN Select with Ads bundle will increase by $7 to $89.99 per month. Cable-bundle prices, people. Finally today, here's another way to look at this AI moment. There was a time when spell checkers were a gee-whiz technology, and then spell checking spread everywhere that anyone, at any point, manipulates text. And now it's just table stakes for any application doing anything with text at all. What if AI is sort of like that? Here's an example. 
Meta has started rolling out built-in message translation in WhatsApp, with support for more than 19 languages on iOS and six on Android. AI is making instant translation table stakes for any app, even video, even IRL social interactions. Quoting The Verge: It can be activated by long-pressing on messages and tapping the Translate option to choose the language you want the message to be translated from or to. Support for English, Spanish, Hindi, Portuguese, Russian and Arabic will initially be available for Android users, while iPhone users can translate messages into more than 19 languages at launch. Android users can also enable automatic translation for entire chat threads to apply the feature to all incoming messages. It's not real-time translation in your ears, in front of your eyes or on your screen, and other messaging apps have also offered similar features. Google used WhatsApp to demonstrate Tap to Translate on Android nearly a decade ago. But for WhatsApp's billions of users, it could make conversations easier as they move throughout the world. WhatsApp says it plans to expand translation support for additional languages in the future. End quote. I've said before that if somebody had rolled out a universal translator back in, say, 1986, they'd be on the cover of Time magazine. They'd win the friggin Nobel Prize. Today, universal translation is just gonna be everywhere. Talk to you tomorrow.
Date: September 24, 2025
Host: Brian McCullough
This episode of Tech Brew Ride Home dives deep into the economic incentives and infrastructural maneuvers fueling the current artificial intelligence (AI) arms race, focusing particularly on OpenAI’s Stargate initiative, new models of funding for massive AI infrastructure, the latest frenzied investment activity in hot AI startups, and the normalization of AI-powered features like translation. Brian explores not only the “what” of today’s AI escalation but, crucially, the “why”—examining how competitive and financial incentives shape industry decisions, echoing lessons from the last quarter-century of tech disruption.
Announcement & Scale:
OpenAI, alongside Oracle and SoftBank, announces five new US data centers under the "Stargate" banner, bumping planned capacity to nearly 7 gigawatts.
[00:04]
“OpenAI, Oracle and SoftBank have announced five new data center locations in the US as part of the Stargate initiative, boosting Stargate's planned capacity to nearly 7 gigawatts.”
Locations:
Major sites in Texas, New Mexico, the Midwest (undisclosed), Ohio, and another Texas county.
Three built with Oracle, remaining two with SB Energy (SoftBank’s renewables arm).
Partnerships & Structure:
Stargate once appeared as a standalone OpenAI/SoftBank company but is now more of an umbrella brand for OpenAI infrastructure outside of its Microsoft partnership. Oracle will fund and operate several sites, leasing compute to OpenAI; OpenAI has surveyed hundreds of locations for expansion.
Scale of Construction:
The flagship Abilene, Texas, facility will support over 400,000 GPUs and 1.4 gigawatts of power at completion, with Oracle already operational in at least one new data center hall.
Job Creation Claims:
OpenAI touts 25,000 jobs from construction—though most are temporary and far fewer are needed for ongoing operation.
Geopolitical Framing:
In a fresh white paper, OpenAI casts this infrastructure as critical for the US to “beat China” in AI, citing Chinese dominance in energy and rapid infrastructure deployment.
[00:04]
"In a policy white paper released Tuesday, OpenAI framed AI infrastructure as a crucial tool the United States will need to beat China and become a manufacturing powerhouse."
OpenAI–Nvidia Deal Structure:
Negotiations underway for OpenAI to lease, not own, Nvidia’s AI chips—potentially easing the up-front cash burden and mitigating hardware obsolescence risk.
[06:15]
"Leasing server chips from Nvidia could ease the financial burden on OpenAI, which is already burning billions of dollars in cash a year due to high computing costs."
Financial Rationale:
Leasing could shave 10–15% off effective hardware costs and sidestep the need to raise tens of billions for capex.
Industry Implications:
NVIDIA’s possible financing play—an entity buys the chips, OpenAI pays via lease—could become standard for titanic-scale AI builds.
Bain warns the industry may not generate enough revenue to support projected $500B in annual AI spending by 2030—the sector could be building for demand that never fully materializes.
Circular Incentives & Criticisms:
The cash loop of “Nvidia funds OpenAI, OpenAI buys Nvidia gear” prompts concern about sustainability and market concentration, but, so far, Nvidia’s ballooning profits allay fears.
[07:30]
“It raises questions about a seemingly circular arrangement here. Nvidia finances OpenAI, which then spends heavily on Nvidia hardware. Critics are flagging this structure, but analysts say Nvidia's surging cash flow can support it… the buildout could yield hundreds of billions of dollars in revenue for them.”
Stock Markets Reward AI Announcements:
AI capex disclosures reliably boost stock prices—even in the absence of clear ROI.
Case Study: Alibaba
After CEO Eddie Wu upped AI investment plans beyond $53B, Alibaba shares jumped nearly 10%; the firm’s cloud and AI units are reporting record growth, spurring similar behavior among competitors.
[09:40]
"Alibaba's Hong Kong listed shares hit a nearly four year high after CEO Eddie Wu announced plans to increase AI spending beyond the $53 billion target already announced over three years. The market rewards him for saying, no, no, actually we're going to spend even more on AI than we thought."
Fundamental Question:
Brian wonders if this is rational, based on demonstrated business value—or if it’s a “prisoner’s dilemma” where no company dares risk being left behind in AI, regardless of cost or clear ROI.
[10:53]
"So either everybody is spending because they are seeing tangible signs of AI juicing business, or else it's just a prisoner's dilemma thing. You can't be seen to be falling behind."
The Tech Lesson:
The episode reiterates the winner-takes-all nature of previous tech revolutions, arguing that the fear of missing out on AI’s next kingship is itself a driving force.
[11:00]
"What everybody learned from the last 25 years of the tech revolution. Basically, only five big players won... if you're not one of the biggest five winners, or even two or three winners, you'll be roadkill."
Venture Capital Courtship:
Citing Bloomberg, Brian relays an overheated climate where top AI founders—like Decagon AI’s Jesse Zhang—are chased by eager VCs luring them with private jets, game tickets, and even origami art with term sheets hidden inside.
[13:27]
“Investors are emailing term sheets, they're giving verbal offers, they're inviting founders to sports games, they're inviting founders to race Ferraris, and they're inviting them on private jets... The best companies are getting preempted every round, and the time between rounds is shrinking.”
Funding Concentration:
Of roughly $200B in US startup funding this year, 41% went to just ten companies (including megastars OpenAI and Anthropic), up from less than 25% the year before.
AI-Powered Translation in Messaging:
Meta adds built-in translation to WhatsApp, making real-time, cross-language chat seamless for billions—a feature now as expected as spell-check.
[14:56]
“Meta has started rolling out built in message translation in WhatsApp... AI is making instant translation table stakes for any app, even video, even IRL social interactions.”
The Big Picture:
Features that were once science fiction (universal translators) are now so common as to be almost invisible; AI is taking on a role similar to electricity or spell-check—basic digital infrastructure.
[16:00]
“I've said before that if somebody had rolled out a universal translator back in, say, 1986, they'd be on the cover of Time magazine. They'd win the friggin Nobel Prize. Today, universal translation is just gonna be everywhere.”
On AI Infrastructure:
“AI is different from the Internet in a lot of ways, but one of them is just how much infrastructure it takes.”
— Sam Altman (relayed by Brian, [00:13])
On Market Incentives:
“Announce you're spending big on AI. Your stock goes up. If you announce nothing, if you sit on your hands, your stock will get killed because the market will fear you're falling behind in AI.”
— Brian McCullough, [08:45]
On AI-Fueled Investment Mania:
“VCs are pitching them... Investors are emailing term sheets... What you tend to see is the best companies are getting preempted every round, and the time between rounds is shrinking.”
— Bennett Siegel, co-founder of Asterix ([13:27], quoted by Brian)
On AI as Table Stakes:
“What if AI is sort of like that [spellcheck]? ... Universal translation is just gonna be everywhere.”
— Brian McCullough, [16:00]
Brian’s delivery is brisk, informed, and slightly wry. He isn’t afraid to voice skepticism about bubble-like conditions or to contextualize rapid innovations with bigger, long-term trends from tech history. Frequent use of quotes and attributions backs his points with credible sourcing.
In "The AI Incentives Game," Brian McCullough provides a tight, revealing look at how the AI boom’s underlying financial incentives—not just technological advances—are defining today’s tech titans and shaping the infrastructure of tomorrow. Whether you’re tracking industry giants’ datacenter land grabs, marveling at the bizarre perks VCs are offering AI founders, or just marveling that real-time translation is now mundane, this episode offers a crisp, comprehensive rundown of just how—and why—the AI arms race is accelerating.