
Disney signs a blockbuster deal to license characters to OpenAI and invest $1 billion in the company. Oracle as the new bellwether for thinking about OpenAI’s prospects. More on the whole Data Centers In Space phenomenon. And let me introduce you to the Model Context Protocol to make the web safe for AI agents.
The Uniswap Wallet makes crypto easier and safer to own and use. Discover new tokens, research confidently, swap instantly, and manage it all securely in one place. The Uniswap trading protocol has powered over $3 trillion in volume and it's trusted by millions worldwide. Buy your first crypto assets in a few taps and experience the freedom of decentralized finance with Uniswap. Tap the banner to get started.
Welcome to the Techmeme Ride Home for Thursday, December 11, 2025. I'm Brian McCullough. Today: Disney signs a blockbuster deal to license characters to OpenAI and invest $1 billion in the company. Oracle as the new bellwether for thinking about OpenAI's prospects. More on the whole data centers in space phenomenon. And let me introduce you to the Model Context Protocol to make the web safe for AI agents. Here's what you missed today in the world of tech.
You may have noticed that your customers love webinar and video content, but if you've ever put together a webinar or a video, then you know that that can eat up a lot of your time and budget. But now, thankfully, there's a singular tool that can streamline your team's video and webinar workflows: Wistia. Wistia can scale your content output with AI-powered tools that help you create, edit and repurpose videos and webinars fast. And speaking of webinars, you can host engaging, easy-to-set-up webinars in Wistia, complete with built-in analytics. With Wistia you don't have to pay for multiple video tools, hop between platforms or constantly re-upload files. Create, edit, collaborate and publish all in one place. Head to wistia.com/brew to learn more. That's W-I-S-T-I-A dot com slash brew. With Wistia you can expect less work and more plays.
Disney and OpenAI this morning announced a deal to bring more than 200 Disney characters to Sora. This is part of a deal where Disney is also making a $1 billion investment in OpenAI and will become a quote major OpenAI customer. Quoting Variety: the Walt Disney Company and OpenAI have reached an agreement for Disney to become the first major content licensing partner on Sora, OpenAI's short-form generative AI video platform, bringing these leaders in creativity and innovation together to quote, unlock new possibilities in imaginative storytelling, end quote. As part of this new three-year licensing agreement, Sora will be able to generate short user-prompted social videos that can be viewed and shared by fans, drawing from a set of more than 200 animated, masked and creature characters from Disney, Marvel, Pixar and Star Wars, including costumes, props, vehicles and iconic environments. In addition, ChatGPT images will be able to turn a few words by the user into fully generated images in seconds drawing from the same intellectual property. The agreement does not include any talent, likenesses or voices. Among the characters fans will be able to use in their creations are Mickey Mouse, Minnie Mouse, Lilo, Stitch, Ariel, Belle, Beast, Cinderella, Baymax, Simba and Mufasa, as well as characters from the worlds of Encanto, Frozen, Inside Out, Moana, Monsters, Inc., Toy Story, Up, Zootopia, and many more, plus iconic animated or illustrated versions of Marvel and Lucasfilm characters like Black Panther, Captain America, Deadpool, Groot, Iron Man, Loki, Thor, Thanos, Darth Vader, Han Solo, Luke Skywalker, Leia, the Mandalorian, Stormtroopers and Yoda. Alongside the licensing agreement, Disney will become a major customer of OpenAI, using its APIs to build new products, tools and experiences, including for Disney Plus, and deploying ChatGPT for its employees.
As part of the agreement, Disney will make a $1 billion equity investment in OpenAI and receive warrants to purchase additional equity.
And quoting the Wrap: the rapid advancement of artificial intelligence marks an important moment for our industry, and through this collaboration with OpenAI, we will thoughtfully and responsibly extend the reach of our storytelling through generative AI while respecting and protecting creators and their works, Disney CEO Bob Iger said in a statement. Bringing together Disney's iconic stories and characters with OpenAI's groundbreaking technology puts imagination and creativity directly into the hands of Disney fans in ways we've never seen before, giving them richer and more personal ways to connect with the Disney characters and stories they love. End quote. Iger teased working with generative AI companies during Disney's earnings call for investors last month, and said that it would permit AI-generated videos on Disney Plus. We've been in some interesting conversations with some of the AI companies, and I would characterize some of them as quite productive conversations as well, seeking to not only protect the value of our IP and our creative engines, but also to seek opportunities for us to use their technology to create more engagement with consumers, he said. The agreement comes after Disney has struggled internally with figuring out the best use cases for generative AI. As the Wrap reported last month, the company has focused on protecting its intellectual property from unauthorized uses by generative AI companies, such as its lawsuits against Midjourney and Minimax, but its own efforts were dealt a blow after its vice president of AI was shown the door in the summer, end quote.
We don't usually talk about Oracle earnings, but Oracle is one of those stocks that has become sort of a proxy for how the market is thinking about AI, and especially investing in AI capex. So given the fact that Oracle is down 15% this morning after reporting earnings yesterday that came in below estimates, plus a forecast of AI capex spend of around $50 billion, up from the $35 billion they estimated as recently as September, well, I think we've got to talk about it. Quoting the Journal: OpenAI is privately held, so investors have responded by selling off other companies with significant exposure to the startup as they increasingly start to get nervous about OpenAI's prospects. Few have been punished as hard as Oracle ahead of its financial second quarter report Wednesday afternoon. The stock had plunged 32% over the past three months, the third worst performance on the S&P 500 for that period, according to FactSet. That is a particularly brutal turn for a company that was nearing a $1 trillion market cap on the premise that AI computing was going to double the size of its business over the next few years. Oracle hasn't backed off that goal, but the company's latest report also didn't settle the growing anxiety investors have about what it will take to get there. Revenue grew 14% year over year, Oracle's best quarterly growth in nearly three years, but came in slightly under Wall Street's forecast. Oracle also added nearly $68 billion to its revenue backlog thanks to new deals with customers like Meta Platforms and Nvidia, but the company already tipped that news during an analyst meeting in October. The biggest surprise turned out to be the wrong kind. Oracle spent a record $12 billion in capital expenditures in the November-ended quarter, which was far higher than the $8.4 billion Wall Street was expecting. It also boosted its full-year capital expenditure forecast from $35 billion to $50 billion.
Oracle's downtrodden stock lost another 15% in trading early Thursday morning. An annual capex bill of $50 billion might seem light relative to what some of the other big tech companies are sending out the door, but that equates to 75% of Oracle's projected revenue for the current fiscal year, a staggering amount considering that capex has averaged about 17% of annual revenue over the past five years. Meta Platforms, the most aggressive of the mega-cap tech companies in its relative AI investments, is expected to spend about 36% of this year's revenue on capex. Oracle's investment isn't just to service OpenAI. But the ChatGPT owner still accounts for the majority of Oracle's $523 billion in remaining performance obligations, which refers to contracted revenue not yet recognized. And that revenue backlog is nearly nine times the size of Oracle's current annual revenue, a far greater ratio than at competing cloud providers Microsoft, Google and Amazon. Microsoft, which is OpenAI's main computing partner, has a backlog that is only about 1.4 times as large as its revenue over the past four quarters. In other words, most of Oracle's growth over the next few years still depends on OpenAI. That won't be easily diversified, since few others can afford to commit to such sums. And it is far from certain that OpenAI will be able to live fully up to its commitments, especially if AI demand falters overall or ascendant challengers like Google and Anthropic supplant ChatGPT's position. In a note to clients last week, Gil Lauria of DA Davidson said Oracle needed to use its quarterly report quote to address concerns about the tricky balance of borrowing money to build out capacity for OpenAI, with the new understanding there is very low likelihood OpenAI will live up to its obligations. That didn't happen.
Oracle has now burned a little over $13 billion in cash over the past four quarters and has about $88 billion in debt net of cash, a sharp contrast to the net cash positions of its big tech rivals. In a report last week, Moody's noted that Oracle has the highest exposure to OpenAI and the weakest credit metrics among investment-grade hyperscalers. Oracle did say Wednesday that it intends to preserve its investment-grade rating as it finances the AI build-out, but investors are clearly tiring of seeing the money go only one way, end quote.
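As a back-of-the-envelope sanity check, the ratios quoted above can be reproduced from the article's own figures. The "implied" revenue numbers below are derived from those stated ratios, not reported directly by Oracle:

```python
# Back-of-the-envelope check of the Oracle figures quoted above.
# All dollar amounts are in billions; "implied" values are derived from
# the article's stated ratios, not reported figures.

capex_forecast = 50.0            # full-year capex forecast
capex_share_of_revenue = 0.75    # "75% of Oracle's projected revenue"
implied_fy_revenue = capex_forecast / capex_share_of_revenue
print(f"Implied projected FY revenue: ${implied_fy_revenue:.1f}B")

backlog = 523.0                  # remaining performance obligations
backlog_multiple = 9             # "nearly nine times" current annual revenue
implied_annual_revenue = backlog / backlog_multiple
print(f"Implied current annual revenue: ${implied_annual_revenue:.1f}B")

# Microsoft's backlog is only ~1.4x its trailing revenue, per the article.
msft_multiple = 1.4
print(f"Oracle's backlog multiple vs Microsoft's: {backlog_multiple / msft_multiple:.1f}x larger")
```

Running the numbers this way makes the journal's point concrete: a backlog at roughly nine times revenue, against a peer at 1.4 times, is why OpenAI's health dominates the Oracle story.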
We all remember the choices that shaped the course of our lives in business. World-renowned venture capital firm Sequoia Capital calls them Crucible Moments. Their podcast brings you inside the pivotal decisions that defined some of today's most influential companies. Hosted by Sequoia's Roelof Botha, Crucible Moments Season 3 pulls back the curtain on the untold stories behind companies like Stripe, Zipline, Palo Alto Networks, Klarna, Supercell and more. I loved the recent episode with the founder of Zipline and how, even late to the game, they are leapfrogging the bigger players to bring true autonomous drone delivery not just to hospitals, but now to customers of the likes of Walmart and Chipotle. I might win that burrito delivery bet someday soon. Tune in to Sequoia's new season of Crucible Moments to discover how some of the most transformational companies of the modern era were built. Crucible Moments is available everywhere you get your podcasts and at CrucibleMoments.com.
Keeping pace with data growth in the age of AI is like trying to find enough shelf space after a trip to a big box store. AI and data growth have outpaced the old storage model. Manual management of traditional storage can't keep up, so it's time for a new, unified approach from Pure Storage. They help organizations simplify and automate how data is stored and managed, eliminating silos and putting intelligence at the center of operations. When you don't know where data lives or how it's used, governance slips, and visibility and compliance can become constant challenges. The Pure Storage platform unifies storage into a single, intelligent layer that can turn data into a governed, virtualized cloud of data with guaranteed outcomes. Learn more at PureStorage.com/MorningBrew. That's PureStorage.com/MorningBrew.
Wow, two days in a row. This is sort of a trend of letting users have more control over their algos. Spotify says it is testing something called Prompted Playlists, which will let users describe what they want to hear and receive a unique set of songs based on their listening history. Quoting TechCrunch: Spotify announced on Wednesday that for the first time, it's giving users more control over the streaming service's algorithm. That's at least how the company is framing the launch of its new Prompted Playlists, a feature that will initially be available to premium subscribers in New Zealand. The feature, which is currently available in English only, is still in beta and will evolve before rolling out to other markets, according to Spotify. The new tool allows users to describe what they want to hear in a personalized playlist that reflects the full arc of their tastes, according to the company. That means the playlist focuses not only on the songs you like now, but your entire Spotify listening history from day one, something that differentiates the playlist from other playlist features. The company says the feature is an evolution from Spotify's existing AI playlist option, which debuted last year and also works through written prompts. As with AI playlists, the new prompted playlists allow users to request what they want to hear with written instructions. However, they can now write much longer prompts with more specific instructions. That's because the new AI feature factors in world knowledge, a rep from Spotify explained to TechCrunch. In addition, the ability to go further back in your listening history and schedule how often the playlist refreshes makes it different from Spotify's other AI playlist offerings. For instance, Spotify suggests subscribers can use the new feature to ask for something like music from my top artists from the last five years, then amend the prompt to include a request for deep cuts I haven't heard yet.
In another example of a longer prompt, Spotify said you could ask for high-energy pop and hip-hop for a 30-minute 5K run that keeps a steady pace before easing into relaxing songs for a cool down, or music from this year's biggest films and most talked about TV shows that match my taste. In addition, you can continue to fine-tune the prompt to make it even more specific, and can set how often you want it to refresh, like daily or weekly. The idea is that users can essentially make their own version of something like Spotify's flagship playlist, Discover Weekly, but one that's focused on a type of music genre or time period they'd like to track, or their own version of something like Spotify's genre-focused Daily Mixes. End quote.
Data Centers in Space. I told you how that's the hot new buzzword. Here's another example. The Journal says that Blue Origin has worked for over a year on tech for orbital AI data centers and that SpaceX plans to use upgraded Starlink satellites for AI computing payloads. Deploying satellites that provide significant AI computing capability will present difficult engineering hurdles and pose tough questions about the price of deploying swarms of the devices into orbit. Advocates acknowledge the challenges of making these systems work, including doing so in a manner that would match the performance of cavernous data centers stuffed with AI chips on the ground. Skeptics believe the technical risks are being underestimated and say space-based data centers won't be competitive on cost, especially if power and other constraints ease on the ground. Nonetheless, the idea has seized the imaginations of many leaders working on AI and space technologies. Deploying satellites as data centers, the thinking goes, would allow the AI industry to avoid earthly headaches such as securing the immense amounts of power needed to train AI models. Proponents imagine potentially filling orbits with satellites laden with chips that handle the computations underpinning AI applications used by consumers and companies. Zipping through space, the satellites would tap the immense power of the sun to operate and beam data back to Earth. Taking resource-intensive infrastructure off Earth has been an idea for years, but it has required launch and satellite costs to come down. We are nearing that point, said Will Marshall, chief executive of satellite operator Planet Labs. In early 2027, Google and Planet Labs aim to deploy at least two test satellites into orbit carrying the tech giant's AI chips, called Tensor Processing Units. Google has described the project as one of its moonshots.
Given the obstacles of deploying a network of satellite data centers at scale, one challenge is the number of satellites that may be needed. Travis Beals, a Google executive working on the orbital data center effort, said it would take 10,000 satellites to recreate the compute capacity of a gigawatt data center, assuming 100 kilowatt satellites. The test mission in 2027 is about demonstrating the key elements of operating satellites as AI computing clusters, according to Beals. Then we have a long hard road in terms of all the optimization, all the various new technologies we need to scale up, and then do so in a cost-effective way, he said in an interview. A throng of companies and executives are trying to figure out the viability of orbital data centers in addition to SpaceX, Blue Origin and Google. In October, Jeff Bezos said during an event in Italy that shifting data centers to orbit made sense given the solar power available. It will take time for those to beat the cost of terrestrial AI infrastructure, but he predicted that would happen in 20 years or sooner. Sam Altman, chief executive of OpenAI, has investigated whether his company could take over a rocket operator, using the vehicles to deploy AI computing in space, the Wall Street Journal reported. Eric Schmidt, the former Google chief executive who took over Relativity Space, a company working on its own rockets, has talked about orbital data centers. IBM's Red Hat software business and Houston-based Axiom Space had a data computing prototype launched in August. Aetherflux, StarCloud and other venture startups are laying their own plans to compete against larger players. Operating satellites as data centers will pose a host of technical issues, including managing temperatures for AI chips in orbit, protecting them from radiation, and transferring data back to the planet without long lag times.
There's a bunch of engineering challenges, but I think those engineering challenges are all solvable, said Johnny Dyer, chief executive of Muon Space, a satellite company that was involved in a research paper from Google about orbital data centers. It ultimately comes back. The prospect of having potentially thousands of satellites as data centers to launch could drive business across the aerospace supply chain, including rocket companies. Developing rockets is expensive and difficult, but frequent launches would allow operators to offset costs and boost margins, industry executives say. End quote. You know, I think I've made this point before, but this is one of the solutions to the Fermi paradox: that once a civilization gets computing advanced enough, it needs to find a way to cool it at scale. And so the theory is that aliens would go off to the cooler, darker sections of space to house all their computers there, and then maybe their civilization goes with it, because, you know, the metaverse and stuff like that. So that's why we don't see aliens, because they're all in dark, cold, unpopulated areas of the universe.
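Travis Beals's satellite-count estimate quoted above is straightforward to verify: a gigawatt of capacity divided by 100 kilowatts per satellite. A quick sketch of the arithmetic:

```python
# Sanity check of the orbital compute estimate quoted above:
# how many 100 kW satellites match a 1 GW terrestrial data center?

target_power_watts = 1e9       # 1 gigawatt data center
per_satellite_watts = 100e3    # 100 kilowatts per satellite

satellites_needed = target_power_watts / per_satellite_watts
print(f"Satellites needed: {satellites_needed:,.0f}")  # prints 10,000
```

Which is exactly the 10,000-satellite figure Beals cites, and why skeptics focus on launch cost per kilowatt rather than the chips themselves.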
Finally, today, have you heard about the Model Context Protocol? If you're a developer, you might be about to. Over the last year and a half, the biggest AI companies have quietly rallied around a shared way for AI agents to plug into the wider internet: MCP, or the Model Context Protocol. Originally hacked together in 2024 by two Anthropic engineers to help Claude talk to the tools people actually use at work, MCP has since been embraced by OpenAI, Google, Microsoft, Cursor, and others, with hints that Apple is circling it for a new Siri. MCP defines which apps, data sources, and workflows an AI model can access, then lets systems talk to each other directly, like when Claude sends a message in Slack and gets a confirmation back. It's similar in spirit to APIs in the Web 2.0 era, but designed for agents instead of human users, and its creators like to compare it to a USB-C for AI tools. Now Anthropic is donating MCP to the Linux Foundation and, together with OpenAI, Google, Microsoft, AWS, Block, Bloomberg, and Cloudflare, launching the Agentic AI Foundation to push open standards for agents. With multiple companies already contributing, the hope is that a neutral home and broader input will make agents faster, safer, and more reliable, and that ordinary users will never have to know MCP exists, only that their AI tools suddenly work better. The Verge article I got all that from is much longer and much more detailed than what I just summarized, so if this sounds like something worth getting ahead of, click through in the show notes for a deeper dive.
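For the developers listening, here's roughly what that Claude-to-Slack handshake looks like on the wire. MCP messages are JSON-RPC 2.0, and `tools/call` is the method name the MCP spec uses for invoking a tool; the tool name `slack_post_message` and its arguments below are hypothetical examples, not part of the spec:

```python
import json

# Minimal sketch of an MCP client request, assuming a server that exposes a
# Slack-posting tool. MCP messages are JSON-RPC 2.0; "tools/call" is the
# spec's method for invoking a tool. The tool name "slack_post_message" and
# its argument shape are illustrative assumptions.

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

msg = make_tool_call(1, "slack_post_message",
                     {"channel": "#general", "text": "Deploy finished."})
print(msg)

# The server answers with a JSON-RPC result (matched by id) carrying the
# tool's output -- the "confirmation back" described above:
reply = {"jsonrpc": "2.0", "id": 1,
         "result": {"content": [{"type": "text", "text": "ok"}]}}
```

The design choice worth noticing is that the model never calls Slack directly: it emits a structured request, and the MCP server mediates, which is what makes a shared standard (and a neutral foundation to steward it) matter.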
Nothing more for you today. Talk to you tomorrow.
Episode Title: Disney Invests In OpenAI
Date: December 11, 2025
Host: Brian McCullough
This episode focuses on Disney’s landmark $1 billion investment in OpenAI and the licensing of over 200 Disney characters for OpenAI’s Sora platform. The host, Brian McCullough, unpacks the implications for storytelling, media, and synthetic content creation. The episode also covers market implications for AI infrastructure players like Oracle, the emerging trend of data centers in space, Spotify’s latest AI-enabled playlist features, and the rise of open standards for AI agents with the Model Context Protocol.
[01:55 – 05:31]
"Bringing together Disney's iconic stories and characters with OpenAI's groundbreaking technology puts imagination and creativity directly into the hands of Disney fans in ways we've never seen before."
— Bob Iger, Disney CEO ([03:56])
[05:31 – 09:54]
"With the new understanding, there is very low likelihood OpenAI will live up to its obligations. That didn't happen. Oracle now has burned a little over $13 billion in cash over the past four quarters and has about $88 billion in debt net of cash, a sharp contrast to the net cash positions of its big tech rivals."
— Gil Lauria, DA Davidson; Moody’s analysis ([07:36])
[11:43 – 14:11]
"Music from my top artists from the last five years... including deep cuts I haven’t heard yet."
— Spotify Prompt Example ([13:29])
[14:11 – 18:48]
"There's a bunch of engineering challenges, but I think those engineering challenges are all solvable."
— Johnny Dyer, CEO of Muonspace ([17:48])
[18:48 – 20:32]
"It's similar in spirit to APIs in the Web 2.0 era, but designed for agents instead of human users, and its creators like to compare it to a USB-C for AI tools."
— Brian McCullough ([18:48])
Brian demonstrates an upbeat, informed, and slightly conversational tone, quoting directly from key industry sources, and occasionally injecting speculative and humorous reflections to tie together the fast-evolving AI and tech landscape.
This episode is a brisk, insightful roundup of pivotal moves in AI content, enterprise infrastructure, and standards—anchored by Disney’s audacious leap into the world of generative AI and OpenAI investment.