Transcript
A (0:00)
Today on the AI Daily Brief, as their for-profit conversion closes, OpenAI looks forward to an eventual massive IPO. Before that in the headlines, a big settlement between Universal Music and Udio. But is it good for AI music in general? The AI Daily Brief is a daily podcast and video about the most important news and discussions in AI. Hello friends, quick announcements before we dive in. First of all, thank you to today's sponsors: Blitzy, Robots and Pencils, Airia, and AssemblyAI. To get an ad-free version of the show, go to patreon.com/aidailybrief or subscribe on Apple Podcasts. And then for all the information about anything surrounding the show, be it sponsorship opportunities, trying to get me to speak somewhere, job opportunities that we have, including one that I am very much looking for around growth, go to aidailybrief.ai, and while you're there, please take just a minute to do our AI Benchmarking survey. The idea is to figure out which use cases drive the most value and what specific value they drive, and we're going to turn that into massive shared public information that should help everyone be better able to figure out ROI norms and expectations heading into the year. You can add just one use case, you can add 10. And one thing I'm thinking about as we've now seen hundreds of these use cases come in is potentially including some case studies from this group on the show. So if that's something you're interested in, fill out the survey, then shoot me a note. I'd love to potentially talk about it. With that though, let's dive into this interesting markety episode of the AI Daily Brief. Welcome back to the AI Daily Brief Headlines edition, all the daily AI news you need in around five minutes. We kick off today with a bit of a follow-up from our show recently about AI music.
One of the things we talked about on that show was that it seemed like the record companies were getting closer to settlements, which had the potential to clear out one of the major overhangs for that sector. Late on Wednesday night, news broke that Universal Music had settled their lawsuit with Udio, setting up the first AI music licensing system. Under the agreement, Udio will set up a new subscription service next year that allows users to create music based on licensed songs. This will include remixes and customized tracks, but we don't have any further details on how exactly the feature will work. Artists will be paid for the use of their music as training data and will also receive a royalty when their songs are used on the platform to create outputs. Full financial details of the settlement were not disclosed. Michael Nash, Universal's chief digital officer, said, "This product is all about allowing fans to utilize AI to have this deep level of engagement." Udio CEO Andrew Sanchez said the new platform will include a social element, allowing users to share their creations. However, he suggested the creations will be locked into the Udio platform and not able to be shared more broadly, commenting, "We're talking about songs that are famous potentially, or artists that have recognizable careers and voices. We want to be able to control that environment really well." Udio's existing platform will continue to be available until the new product launch, but downloads will be disabled immediately, keeping all content locked to the platform. Notably, this settlement is only between Universal and Udio. Sony Music and Warner Music are also parties to the lawsuit, while Suno is another defendant. Court filings were being exchanged as of this week, so the litigation is still ongoing. In an interview with Billboard after the settlement was announced, Udio CEO Andrew Sanchez discussed the concept of the new platform.
"We're breaking new ground on a market that combines new forms of AI and artist interaction, creation and consumption. We're making a new market here, which we think is an enormous one. I think that we're already incredibly differentiated just today, just by saying all of this." While it's never a good idea to assume too much based on the initial reaction from users when there's a big change, the response has not been positive. One user writes, "The new era in music begins by disabling downloads from paying customers with no warning. I pay for something and you deny me access to it. How about a refund? It'll alleviate your heavy heart. Lol." Another writes, "I just found out that I can't download my own music that I create in Udio. It's unfortunate because I often use Udio as a reference for working on instrumental compositions for my own videos, and my samples that I've been working on with Udio are there too. It's really sad, but it looks like this is the end for Udio. Well, I'm going back to FL Studio and will continue making music by hand." I think the thing to note here is a couple things. First, anyone who has watched the record industry's engagement with technology over the last 25 years had to 100% assume that this is the type of outcome that they would be driving for: basically, open up a new market using AI for their artists to expand fan engagement with the core content. I don't think that that's an uninteresting product. It could be that there are a lot of people who have a ton of fun with that; it could be a very expansionary market. It could be exactly the type of thing, in other words, that demonstrates how AI is additive, not subtractive. But at the same time, make no mistake that it is not the same as the breakout creative consumer experience that we were talking about the other day with Suno. Being able to remix Taylor Swift is not the same as being able to generate a song with your kids' names in it. It is just in every way a fundamentally different thing.
Now, I think there might be room for both. The question is just whether the last player standing, Suno, can find a way to thread the legal needle here. If I had to make a prediction, it would be that the labels effectively split these two companies into two buckets, one to do the artist remix thing, one to do this more broad generation thing. And the announcement that we'll get is with all of them taking a fat chunk of Suno and having it be just another revenue line, as they did with companies like Spotify. Next up, Character AI will ban users under 18 in a sweeping change to address concerns over child safety. The change will go into effect in late November, but in the interim, Character AI will identify which users are underage and put a two-hour time limit on their sessions that will decrease over the next month. Instead of open-ended conversations, teens will be able to create videos, stories and streams with characters within a new purpose-built experience. For underage users, Character AI will use age detection technology to enforce the policy rather than requiring users to upload their ID. Character AI CEO Karandeep Anand said, "We're making a very bold step to say for teen users, chatbots are not the way for entertainment, but there are much better ways to serve them." Now, media coverage identified that Character AI basically had no choice but to change their operations in the face of multiple lawsuits and scrutiny in Congress. Earlier this week, Senator Josh Hawley introduced a bill that would ban AI companies from providing AI companions to minors and force chatbots to remind the user that they're not human periodically throughout chat sessions. Character AI said that they were making these changes as a direct result of feedback from regulators, safety experts and parents. Now, by all accounts, Character AI has a substantial teen user base, so this is a massive change for the platform.
Writing to those users, the company said, "We are deeply sorry that we have to eliminate a key feature of our platform. Many of you have told us over time how important the characters and stories you've created are to you." They continued, "We do not take this step of removing open-ended character chat lightly, but we do think that it's the right thing to do, given the questions that have been raised about how teens do and should interact with this new technology." A little bit of fundraising news: legal AI startup Harvey has raised a new round at an $8 billion valuation. Sources say that the Andreessen Horowitz-led deal brought in $150 million in fresh capital. Now, Harvey's valuation keeps ramping up this year with no signs of slowing. They raised at $3 billion in February, followed quickly by a $5 billion round in June. Revenue is also escalating, with Harvey reaching the milestone of $100 million in ARR in August. Now, one interesting note: if you thought the ChatGPT wrapper critique days were past, think again. Something about Harvey continues to inspire conversation around how it's, quote unquote, just a ChatGPT wrapper. Fast Company, for example, just this week published a big piece with the subtitle asking, is it really any better than ChatGPT? Now, I think, broadly speaking, that we're sort of heading into a product era of AI, which is a concept that I'm going to explore more. So I'm keeping an eye on this particular conversation. Finally, let's round out with some new feature launches. Cursor has announced Cursor 2.0 and debuted Composer, which is their first proprietary coding model. The model is optimized for speed, with Cursor claiming it completes most interactions in less than 30 seconds while maintaining strong reasoning capabilities across large code bases. They claim the model is four times faster than current frontier models. It's also trained for agentic workflows, supporting the ability to code autonomously through planning, programming and testing phases.
In addition to the new model, Cursor has overhauled the interface to allow for multiple agents to be run in parallel. The interface is also designed to make it easier to run the same prompt against multiple different models and take the best result. A built-in browser has been added for simple unit testing, and finally, the platform adds voice mode to enable speech-to-code functionality. Interestingly, Cognition and Windsurf have also announced their new coding model, SWE-1.5. Like Cursor's model, speed is once again the focus. Cognition partnered with Cerebras to serve the model at 950 tokens per second, which is six times the pace of Claude Haiku 4.5 and 13 times faster than Sonnet 4.5. On the SWE-Bench Pro benchmark, SWE-1.5 slots in between those two models, making it close to the state of the art in performance. Cognition's Andrew Gao writes, "Don't write this off as fast non-frontier-lab model equals dumb and not worth my time. It's smarter than the SOTA models were this summer, and also way faster." Now, if it's interesting to you that the agent labs are becoming model labs, I talk a bunch more about this with swyx from Latent Space, who also, by the way, recently joined Cognition, in a show that'll come out next week. But for that we're going to have to wait a couple days, because today, that's going to do it for the headlines. Next up, the main episode. This episode is brought to you by Blitzy, the enterprise autonomous software development platform with infinite code context. Blitzy uses thousands of specialized AI agents that think for hours to understand enterprise-scale code bases with millions of lines of code. Enterprise engineering leaders start every development sprint with the Blitzy platform, bringing in their development requirements. The Blitzy platform provides a plan, then generates and pre-compiles code for each task.
Blitzy delivers 80% plus of the development work autonomously while providing a guide for the final 20% of human development work required to complete the sprint. Public companies are achieving a 5x engineering velocity increase when incorporating Blitzy as their pre-IDE development tool, pairing it with their coding copilot of choice. To bring an AI-native SDLC into your org, visit blitzy.com and press Get a Demo to learn how Blitzy transforms your SDLC from AI-assisted to AI-native. AI isn't a one-off project. It's a partnership that has to evolve as the technology does. Robots and Pencils works side by side with clients to bring practical AI into every phase: automation, personalization, decision support and optimization. They prove what works through applied experimentation and build systems that amplify human potential. As an AWS Certified Partner with global delivery centers, Robots and Pencils combines reach with high-touch service. Where others hand off, they stay engaged, because partnership isn't a project plan, it's a commitment. As AI advances, so will their solutions. That's long-term value. Progress starts with the right partner. Start with Robots and Pencils at robotsandpencils.com/aidailybrief. There's a reason most enterprise AI initiatives never make it to production: you can't find a platform that's both powerful and secure enough. The result? AI budgets burn with zero business impact. But not anymore. Airia is the enterprise AI platform that delivers speed without compromise. Unlike other platforms that force you to choose between fast deployment or secure operations, Airia brings speed and security together: launch AI quickly without cutting corners on compliance, scale rapidly without sacrificing governance, move at the speed of business without moving past your security requirements.
Fortune 500 companies across finance, healthcare, retail, legal and more choose Airia because they deliver what seemed impossible: enterprise AI that's fast enough to beat the competition and secure enough to protect your most sensitive data. Ready for AI at full speed with zero compromise? Visit airia.com to see the platform in action. That's airia.com. Simplify enterprise AI. If you're building anything with voice AI, you need to know about AssemblyAI. They've built the best speech-to-text and speech understanding models in the industry, the quiet infrastructure behind products like Granola, Dovetail, Ashby and Cluely. Now, as I've said before, voice is one of the most important modalities of AI. It's the most natural human interface, and I think it's a key part of where the next wave of innovation is going to happen. AssemblyAI's models lead the field in accuracy and quality, so you can actually trust the data your product is built on. And their speech understanding models help you go beyond transcription, uncovering insights, identifying speakers and surfacing key moments automatically. It's developer-first: no contracts, pay only for what you use, and it scales effortlessly. Go to assemblyai.com/brief, grab $50 in free credits and start building your voice AI product today. Welcome back to the AI Daily Brief. Today we are talking all about AI in markets. We have the Fed Chair talking about it, we're in the middle of earnings season, and all of that is of course contributing to more AI bubble talk. And late last night we got the news that, hot on the heels of their for-profit conversion, OpenAI is starting to prepare for a trillion-dollar IPO, which would make it one of the largest IPOs of all time. The news came on Wednesday evening and was broken by Reuters. Citing three sources familiar with the matter, Reuters reported that OpenAI was considering filing the relevant paperwork either in the second half of next year or, more likely at this point, in early 2027.
The sources said that preliminary discussions had the company raising at least $60 billion and likely much more. Those sources caution that these are very early stage talks, so the numbers and timing could change based on the company's growth in the market. An official spokesperson downplayed the news but didn't deny it: "An IPO is not our focus, so we could not possibly have set a date. We are building a durable business and advancing our mission so everyone benefits from AGI." In my strong estimation, based on the reading of these reports, I think it's quite clear that OpenAI knows that they need to go public to access that capital pool, not just for themselves, but for this increasingly dense network of deals around them. Indeed, as we think about takeaways from this news, one of them is that this was completely inevitable given how fast the company has been exhausting potential sources of capital. At the end of last year, their raises were starting to tap out the venture ecosystem, leading Sam Altman to look farther afield to sovereign wealth funds in the Middle East. This year their major deal was with SoftBank, and that $30 billion raise seemed to stretch Masayoshi Son to his absolute limit. OpenAI expects to burn $8.5 billion this year, and that doesn't seem to include a lot of their infrastructure capex. The company will likely need to burn through all of the $40 billion they raised this year to fund operations and their ambitious infrastructure build-out. They'll almost certainly be able to raise more money in the private markets to get them through next year, but if they need to start raising money in $50 or $60 billion chunks, then the public markets are the only way to access that kind of capital. Sam Altman even acknowledged this during the live stream about the for-profit conversion on Tuesday, stating, "I think it's fair to say it is the most likely path for us given the capital needs that we'll have."
This is not going to be a normal IPO by any stretch of the imagination. There are currently 11 companies in the entire world worth a trillion dollars or more. Berkshire Hathaway makes the cut, but JPMorgan Chase, Walmart and Tencent are all below the threshold. In other words, OpenAI will enter markets as one of the largest companies in history. For comparable IPOs, there's really only Saudi Aramco, which debuted at $2 trillion in 2019. They only raised $25 billion, and that remains the largest-ever public fundraising, while OpenAI looks set to attempt a raise almost three times that size. Still, it's pretty difficult to imagine that they'll struggle raising that much money. Holding aside the mass of retail investors, the stock is going to be on the buy list for every major institution, pension fund and ETF in the world. Now, outside of the obvious importance for OpenAI, I think this is incredibly important for the larger AI industry, and for recalibrating the relationship between retail investors and public markets. While it's not the only reason, or even the top reason, that people have fallen out of love with tech, the fact that companies have decided to stay private longer and longer, accessing Series G, H, K, L, M, N, O, P-type rounds that were never available before, has structurally blocked retail investors from participating in the incredible wealth creation of technology over the last decade and a half. That's not to say that if people could have had access to these companies, they wouldn't still have issues with the way social media has divided and fractured us, or any of the other numerous issues that have come along with the modern technology paradigm. But it certainly isn't helping. We are in a moment where sentiment is turning against the AI labs and the AI industry more broadly.
And as every story comes out about OpenAI going from a $29 billion valuation at the beginning of 2024 to a $500 billion valuation now, even retail investors who are completely sold on the theme were blocked out of participation. The biggest AI IPO has been CoreWeave, and that's certainly not a pure play on frontier AI models. So sure, if you've been in Nvidia, Oracle or Broadcom, good for you. But that's not broad-based participation in this key theme. Putting OpenAI, and the other companies that should follow its lead, into pension funds and retirement accounts gives people in society more broadly a lot more buy-in as the technology develops and wealth gets created. Maybe, to put it more simply, and at the risk of getting a little more political than the show normally goes, you better believe that with where public sentiment is and the disillusionment with capitalism that is starting to run rampant, there is going to be some sort of wealth redistribution that happens over the next decade. I don't know about you, but for me, I'd love to see some meaningful part of that be in the form of people having access to the equity upside of these transformational companies. So my vote: OpenAI, Sam A., if you're listening, get public and do it fast. You're already dealing with the entire world watching your every move and yelling at you anyway. You might as well get the funding benefits that could come alongside that. Let's move on to some other parts of the market conversation, though, from yesterday. This week saw the Federal Reserve's FOMC meeting and subsequent press conference, and during that press conference, Fed Chair Jerome Powell talked a little bit more than he has so far about the state of AI. Contrasting the moment with the dot-com bubble, Powell said, "I won't go into particular names, but they have actual earnings. These companies actually have business models and profits and that kind of thing. So it's a really different thing now."
The press seemed to be looking to get Powell to signal that the AI buildout had gotten way out of control, basically questioning whether the Fed should raise rates to cut off excessive AI spending. Powell tamped that down pretty quickly, saying, "I don't think interest rates are an important part of the AI or data center story. It's based on longer-run assessments that this is an area where there's going to be a lot of investment, and that's going to drive higher productivity." He also noted that it isn't his job to weigh in on the stock market, commenting, "We don't look at any one asset price and say, hey, that's wrong. It's not our job to do that. We look at the overall financial system and ask if it's stable and whether it can withstand shocks." To that point, he noted that banks are well capitalized, there's not much leverage in the financial system, and that it's, in his words, not an overly troubling picture. In other words, Meta and Microsoft can drop hard on disappointing earnings, as we'll see, and it turns out the financial system won't come to an end. Now, before we get into Meta and Microsoft, we have to talk about the world's first $5 trillion company, Nvidia. On Tuesday, Nvidia had a ton of new announcements at their developer conference. Jensen Huang unveiled a new networking solution to scale up quantum supercomputers, a deal to build seven new supercomputers for the US Department of Energy, and huge new partnerships with Samsung, Uber, Hyundai and Nokia. Bloomberg suggested that the long list of new partners was intended to dispel fears that the AI bubble is bursting. Frankly, though, even more so than new business, one of the most striking reveals at the event was just how hot Nvidia's core business remains. Huang boasted that his company has half a trillion dollars in backlogged orders for their latest generation of AI chips. That backlog will run all the way through 2026.
In other words, Nvidia doesn't need to do anything other than deliver on their core product to have one of the most successful years in global corporate history next year. Nvidia's previous best was $130 billion in revenue for their fiscal year 2025, which ended in January. Wall Street's lofty forecasts were only expecting $380 billion in revenue through to the end of next year, so there's potential for a 30% outperformance. Huang told the assembled crowd, "We have now reached our virtuous cycle, our inflection point." He said that Nvidia expects to ship 20 million Blackwell chips, five times as many units as the entire run of the Hopper architecture that began in 2022. Battling the narrative in a Bloomberg interview following his keynote, Huang said, "I don't believe we're in an AI bubble. All of these different AI models we're using, we're using plenty of services and paying happily to do it." And for the bubble watchers out there, this has to be an important part of the conversation: despite a hundred-billion-dollar round-trip deal with OpenAI, Nvidia still has multiples of that in customers paying in cash. All in all, the stock was up 5% on Tuesday and is now up 9% on the week, making the company the first ever to reach $5 trillion in market cap, which makes them larger than the GDP of every country in the world except the US and China. Wedbush analyst Dan Ives remarked, "Nvidia's chips remain the new oil or gold in this world for the tech ecosystem, and there's only one chip in the world fueling this AI revolution, and it's Nvidia." Now, what about the hyperscaler earnings? We got earnings reports from Alphabet (Google), Meta and Microsoft, and Bloomberg TV's Caroline Hyde summed it up like this: Microsoft, revenue beat, cloud growth beat, capex hiked, stock lower. Meta, revenue beat, net income hit by tax charge, capex to be hiked, stock lower. Google, revenue beat, cloud growth beat, capex hiked, stock higher. Let's talk about Google first.
As you just heard, the company reported a massive beat. They recorded their first-ever $100 billion quarter in revenue, up 16% year over year. Google Cloud revenue was up 34% compared to last year, reaching $15 billion. Ad revenue rose 12.6% to reach $74 billion. This extremely strong result allowed Google to boost their capex forecast with confidence, raising this year's spend to between $91 and $93 billion. Said CEO Sundar Pichai, "We're investing to meet customer demand and capitalize on the growing opportunities across the company." Separately, we got another indication of the massive growth in use of Google Gemini. Whereas between March and July of this year monthly active users grew from 350 million to 450 million, between July and October monthly active users jumped from 450 million to 650 million, a massive shift up. They also saw daily requests increase 3x quarter over quarter. Part of what market analysts are starting to note is that some of the early narratives just aren't coming true. Matt Stuckey, the chief portfolio manager at Northwestern Mutual, said continued strength in search is helping to dispel the negative sentiment surrounding AI's potential impact on Google's biggest businesses. Overall, Google stock was up 6.5% in extended trading. Meta and Microsoft had a bit of a tougher day. Meta reported a $15.9 billion income tax bill following law changes implemented in July, which wiped out almost all of their $18.6 billion in net profit. Revenue was a slight beat compared to forecasts, coming in at $51 billion. Still, for investors, the big headline was that Meta are once again accelerating their AI infrastructure spending. For this year, they raised their guidance from between $66 and $72 billion to between $70 and $72 billion. Next year didn't have a number attached, but they guided a significant increase. Mark Zuckerberg told investors, there's a range of timelines for when people think that we're going to get superintelligence.
I think that it's the right strategy to aggressively front-load building capacity, so that way we're prepared for the most optimistic cases. Still, aside from unclear timelines, another problem for Meta is that their ability to demonstrate a return on all this AI spending is a little bit murkier than, for example, the cloud revenue that someone like Google can report. Meta continues to claim that growth in advertising revenue is due to AI, but it's a little bit blurrier, and in general that growth is softening as well. The story from Microsoft's earnings was that despite record spending on data centers, the company is still capacity constrained. The Azure division performed well, growing at 39% year over year, which slightly beat Wall Street forecasts. But Microsoft spent $34.9 billion on capex last quarter, up around $10 billion from the previous quarter, and still can't seem to catch up with demand. Microsoft CFO Amy Hood, who is known for her capital discipline, didn't seem to have a plan to address this issue. She didn't offer a specific forecast for capex next quarter or next year, simply stating the company's spend would, quote, increase sequentially, and we now expect the fiscal year 2026 growth rate to be higher than fiscal year 2025. Both Meta and Microsoft fell in after-hours trading, with Meta diving by 8% and Microsoft losing 4%. The name of the game for hyperscalers is now marrying capex and ROI. One of the ways to interpret what happened yesterday is that Meta was punished for failing to show ROI and Microsoft was punished for failing to do enough capex. Google, meanwhile, ticked both boxes and the stock soared. David Lefkowitz, the head of US equities at UBS Global Wealth Management, said the market is willing, at a minimum, to take a longer-term perspective. Right now we're in this period where the market is rewarding capex. Traditionally the market has been skeptical of that.
On the flip side, as we're seeing, it does really need to be paired with real performance. Today is not a bubble show day, but whatever you think about whether we are or are not in a bubble, if you are someone who is concerned that it might turn into one, I think the Meta results in particular should be encouraging. While investors are clearly incredibly excited about the future, they do want to see that even now, early on, there is some return on investment, or at least that there is a clear path to ROI. Now, we'll see how that holds, but for now, a very interesting demonstration of where things are at the moment. That's going to do it for today's AI Daily Brief. Appreciate you listening or watching, as always. Until next time. Peace.
