Transcript
A (0:04)
Welcome to the Techbrew Ride Home for Thursday, November 6th, 2025. I'm Brian McCullough. Today: Apple is probably going to let Gemini power the new Siri, at least for a while. Does OpenAI want the government to backstop its AI buildout? And two new AI products: how about an AI smart ring to remember your shower thoughts, and what if Foursquare, but for the AI era? Here's what you missed today in the world of tech. If you, yes, you, are looking for enterprise-grade identity automation minus the enterprise-grade baggage, aka having your users log on 500 times, YeshID delivers advanced IAM automation without moving teams onto a legacy identity provider. Whether you use Google Workspace, Microsoft 365 or Okta, YeshID integrates directly. No rebuilds or rip-and-replaces required. YeshID helps IT and security teams reduce risk, not just tickets. And so IT teams everywhere are breathing a collective sigh of relief. Every access change, review and approval is tracked and exportable, helping security teams effortlessly demonstrate compliance with SOC 2, ISO or HIPAA. IT and security teams can spot risk before it becomes a finding. Learn more at yeshid.com/techbrew. That's Y-E-S-H-I-D dot com slash techbrew. Mark Gurman is reporting that Apple is finalizing a deal to pay Google around a billion dollars a year for a 1.2 trillion parameter Gemini model to help power the new Siri, which is on track for next spring. Quoting Bloomberg: The iPhone maker is banking on Google's help to rebuild Siri's underlying technology, setting the stage for a new slate of features next year. The Google model's 1.2 trillion parameters, a measure of the AI software's complexity, would dwarf the level of Apple's current models. Apple had previously mulled using other third-party models to handle the task, but after testing Gemini, OpenAI's ChatGPT and Anthropic's Claude, Apple zeroed in on Google earlier this year. The hope is to use the technology as an interim solution until Apple's own models are powerful enough. 
The new Siri is on track for next spring, Bloomberg has reported. Given the launch is still months away, the plans and partnership could still evolve. Apple and Google spokespeople declined to comment. The custom Gemini system represents a major advance from the 150 billion parameter model used today for the cloud-based version of Apple Intelligence. The move would vastly expand the system's power and its ability to process complex data and understand context. The effort to fix Siri with a third-party model, known internally as Glenwood, has been led by Vision Pro headset creator Mike Rockwell and software engineering chief Craig Federighi. The new voice assistant itself, planned for iOS 26.4, is codenamed Linwood. Under the arrangement, Google's Gemini model will handle Siri's summarizer and planner functions, the components that help the voice assistant synthesize information and decide how to execute complex tasks. Some Siri features will continue to use Apple's in-house models. The model will run on Apple's own Private Cloud Compute servers, ensuring that user data remains walled off from Google's infrastructure. Apple has already allocated AI server hardware to help power the model. While the partnership is substantial, it's unlikely to be promoted publicly. Apple will treat Google as a behind-the-scenes technology supplier instead. That would make the pact different from the company's Safari browser deal, which made Google the default search engine. The agreement is also separate from earlier talks about integrating Gemini directly into Siri as a chatbot. Those discussions came close to fruition in both 2024 and earlier this year, but ultimately didn't materialize into a feature. The partnership also doesn't weave Google AI search into Apple's operating system. Apple still doesn't want to use Gemini as a long-term solution, despite the company bleeding AI talent, including the head of its models. 
Apple's management intends to keep developing new AI technology and hopes to eventually replace Gemini with an in-house solution, the people said. To that end, the company's models team is working on a 1 trillion parameter cloud-based model that it hopes to have ready for consumer applications as early as next year. Apple executives believe it can reach a similar quality level as the custom Gemini offering, but Google continues to enhance Gemini, and so catching up won't be easy. OpenAI Chief Financial Officer Sarah Friar says an IPO is not on the cards right now, but that OpenAI hopes the US government will, quote, backstop financing of its data center deals. Quoting the Journal: Speaking at the Wall Street Journal's Tech Live conference, Friar threw a dose of cold water on what could become one of the largest public listings in history. She said the AI giant's conversion to a new structure doesn't portend an imminent public offering, as the company prioritizes growth and R&D over profitability. IPO is not on the cards right now, Friar said. We are continuing to get the company into a state of constantly stepping up into the scale we are at, so I don't want to get wrapped around an IPO axle. The company has discussed a public listing as soon as 2027, the Wall Street Journal reported. As OpenAI ramps up its spending on data center capacity to unheard-of levels, the company is hoping the federal government will support its efforts by helping to guarantee the financing for chips behind its deals. Friar said the depreciation rates of AI chips remain uncertain, making it more expensive for companies to raise the debt needed to buy them. This is where we're looking for an ecosystem of banks, private equity, maybe even governmental, the ways governments can come to bear, she said. Any such guarantee can really drop the cost of the financing, but also increase the loan-to-value, so the amount of debt you can take on top of an equity portion. 
Friar said OpenAI could reach profitability on very healthy gross margins in its enterprise and consumer businesses quickly if it weren't seeking to invest so aggressively. I'm not overly focused on a break-even moment today, she said. I know if I had to get to breakeven, I have a healthy enough margin structure that I could do that by pulling back on investment. Friar said the company is trying to find new ways to grow sales beyond its ChatGPT subscription business. She emphasized the growth in OpenAI's enterprise sales, saying that they now account for roughly 40% of revenue, up from 30% at the beginning of the year. Many corporate customers are now moving from pilot to full production, she added, citing sectors such as financial services and healthcare. Friar said the company would end the year with about 2 gigawatts of computing power to help train and power its AI models, up from 200 megawatts two years ago. It rents much of that capacity from Microsoft, Oracle and CoreWeave. OpenAI is also building its own data centers and recently signed a giant deal with Nvidia to lease the chips needed for the effort. As part of the deal, Nvidia is discussing guaranteeing some of the loans OpenAI plans to take out to build new data centers, the Journal reported. We are talking about country-sized deployments, she said, about how much computing power OpenAI intends to bring online. End quote. Also possibly related to that same conference, Microsoft AI CEO Mustafa Suleyman has laid out that company's plans to develop AI self-sufficiency beyond OpenAI, like releasing its own voice, image and text models. Quoting the Journal: While he praised OpenAI and the work the companies have done together, he offered a criticism of treating AI systems as though they have human-like feelings or rights. AI chatbots shouldn't trick people into thinking they are having conversations with sentient beings, he said. 
Suleyman noted Microsoft's focus on powerful software tools that can help people accomplish their work, improve medical diagnoses and play a role in scientific breakthroughs that will offer the world plentiful clean, renewable energy. AI is going to become more human-like, but it won't have the property of experiencing suffering or pain itself, and therefore we shouldn't over-empathize with it, Suleyman said in an interview. We want to create types of systems that are aligned to human values. By default, that means they are not designed to exceed and escape human control. Microsoft's Copilot chatbot relies heavily on OpenAI, but Microsoft is building, testing and releasing its own voice, image and text models. Microsoft has access to OpenAI's models until 2032, a schedule Suleyman said gives his team time to make its models into leading technology. Healthcare is a priority for Microsoft AI, and one of the first industries Suleyman expects to be touched by superintelligence. The company recently struck a partnership with Harvard Health to provide trustworthy responses in Copilot, and rolled out other features to find doctors based on location, language and other preferences. It also developed an AI tool that it said diagnosed disease in a test at a rate four times more accurate than a group of doctors, at a fraction of the cost. It is these sorts of instruments that prove AI can be powerful in non-conversational ways, Suleyman said. AI diagnostic tools are very close to being market-ready, he said. Microsoft's models will be built with containment in mind, including probing and testing the models to ensure they only communicate in a language that humans understand, and designing systems to avoid appearing as if they are conscious, he said. End quote. Is the daily commute making your muscles feel stiff? Well, there could be a natural way to help relieve that discomfort. 
Cornbread Hemp creates premium USDA organic full spectrum CBD gummies designed to help with stress, body aches and sleep. Their products are made exclusively from the hemp flower, the most potent part of the plant, for maximum purity and effectiveness. Every batch is third-party lab tested to ensure quality, safety and consistency. In short, Cornbread Hemp gummies are formulated to work with your body, not against it. Right now, Techbrew Ride Home listeners can save 30% on their first order. Just head to cornbreadhemp.com/brew and use code BREW at checkout. That's cornbreadhemp.com/brew, and use code BREW. If your startup needs a little extra support, and let's be honest, who doesn't need a little help now and then, then you're in for a pleasant surprise with Fidelity. Fidelity Private Shares helps early and growth stage companies stay investor ready with cap table, data room and scenario modeling all in one place. A messy or missing cap table might not just slow you down, it could cost you your next fundraising round. VCs are flooded with pitches, and if your equity is confusing or missing, they'll move on fast. Fidelity Private Shares gives founders the structure and simplicity to focus on what actually matters: building your company. If you stay investor ready, you don't have to get investor ready. Check out fidelityprivateshares.com/techbrew to learn more. That's fidelityprivateshares.com/techbrew. Google says Ironwood, its seventh-generation TPU, will launch in the coming weeks and is more than four times faster than its 6th-gen TPU. That's something. Quoting CNBC: The chip, built in-house, is designed to handle everything from the training of large language models to powering real-time chatbots and AI agents. In connecting up to 9,216 chips in a single pod, Google says the new Ironwood TPUs eliminate data bottlenecks for the most demanding models and give customers the ability to run and scale the largest, most data-intensive models in existence. 
Google is in the midst of an ultra-high-stakes race alongside rivals Microsoft, Amazon and Meta to build out the AI infrastructure of the future. While the majority of large language models and AI workloads have relied on Nvidia's graphics processing units, Google's TPUs fall in the category of custom silicon, which can offer advantages on price, performance and efficiency. TPUs have been in the works for a decade. Ironwood, according to Google, is more than four times faster than its predecessor, and major customers are already lining up. AI startup Anthropic plans to use up to 1 million of the new TPUs to run its Claude model, Google said. Alongside the new chip, Google is rolling out a suite of upgrades meant to make its cloud cheaper, faster and more flexible, as it vies with larger cloud players Amazon Web Services and Microsoft Azure. In its earnings report last week, Google reported third quarter cloud revenue of $15.15 billion, a 34% increase from the same period a year earlier. Azure revenue jumped 40%, while Amazon reported 20% growth for AWS. Google said it signed more billion-dollar cloud deals in the first nine months of 2025 than in the previous two years combined. To meet soaring demand, Google upped the high end of its forecast for capital spending this year to $93 billion from $85 billion. We are seeing substantial demand for our AI infrastructure products, including TPU-based and GPU-based solutions, CEO Sundar Pichai said on the earnings call. It is one of the key drivers of our growth over the past year, and I think, on an overall going-forward basis, we continue to see very strong demand and we are investing to meet that. End quote. Finally today, two new AI-focused consumer products for you. First, Sandbar has unveiled a $249-plus Stream Ring, an AI-powered smart ring for transcribing audio notes into text via an app, available to pre-order and shipping in the summer of 2026. 
Sandbar wants to capture the thoughts that usually vanish between subway stops and showers and the like. The Stream Ring is an AI-powered smart ring designed to turn fleeting inner monologues into organized notes and conversations. Founded by alumni of CTRL-labs and later Meta, the company has raised $13 million and is emerging after two years in stealth. Worn on the index finger, Stream Ring activates with a tap and hold on a capacitive edge; speak or whisper into it. The audio isn't stored. Instead, speech is transcribed to text in the Stream app. A quick tap can mute any AI response, and the hardware is slated to be waterproof at launch. The ring also doubles as a media controller: tap to play or pause, double tap to skip, swipe for volume, ensuring basic utility even if back-end services ever go dark. Health tracking isn't currently included. Pre-orders are open now. There's a silver version at $249 and a gold one at $299, with a sizing kit. Purchases include three months of Stream Pro for unlimited notes and chats; afterward, the plan is $10 per month, with a limited free tier available. Shipping is targeted for summer of 2026. Access to the app currently requires the ring, though that may change. Functionally, Stream apparently sits between a notes app and an AI chatbot. Users can ask for, say, recipe ingredients, auto-build shopping lists, get reminders in store with haptic confirmations, and snapshot ideas into organized notes. It draws on multiple large language models with a blend of on-device, phone and cloud processing. Responses can speak back in a near clone of the user's voice via ElevenLabs; alternate voices are available. Apparently early usage skews toward longer back-and-forths, at around 60% of usage, with the remainder split between note creation and single queries. Battery life is reported to be all day, with desktop access, sharing and time-based reminders planned. 
Sandbar frames the product as a low-friction self-extension that prioritizes user control and safety. And Foursquare founder Dennis Crowley's new startup Hopscotch Labs has released BeeBot, which combines AI, audio and location-based social features, on the App Store. Quoting Engadget: Yes, it's another location-based social app, but rather than the check-ins Crowley first popularized more than 15 years ago, BeeBot has a very 2025 take on the concept. Instead, the app is an AI-powered DJ that can deliver contextual audio updates to your ears as you move throughout your day. Crowley describes BeeBot as an app for AirPods, though it will work with any type of headphones, as well as smart glasses with audio capabilities like Meta's. Whenever you put your AirPods in, it turns on, Crowley explains in a post on Medium. Whenever you take your AirPods out, it turns off. And when it's on, it'll push you snippets of audio about the people, places and events that are nearby. To do that, you'll need to give the app access to your location and share a handful of keywords about your interests. You can also share your contacts to get updates from friends who are using the app. The BeeBot DJ, which of course has an AI voice, will then be able to talk to you as you go throughout your day and alert you to interesting events, landmarks or updates from friends who happen to be nearby. In some ways, it sounds like Crowley is trying to recreate some of the serendipitous IRL social interactions enabled by the original version of Foursquare. BeeBot doesn't have mayorships, badges or any of the gamification features that helped popularize Foursquare, but it's meant to have some of the same playful spirit of OG Foursquare, according to Crowley. Foursquare shut down its city guide app of the same name earlier this year, though its check-in app Swarm lives on. And because it's 2025, there's also a whole bunch of AI thrown in, including a mix of different LLMs and synthetic voices. 
The app is powered by a TikTok-style algorithm, Crowley says, but one that's focused on what's happening nearby and in real life. Speaking of Dennis and IRL, I only met Dennis in real life this week. That's despite me having interviewed him for the Internet History Podcast, which you'll hear soon, and him being an LP in the Ride Home AI fund for years now. Thank you, betaworks, for putting that IRL connection together. Talk to you tomorrow.
