Transcript
A (0:00)
It's official: Apple has selected Google Gemini to power all of its new AI features, including Siri. This comes after a massive flop where Apple announced Apple Intelligence would be coming soon, sold a bunch of iPhones that were supposedly set up for it, and then kicked the can down the road, pushing the launch of Apple Intelligence a year into the future. They shipped a couple of small features, like text summary snippets that were basically as long as the original text and sometimes produced summaries people thought were ridiculous, but there wasn't really a lot of Apple Intelligence. And the big promises, where Apple Intelligence would take control of your phone, click around, and do things for you, none of that actually shipped. In the meantime, dozens of companies, including Anthropic with Claude, OpenAI, and many others, have shipped absolutely incredible versions of this, where AI takes control of your web browser; you have Perplexity's Comet. This technology is very possible, it's very doable, it's out there, and other companies are doing it. Apple was never able to execute on it, and what's worse is that they promised they would. I purchased an iPhone that was Apple Intelligence compatible; nothing happened. I purchased a new iPhone, same thing, and we still haven't gotten it. But we know this year is apparently the year Apple Intelligence is supposed to come out, and so it looks like they have just gone and selected Google Gemini, which should be able to power this functionality. That's going to be what powers Apple Intelligence.
This is a multi-year deal Apple is making with Google to use the Gemini model, and they're also using Google's cloud infrastructure. So this is basically the backbone for Apple foundation models, which is the company's internal name for the foundation layer that will power future Apple Intelligence features, including the long-delayed Siri update we've been talking about. Here's what they said about it on their blog: "After careful evaluation, we determined that Google's technology provides the most capable foundation for Apple foundation models and we're excited about the innovative new experiences it will unlock for our users." Google and Apple put out a joint statement, and it confirms quite a lot of rumors. This is something I've been talking about for months, and many others have too: Apple shopping outside its internal team to find outside model partners. They're apparently still trying to build their own on-device and private cloud stack, but they're not there, and with them being over a year delayed, they've just got to go find someone else. Bloomberg previously reported that Apple was circling a roughly $1 billion-a-year arrangement tied to Google Gemini access. They'd apparently tested a bunch of different options, including OpenAI and Anthropic, so it's interesting that they landed on Google. Apple and Google didn't disclose the financial terms in the statement they released, but I think the structure matters more than the sticker price. Basically, we know it's a multi-year commitment, and it gives Apple access to a frontier model family it can plug straight into Apple Intelligence.
It also leaves space to add other providers later. We already know they talked to OpenAI and Anthropic, and in the future, theoretically, they could have all these companies bid against each other to see who powers Apple. Sources covering the deal say it's not exclusive, which means Apple can route some of its queries to OpenAI, for example, if that saved them money, or to Anthropic if it was more capable at completing a certain task. Gemini seems to be the default and the main thing they're using here, but they can use others because of that non-exclusivity. It's important to remember that Apple has already shown it's willing to mix and match: in an earlier Apple Intelligence rollout, Siri could hand off certain requests to ChatGPT. Honestly, I never use it, because Siri is just a flaming pile of garbage in my opinion. Don't hate me; I have a MacBook and an iPhone, so I'm in the Apple ecosystem, I just really don't think it's useful for anything. But they did have a feature where, if Siri didn't know the answer to a question, it would shoot it over to ChatGPT. So that handoff functionality is built in for when you need broader world knowledge or a more expansive generative answer. Under the new arrangement, I think the center of gravity is shifting toward Gemini, with all of the other models potentially becoming optional layers rather than the default brain. Gemini is going to really power it; where they used to hand things off to OpenAI's ChatGPT, it will now go off to Gemini. Apple's pitch since it introduced Apple Intelligence has not been very consistent, though they'd say it has been.
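As an aside, here's roughly what that kind of non-exclusive, cost-versus-capability routing could look like in code. This is a purely illustrative Python sketch; the provider names, prices, and capability scores are all invented, and nothing here reflects Apple's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    cost_per_1k_tokens: float  # hypothetical pricing, not real rates
    capability_score: float    # hypothetical benchmark score in [0, 1]

# Hypothetical provider pool; every number here is made up for illustration.
PROVIDERS = [
    Provider("gemini", cost_per_1k_tokens=0.50, capability_score=0.95),
    Provider("openai", cost_per_1k_tokens=0.40, capability_score=0.90),
    Provider("anthropic", cost_per_1k_tokens=0.60, capability_score=0.97),
]

def route(query_complexity: float, budget_sensitive: bool) -> Provider:
    """Pick a provider: among those capable enough for the query,
    choose the cheapest (if budget-sensitive) or the most capable."""
    eligible = [p for p in PROVIDERS if p.capability_score >= query_complexity]
    if not eligible:
        # No one clears the bar: fall back to the single most capable model.
        return max(PROVIDERS, key=lambda p: p.capability_score)
    if budget_sensitive:
        return min(eligible, key=lambda p: p.cost_per_1k_tokens)
    return max(eligible, key=lambda p: p.capability_score)
```

So a simple photo-tagging query could land on the cheapest eligible model, while a hard reasoning task gets sent to the strongest one, which is the basic economic logic behind a non-exclusive deal.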
They're trying to make AI feel like a built-in feature of the operating system, not a separate destination app. That's been their pitch. The problem is that when they haven't been able to achieve it, they've just canceled it or kicked it down the road and said it's delayed a year, which I don't think is acceptable for users who purchased devices expecting to be able to use the features Apple advertised and announced. But regardless, maybe I'm just a little bit salty. Because of this, there are a lot of small, practical upgrades they have to make to actually make this work: smarter photo search, notification summaries, and pairing all of it with a big emphasis on privacy and data minimization. The thing is, Google has already done all of this for Android. I don't have an Android device; maybe that'll be the next one I buy. But Google has a lot of this integrated, which is really interesting, because Google has basically been piloting this for the last two years, and now Apple is going to rely on Google to do it in Apple's own ecosystem. It'll be interesting to see whether that becomes a long-term relationship. The joint statement Apple and Google put out really doubles down on the framing: Apple Intelligence is going to continue to run on device when possible, and via Apple's Private Cloud Compute. This is something cool and interesting, and I'll give Apple some flowers and stop roasting them for once. If they can really make these models run on the device without needing Internet access, that could be really cool. You could be off the grid, somewhere in the middle of nowhere with no Internet access, and it would still be useful. I will also say that with Starlink, I don't think anyone's going to have no Internet access in the next few years.
I think you'll have it basically anywhere. But yeah, this is an interesting pitch or feature Apple can show off, and when it does need more horsepower or more knowledge, it can kick stuff over to Gemini. This should be interesting. I think this is a really big cultural shift. Apple's brand is vertical integration; they own the full stack from silicon to software, and historically they've resisted dependencies that could shape the user experience. But AI has become something where even the most vertically integrated companies are basically forced to pick their battles: is it really worth developing our own AI model, or should we just use someone else's? I think Apple 100% could have and should have, and they didn't. But regardless of what happened, this is where we are today. Apple can keep its user interface, its routing logic, and its privacy layer, and it can keep some lightweight models on device, but it can still outsource a big chunk of the frontier model capability to a company like Google, which is putting billions of dollars into training and infrastructure every single year, and Apple's not keeping up. If they don't have the talent, team, or capabilities, they just have to outsource that. What's interesting is that a lot of the tasks you do on an iPhone don't need the latest and greatest AI model. You don't need the greatest reasoning to do a photo search, for example. But for some things you definitely do. So it feels like Apple is going to save a lot of money by doing some of these tasks on device, increase privacy, and maybe increase speed, and I do think that's going to be an interesting value add.
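To make the on-device-first idea concrete, here's a minimal sketch of that kind of local-versus-cloud escalation. Every name here is invented for illustration; these are not Apple APIs, and the task categories are just stand-ins for the "simple tasks stay local, hard tasks go to the frontier model" split described above:

```python
# Hypothetical task categories that a small local model could handle.
ON_DEVICE_TASKS = {"photo_search", "notification_summary", "text_classification"}

def run_local_model(prompt: str) -> str:
    # Stand-in for a small on-device model: private, fast, works offline.
    return f"[local] {prompt}"

def call_cloud_model(prompt: str) -> str:
    # Stand-in for a frontier model behind a network call (e.g. Gemini).
    return f"[cloud] {prompt}"

def answer(task: str, prompt: str, network_available: bool) -> str:
    """Route simple tasks to the local model; escalate hard ones to the
    cloud when a connection exists; degrade gracefully when offline."""
    if task in ON_DEVICE_TASKS:
        return run_local_model(prompt)
    if network_available:
        return call_cloud_model(prompt)
    return run_local_model(prompt)  # offline fallback, best effort
```

The interesting property is the last branch: even with no connectivity at all, the assistant still returns something, which is the off-the-grid scenario described above.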
And it's only for the most complex problems that they'll have to outsource the brain, basically. It would be great if they could do everything, but I think that's a good start. Apple has definitely taken a lot of heat, a lot from myself but from other people too, about how Siri has fallen behind all of the other AI assistants out there, whether that's Amazon's Alexa, which now has Alexa+, or Google's assistant; both of those are miles ahead, with a lot of new things shipping on them every day, while Apple has been quite slow. So even people who love Apple, like myself, have struggled with the gap between what's possible in the industry and what Apple is actually doing. I definitely think they should be miles ahead of where they are, but hopefully the timing and the pressure of all of this will kick Apple into gear and make them actually get this latest Siri update out this year. Apple can iterate on Siri's product experience, safety, and privacy constraints, and now it can rely on Gemini for a lot of the core model improvements, which move faster than Apple's internal cadence. That's sad to say, and I would like Apple to be further ahead, but I'll stop complaining. There's one other awkward part of this story: they've obviously made this big deal to work together, but there's a lawsuit going on. A federal judge ruled in 2024 that Google illegally maintained a monopoly in online search because of the default placement deals it built with big companies. Essentially, Google pays Apple to be the default search engine on iPhones, and it has paid tens of billions of dollars over the years for that placement.
So there's this whole lawsuit going on, and it's interesting because it could have big implications for what these kinds of deals look like, with Google and Apple now making a similar-feeling AI deal today, although I do think there are some differences. In December of just last year, Judge Amit Mehta, addressing these antitrust issues, ruled that Google would essentially be forced to limit these kinds of deals to a one-year term, forcing annual renegotiation rather than multi-year lock-ins. I guess the idea is that Apple can then negotiate better and bring other players in. I actually fail to see how this is going to make that much of a difference; the deal is still the deal, whether it's multi-year or one-year, and I feel like that should be up to Apple and Google. But in any case, a judge is saying it has to happen every year. In that context, this Apple AI deal with Google matters because, even though it's technically not exclusive, it's the same kind of partnership I think regulators are going to scrutinize for its market-shaping effects. The question is not just whether Google is paying for default placement, but whether Google becomes the default intelligence layer across billions of Apple devices, and what that means for the competitive landscape of AI platforms. It's also an interesting thing to think about from Google's perspective: this is distribution at nearly unmatched scale.
Reuters notes that Apple has more than 2 billion active devices, which expands Google's reach way beyond Google's own apps and its existing device partnerships. That's fantastic for Google. It's also a narrative win: for the last year, the public story has been that Apple is behind in AI. If Apple, the company most famous for waiting until technology is mature, is deciding that Gemini is the best available foundation model for its next assistant, that becomes third-party validation of Google's model progress. And if you look at Alphabet's stock, Wall Street has definitely noticed; after the announcement came out, Alphabet's market value and stock price went up. So here are the things I think we should watch next. First, how Apple routes requests: if Siri defaults to Gemini for most open-ended questions, Apple's earlier ChatGPT handoff becomes a secondary path, and I think the user experience is going to change really quickly. Second, how Apple enforces the privacy boundaries. Apple is promising to preserve its privacy posture, but the real test will be what data is sent off the device, what is logged, and how the system behaves under ambiguous or sensitive requests. Third, the regulatory reaction. With Google already under scrutiny for some of the deals it made with Apple in the past, any perception that Google is becoming a new kind of default on iPhones is going to attract attention from courts and regulators, even if the deal is structured differently than search.
So Apple is betting that it can keep the AI Apple-flavored through its product design and privacy architecture while letting Google supply a really big part of the raw intelligence. For users, the hope is simple: that Siri will finally feel less like a voice remote from 2014 and more like an AI assistant built for 2026. Only time will tell.
