Transcript
A (0:00)
Google, in a recent all-hands meeting, said that they have to double their AI compute every six months in order to meet demand. There is an absolutely massive, insatiable demand for Google's AI features. We just had Gemini 3.0 come out, so we saw a big spike from that. They also just came out with their latest version of Nano Banana Pro, so we saw a huge spike in AI image generation demand. There's so much more coming out of Google, and we're going to break it all down on the show today: where they are today, where I see them in the future, and some of the predictions they're making over the next four to five years. It is absolutely incredible. Let's get into all of it. Before we do, I wanted to mention that if you want to try any of the AI models I talk about on the show, Gemini included, there's Claude, there's Grok, there's OpenAI, there's everything from ElevenLabs for audio, and tons of interesting image generators. Go check out AI Box AI. That's my own startup, where I let you test all of the different AI models and compare them side by side. It's 20 bucks a month, and you get access to all of the different tools in one place. I hope it saves you a ton of money on subscriptions, but it also just makes your life easier, not having to log into tons of accounts and juggle lots of passwords. So go check it out; there's a link in the description: AI Box AI. All right, let's get into what's going on at Google. At a recent all-hands meeting, Google's head of AI infrastructure, Amin Vahdat, said that Google right now has to race to build out its compute capacity in order to meet all of the demand it has. He was giving a presentation, and he included a slide that said, quote, now we must double every six months; the next 1000x in four to five years.
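As a quick sanity check on that slide's math: doubling every six months compounds to about 1000x in roughly five years, since ten doublings give 2^10 = 1024. Here is a minimal sketch of that compounding arithmetic in Python; the figures are just the numbers from the quote, not anything Google has published:

```python
import math

# How many doublings does it take to reach ~1000x capacity?
doublings = math.log2(1000)   # ≈ 9.97 doublings
years = doublings * 0.5       # one doubling per six months
print(round(doublings, 2))    # → 9.97
print(round(years, 2))        # → 4.98, i.e. "four to five years"
```

In other words, the two halves of the slide are consistent: a six-month doubling cadence is exactly what a 1000x scale-up over four to five years requires.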
Sundar Pichai, Google's CEO, spoke at the same meeting and took questions from employees about a quote, unquote, potential AI bubble burst and about capital expenditures. So we're going to get into all of that today. The one thing that I thought was really interesting here is what Vahdat said: quote, the competition in AI infrastructure is the most critical and also the most expensive part of the AI race. He gave this whole presentation just a week after Alphabet reported their third-quarter results, where they beat expectations. They raised their capital expenditure forecast for this year to a range of about $91 billion to $93 billion, and they're also planning a significant increase in 2026. Of course, we have other players you'd call hyperscalers as well: Microsoft, Amazon, and Meta. All of them have also boosted their capex guidance, and the four companies now expect to collectively spend more than $380 billion this year, which is absolutely insane. So Google's job is going to be building infrastructure, but it's not just to try to outspend the competition; that isn't the goal. Although a lot of people see this as this weird race, right? I mean, I think we even get this from moments like when you had Donald Trump with all of the top tech executives around a big roundtable, and he was like, how much are you spending? Like, Tim Cook, how much are you spending? Mark Zuckerberg? And Mark Zuckerberg is like, oh, probably like 600 billion in the next few years. And everyone's like, oh, like 100 billion or 600 billion. I don't know, it was this weird group of people all saying how much money they're planning on spending. And I get the sentiment, which is like, oh, we're all spending inside of America.
It's going to generate jobs, it's going to be good for the stock market, it's good for the economy. I get the vibe there. It just felt like the most absurd conversation I'd ever seen. I mean, overall I'm bullish on America and I like the vibe, but it's kind of funny. So Vahdat was outlining, look, we're not just trying to outspend anyone. He said they're going to spend a lot, but the real goal is to provide infrastructure that is, quote, more reliable, more performant and more scalable than what's available anywhere else. So I think, in addition to all of these infrastructure buildouts, Google is also going to try to increase capacity by creating more efficient models. They have custom silicon, so they're able to do that. A lot of times we say, oh, we need to build more infrastructure because the AI model needs so much compute, but if you can actually make your model more efficient, then you can use less compute. It's a tricky place, because you can either focus time on making your model better or on making it more performant, and those are kind of two different paths. Sometimes you can do both at the same time. But we tend to see that the better the AI model is, the more compute it was given. In OpenAI's case, back when they had GPT-4o, I believe, they were like, look, if we give 4o $1,000 or $10,000 of compute to answer a question, it just does it way better. And it got to this tricky place where it was basically like, okay, well, however much compute we can get our hands on, our model will just get that much better. And of course they are also training better models, like GPT-5, and the brand new models keep getting better and better.
But that being said, there's this thing at the back of everyone's mind, which is that if you had more compute and you could spend $10,000 on every query, your answers would be a lot better. So anyways, it's a tricky position to be in. Obviously the economics of it wouldn't make any sense, but when you're trying to make a better model and make it look like you're better than everyone else, you tend to spend a lot, and you could over-index in that area. So last week Google announced the public launch of their seventh-generation TPU, or Tensor Processing Unit, which they're calling Ironwood. The company says it is 30 times more power efficient than the first Cloud TPU they rolled out in 2018. They also said the company has a really big advantage because they have Google DeepMind, which now seems like the most obvious acquisition of all time: they acquired DeepMind out of London many years ago, and it's now basically the head of all of their AI, with a lot of interesting research coming out about what models are going to look like in the future. So I think what's interesting is that Google right now needs to be able to deliver a thousand times more capability in compute, storage, and networking for essentially the same cost and, increasingly, the same power and energy level. That's what Vahdat recently said. He said it won't be easy, but through collaboration and co-design, we're going to get there. Think about that for a second, everyone: a thousand times more capability in compute, storage, and networking for the same cost. So they really have to focus on not just making their models better; Google also has to think about economies of scale because of how many users they have. And users are used to getting things like Gmail and YouTube and Google Search for free. I mean, there are obviously ads in there that fund and fuel it all.
But because everyone's used to using this for free, when Gemini comes out and is just embedded into every product, it's like, wow, Gmail and YouTube and Google are so much better now that Gemini is helping answer the questions. But there is an associated cost with all of those things, and if they want to scale up a thousand times, they have to figure out how to do that without increasing energy usage and without increasing cost, which is an absolutely massive problem. Sundar was talking to employees, and he said that next year is going to be very, quote, intense. He said that the AI competition and the pressure from cloud and compute demand are going to be really, really intense. He answered a whole bunch of questions about the AI bubble, right? Everyone's been talking about how the AI bubble is going to pop and there's not going to be a lot of money left to build out all of these data centers that are already underway. Silicon Valley, Wall Street, everyone is talking about this AI bubble. We had a recent event, which was that Nvidia came out with their Q3 earnings, beat expectations, and posted some insane earnings numbers. And they were like, look guys, I know you're all talking about an AI bubble, but we're selling more and more chips, so we're not seeing it from here. Someone was reading a question and asking Sundar about this, though, and they said, amid significant AI investment and market talk of a potential AI bubble burst, how are you thinking about ensuring long-term sustainability and profitability if the AI market doesn't mature as expected? Sundar responded to that. And I will say, it's not like this is an open mic where anyone asks a question; someone's reading a prepared statement, and Sundar had a prepared answer. So, I mean, it is what it is; it's basically the communications department talking to us. But this is what Sundar said.
He said it's a great question. It's definitely been in the zeitgeist; people are talking about it. He essentially just reiterated a point that he's made a lot in the past, which is the risk of not investing aggressively enough. I think everyone knows that if Google doesn't invest very aggressively, OpenAI will essentially replace Google Search. People are already making a massive shift in that direction. And I think Google's focus on a lot of these AI features, and having the AI snippets at the top, has helped them not become completely irrelevant and is keeping them from losing more market share. Because ChatGPT has 800 million weekly active users, and those users and that usage are coming from somewhere, and Google is a massive part of that. So they've had to be quite aggressive in how they approach the problem in order to make sure that they are not getting eaten alive. Essentially, the point that he made is that Google's cloud business just had 34% annual revenue growth, which is more than $15 billion in the quarter, and its backlog reached $155 billion. So he said, quote, I think it's always difficult during these moments because the risk of underinvesting is pretty high. I actually think of how extraordinary the cloud numbers are. Those numbers would have been much better if we had more compute. That is also a very interesting point: they actually could have made a lot more money if they had more compute. They could have sold it. There was demand for more compute, but they didn't have the availability, so it just went to other players. And you have tons of these data center compute companies popping up and raising billions of dollars. A lot of them maybe go and acquire an old crypto mining startup that had a whole bunch of GPUs, and now they just sell or rent them out to AI companies.
There's a whole bunch of areas where people are making billions of dollars that, technically, if Sundar had all that compute over at Google, he could have been making. But Google ran out of compute, so customers had to go to other sources, even some random startups that aren't quite as big or as integrated, because people just need the compute. So he said that the company right now is following a really disciplined approach; they're trying to strengthen their underlying businesses and their balance sheet. He said, quote, we are better positioned to withstand misses than other companies. Which is interesting. He may have been pointing at Meta, who seem to be in a tricky place. It doesn't feel like their AI is really becoming a market leader. Zuckerberg has spent an insane amount of money trying to hire top researchers, and it's just really not gaining traction. I don't know a lot of people who use Meta AI of their own free will because they like it. So I think there are going to be ups and downs in the market for sure. Sundar says it's a very competitive moment, so you can't rest on your laurels: we have a lot of hard work ahead again, but I think we're positioned through the moment. So obviously a lot is going on right now, and Google is really focusing on not getting left behind. I think this is the right move for them. We keep seeing incredible models, and the usage is quite impressive: how many people are actually using Gemini and Nano Banana and all of their different AI models, and how those models are getting plugged into all of the different services. I think Google has taken the absolute right approach. Now I think we just watch and see if they can keep up all of this momentum. Thank you so much for tuning into the podcast today. Make sure to go check out AI Box AI if you want to try out all of the models I'm talking about on the show today. For $20 a month, it's AI Box AI.
There's a link in the description. Also, make sure to leave a rating and review wherever you get your podcasts. If you're on Spotify, it's the About tab; otherwise, on Apple, you can scroll down and drop some stars. I really appreciate it, and it helps the show out a ton. Hope you have a great rest of your day.
