A (5:56)
In the most efficient way that we can. So if, like me, you're thinking that of course Google has an incentive to make it seem like they're not using that much power or water, I think that is a healthy degree of skepticism to have toward a massive corporation. Right, so let's dig into their report and look at some numbers here. I'm reading right now from MIT Technology Review to start us off. They said Google has just released a technical report detailing how much energy its Gemini apps use for each query. In total, the median prompt, one that falls in the middle of the range of energy demand, consumes 0.24 watt-hours of electricity, which is the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini. So a caveat right out of the gate: we are talking only about text prompts. We're not talking about images or videos; we know those use more power. Just talking about text prompts, and just talking about Google's Gemini, too. We don't know about OpenAI, xAI, and all the others. So with that out of the way, they looked at, and they mentioned this in the video, not just the theoretical maximum utilization of all the chips they have, which is kind of how we've been doing the estimates until now. They looked at the energy demand that's actually being drawn by the chips. They also looked at all of the supporting hardware. So I'll hop over here on screen to Google's page about this. They took into account the full system dynamic power, so not just the energy and water used by the primary AI model during active computation, but also the actual achieved chip utilization at production scale, which can be lower than the theoretical maximum. They also took into account idle machines. You know, they have to have a bunch of systems that are offline.
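That microwave comparison is easy to sanity-check with a quick back-of-the-envelope calculation. The microwave wattage below is my assumption for illustration, not a figure from the report:

```python
# Sanity check of the "one second of microwave" comparison.
# Assumption (not from Google's report): a standard microwave draws ~900 W.
MICROWAVE_WATTS = 900
SECONDS = 1

# Convert one second of microwave use to watt-hours.
microwave_wh = MICROWAVE_WATTS * SECONDS / 3600
print(round(microwave_wh, 2))  # 0.25 Wh, close to the 0.24 Wh median prompt
```

So the comparison holds for any microwave in the roughly 800 to 1000 watt range.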
Not offline, but just sitting in the wings, ready to go, so that their system can stay reliable as demand increases and decreases. So they took into account the energy use, water use, power use, all of that, of these idle machines. They also didn't base it just off of GPUs; they looked at the CPU and RAM usage as well, because AI model execution doesn't happen only on the GPU, or Google's TPUs. The CPU and RAM also play a crucial role and use energy. And again, they also looked at the data center overhead, all of the stuff to support it: cooling systems, power distribution. That other overhead is wrapped up in a metric called power usage effectiveness. And of course, they also looked at water consumption. And again, there's that meme that has been persistent, that I still see to this day and that people bandy about as though it's true, that a single ChatGPT query dumps out an entire cup of water. What Google found was that a single Gemini text prompt, the average text prompt, uses approximately five drops of water. I've jumped back over to MIT Technology Review. They said that Google's custom TPUs, their proprietary equivalent of GPUs, account for just 58% of the total electricity demand. Another large portion of the energy is used by the equipment needed to support that hardware: the CPU and memory account for another 25%, the backup equipment ends up taking about 10% of the total, that's those idle machines, and the final 8% is data center overhead. So that's how it breaks down. Later in the article, they talk about how Google has made a lot of clean-energy purchases. They've signed agreements to buy over 22 gigawatts of power from renewable sources, including solar, wind, geothermal, and advanced nuclear projects. And because of those, Google's emissions per unit of electricity are, on paper, on average one third of the average grid's.
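The breakdown above can be turned into actual per-component energy figures by applying those shares to the 0.24 Wh median prompt. This is just arithmetic on the numbers quoted from MIT Technology Review; note the percentages sum to 101% in the source, presumably due to rounding:

```python
# Apportion the 0.24 Wh median prompt across the reported shares.
# Percentages are from the MIT Technology Review summary of Google's report;
# they sum to 101% in the source, presumably from rounding.
MEDIAN_PROMPT_WH = 0.24
shares = {
    "TPUs (AI accelerators)": 0.58,
    "CPU and memory": 0.25,
    "idle backup machines": 0.10,
    "data center overhead": 0.08,
}
for component, share in shares.items():
    print(f"{component}: {MEDIAN_PROMPT_WH * share:.3f} Wh")
```

The takeaway is that the accelerators themselves are only a bit more than half the story; the supporting hardware and overhead are a large fraction of every prompt's footprint.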
So Google does start out with an advantage here on the power usage front in terms of their investing in clean energy, unlike other places we've heard about spinning up fossil fuel plants to power AI, which is obviously not good, so bear that in mind too. Google estimates that each prompt consumes 0.26 milliliters of water, or about five drops. They go on with a quote here: "People are using AI tools for all kinds of things, and they shouldn't have major concerns about the energy usage or water usage of Gemini models, because in our actual measurements, what we were able to show was that it's actually equivalent to things that you do without even thinking about it on a daily basis, like watching a few seconds of TV or consuming five drops of water or using the microwave." And I think that's really the crucial thing that has bugged me in most of the discourse around AI power use. Everything you do that is electrical uses power, and you don't tend to think about it unless you're trying to make some kind of point. So as an example, I've had this power meter plugged into my computer for over a year, and I just check on it periodically while I do various tasks to see how many watts my computer is using instantaneously doing different things. Running Fortnite uses more power than generating an image on my machine locally, for instance. So all of those simplistic claims about how much power and water these things use never passed the smell test for me. This seems to be a bit more realistic. And again, I really want to see the numbers from other companies, because again, Google's got that advantage with their investment in cleaner energy and things like that. Their breakdown goes into more of the things they do to help optimize their energy use, so if you want to read deeper into it, you can. They also put out a research paper. It's only 10 pages long.
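The "about five drops" framing checks out against the 0.26 milliliter figure. The drop size below is my assumption, using the common convention of roughly 20 drops per milliliter; the report itself just says five drops:

```python
# Check the "five drops" framing for 0.26 mL per prompt.
# Assumption: one drop is ~0.05 mL (the common ~20 drops/mL convention);
# the report itself just rounds to "five drops".
ML_PER_PROMPT = 0.26
ML_PER_DROP = 0.05
drops = ML_PER_PROMPT / ML_PER_DROP
print(round(drops, 1))  # 5.2 drops
```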
I have it printed out, and I'm going to go read it this evening, because again, this came out just a few hours before the show, so I haven't had time to dig that deeply. But I did skim through this research paper a bit, and specifically I was looking for the parts about water, because I wanted to check something. Google has this thing where they pledged to return 120% of the water they use, and I wanted to see if they were doing something similar to, it's not really sneaky, but similar to the electricity, where they buy a lot of renewables to offset the average grid dirtiness wherever they're operating. Right? I wanted to see if they were doing that with water because of their water commitment. But actually, in the paper, they say that they only count their consumption. So I'm feeling more inclined to believe what they've put out here. I feel like this data isn't super manipulated; I feel like this is actually good data. But again, I am not an authoritative source on that. Other people who are much smarter and much more narrowly focused on this stuff, I hope, will come out with more detailed analyses. But from my own sort of reading through it, it does pass the smell test, as far as not seeming like something they did just for public relations. It seems like there's good information in here, and it also makes an intuitive kind of sense. Not that that is something you should really go on, you know, when it comes to science and data, but it wouldn't make sense if a single GPT query or Gemini query took a whole glass of water. That just would not be feasible at scale. And five drops for an individual person doesn't sound like a lot, but remember how many queries they're processing, right? So these things still do add up. And then, even with all of this, in these power discussions, data centers make up a single-digit percentage of energy usage. I wanted to have it ready to quote, but I don't.
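To put that "it still adds up" point in rough numbers: the per-prompt figures are tiny, but the totals scale with query volume. The daily prompt count below is purely a hypothetical round number for illustration, not anything Google has published:

```python
# Hypothetical scale-up: tiny per-prompt figures times large volumes.
# PROMPTS_PER_DAY is an assumed illustrative number, NOT from the report.
PROMPTS_PER_DAY = 1_000_000_000  # assumed one billion prompts per day
ML_PER_PROMPT = 0.26             # water, from Google's estimate
WH_PER_PROMPT = 0.24             # energy, median prompt

liters_per_day = PROMPTS_PER_DAY * ML_PER_PROMPT / 1000        # mL -> L
mwh_per_day = PROMPTS_PER_DAY * WH_PER_PROMPT / 1_000_000      # Wh -> MWh
print(f"{liters_per_day:,.0f} L of water, {mwh_per_day:,.0f} MWh per day")
```

At that assumed volume you'd get a few hundred thousand liters of water and a few hundred megawatt-hours per day, which is real but far from the doomsday memes.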
But I think it's less than 8%, somewhere around there, and it's projected to go up. But a lot of that is speculation on the rate of change of the rate of change, the second-order derivative or whatever. So I think this is great in terms of giving us more data points, more realistic data points, about how much power these systems use, because it's very important that we don't just destroy the climate. Obviously I'm very passionate about that, but I have seen the segment timer tick down to zero, so I'm going to let this one go. I'm going to link everything in the show notes in the description for you; you should go check it out. But AI doesn't seem to be the energy-sucking monster that everyone wants to paint it as. However, there are things like the scaling race and cramming AI calls into things that don't need them, like every single Google search or your pizza app generating a picture of your pizza before you order it. So this discussion around power usage really needs to be more nuanced. Ooh, I know I said I was going to end it, but okay, here's my last thought. Whenever you choose to do something, you have to understand the cost associated with it and whether you think it's worth it. So if an AI is just going to generate a picture of a pizza for you that you didn't ask for because you're ordering a pizza, that is a complete waste of power. You didn't want it to do that. But if you are using an AI to help you do something, it's doing that thing for you and using power, and that's pretty much the same as playing a video game like Fortnite or rendering an image in Blender or doing anything else that uses power, right? So it's all about that trade-off. And also, maybe I'm in a bubble where I just see a lot of these memes that are like, AI is going to burn us all down. Anyway, I'm going to cut myself off before I just start ranting.
We're going to move into our next segment right after this quick break, where we're talking about hierarchical learning models, a different kind of architecture that's really cool and extremely good at reasoning through tough problems. Stay right here. This episode is brought to you by Indeed. When your computer breaks, you don't wait for it to magically start working again. You fix the problem. So why wait to hire the people your company desperately needs? Use Indeed Sponsored Jobs to hire top talent fast. And even better, you only pay for results. There's no need to wait. Speed up your hiring with a $75 sponsored job credit at Indeed.com/podcast. Terms and conditions apply. Limu Emu and Doug. Here we have the Limu Emu in its natural habitat, helping people customize their car insurance and save hundreds with Liberty Mutual. Fascinating. It's accompanied by his natural ally, Doug. Uh, Limu, is that guy with the binoculars watching us?