
AI services have earned a reputation as energy-hungry beasts. How worried should we be about them versus the other emissions in our digital lives?
A
The company behind ChatGPT has told its users that being polite to their AI chatbot is expensive. Saying hello and please and thank you costs the company tens of millions of dollars in computing and energy bills. I don't really use ChatGPT that much, except sometimes for little things, like this week when I asked it: can dogs eat kiwi? The answer, by the way, is yes, but in moderation. I almost always say thank you. It somehow feels rude and wrong not to. I've heard so many things about how much energy ChatGPT devours, and hearing that announcement from them back in April, I've been wondering: is it worth it to get a quick answer to my stupid question? But then I also wonder, when the machines eventually take over, will ChatGPT remember that? At least I was polite.
B
Well, I would never argue with getting on the good side of our robot overlords.
A
From the newsroom of the Washington Post, this is Post Reports. I'm Colby Itkowitz. It's Monday, October 6th. Today we're talking about how AI chatbots earned a reputation as energy-hungry beasts. How bad are they really? And how do all the other things we do online compare? The Post's climate coach, Michael Coren, joins us to break it down. Michael, thank you so much for joining us.
B
Happy to be here. Thank you.
A
So, Michael, you write a lot about how our digital lives online impact our carbon footprint. So kind of set the table for me how these AI tools like ChatGPT fit into this larger conversation around the digital impact on the environment.
B
Yeah, a lot of people have been worried about it, and I think for some good reason. The early estimates of how much energy and water these AI models use were pretty significant: reportedly about 10 times more electricity per query than a basic Google search, generating one image is equivalent to charging your smartphone, and a single conversation uses liters of water. At least those were the original estimates. And so I was really curious if that's still true, because there was a panic a few years ago about Netflix, that every time you watched a half-hour show it was like driving four miles. And that turned out to be not true. Those estimates were off by a factor of 25 to 50, and so I wanted to look into this.
A
So, Michael, I'm going to ask you to really, like, dumb things down for me because I've always struggled to understand exactly how the Internet is using energy. Like, how does asking ChatGPT a question cost energy?
B
Sure. The Internet consumes energy because every question you ask requires electricity. It's that simple. All these questions are being processed by GPUs, specialized chips running algorithms in massive data centers. When you type something on your screen, that message gets sent over vast distances, it gets processed in those data centers, and then it gets sent back. Every step of that process consumes electricity, which obviously requires power plants to run, and the data centers need to be cooled as well, which requires water. So a single data center alone might consume the equivalent of thousands of households' worth of electricity per year.
A
Wow. Okay, so then how do you begin to calculate this question of, like, how much energy is used per question on ChatGPT?
B
So researchers have basically been lining up different AI models, peppering them with questions, and measuring how much power is consumed by the GPUs, by these chips, and then they have to make some extrapolations. It's not a perfect science, but basically they understand more or less how much energy is required per unit of runtime per chip, and then they make informed guesses. And over the last few years it appears they've been directionally correct, because the independent analysts' measures and the ones issued by the companies sort of line up. So, for example, the AI research firm Epoch AI estimated that a typical AI query now consumes about 0.3 watt-hours. That's enough to power a standard LED bulb for about two minutes. And when I wrote to Google, they confirmed by email that the median text response by its AI tool, Gemini, is slightly less, around 0.24 watt-hours.
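That LED bulb comparison is easy to sanity-check with a quick back-of-the-envelope calculation; the 9-watt bulb rating below is an assumed typical figure for a standard LED, not a number from the episode.

```python
# Back-of-the-envelope check of the "LED bulb for two minutes" comparison.
# Assumption: a standard LED bulb draws about 9 watts (not stated in the episode).
query_energy_wh = 0.3          # watt-hours per typical AI text query (Epoch AI estimate)
led_bulb_watts = 9             # assumed power draw of a standard LED bulb
minutes_of_light = query_energy_wh / led_bulb_watts * 60
print(f"One query could light the bulb for about {minutes_of_light:.0f} minutes")  # roughly 2 minutes
```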
A
And so why does it take so much energy? Like, we had a Post article describe it last year: getting ChatGPT to write an email would be like wasting a bottle of water. So what is it about AI that's taking up so much energy?
B
So unlike a search query, which treats the Internet as a fixed data set, finds the appropriate match, and returns a link or set of links, an AI model has actually digested the Internet and built a neural network around it. So it's almost like an artificial brain. And when you ask a query, it's not just finding the right link, it's actually functioning very similarly to your brain. And that requires a lot of energy. Your brain, I think, requires about 20% of your body's energy consumption. And so similarly, an AI bot is going to require a lot more energy than a simple text search.
A
That's so fascinating. So you mentioned Gemini, which is Google's AI, and how much energy it uses. So are all AI models guzzling the same amount of energy?
B
Yeah. So not all AI models are equal at all. You can choose between bigger models that use a ton of computing power and were assembled with massive training data sets, which means they sucked in the entire Internet, and others that are much slimmer. DeepSeek, released by a Chinese firm earlier this year, was using a fraction of the power of some of the big models and performing similarly. And then there's now what they call small language models. These are relatively tiny models that are easy to run on your smartphone, and they're useful for very specific tasks. And it seems like what we're going to see is that these massive models with billions of parameters, which are sort of akin to the neurons in your brain, are amazing for the scope and breadth of what they can do, but not necessarily more useful for some specific tasks. And so we may move to a world where the small language models do more of the specialized work and we find other uses for the large language models, instead of asking whether your dog can eat kiwis.
A
Yeah, probably not a good use of energy. So these larger language models, are they looking for ways to become more energy efficient?
B
Absolutely. You could think of the training process for these models, which requires hundreds of megawatt-hours (it's like running a full-size power plant for days and days just to train the model), as waste from the companies' point of view: every dollar they spend on energy is a dollar they can't recoup. And so they're very rapidly trying to reduce the energy consumption. I spoke with Google not that long ago and they said they were seeing tenfold increases in efficiency just in the early days, and that's not surprising; you see this in a lot of digital technology. They're working on multiple fronts. They're trying to change the timing of when queries are run or when the training runs happen, which can reduce emissions and make things more efficient for the grid. They're looking at smaller and more efficient models, and they're trying a bunch of different things that are both operational and related to the models themselves.
A
Tell me if I'm wrong here, but as efficiency improves, do we then start consuming more, not less? Does the efficiency get offset in some ways?
B
In some ways, yes. The Jevons paradox was first observed during the Industrial Revolution in England. Basically, what it found was that as coal-burning steam engines became more efficient, we didn't use less coal, we used more coal. And the reason was that as the price falls for that service or that energy, you're actually able to use more of it productively. And in this case, we may see something similar. As we get better and better models, we're going to see more and more applications for them and use more and more energy. Now the question is, can the efficiency improvements outpace the applications? That's an open question, because many of these companies do not release a lot of data around exactly how much energy all these models use, and outside researchers have to estimate it. So we don't know for sure whether we're seeing that paradox right now.
A
It seems like people are mostly using AI for these chatbot experiences, but how else do you think it's going to seep into other aspects of our lives?
B
Well, we will probably see AI not just in your chatbot window, but integrated into almost everything you have that's digital: your phone, your car, a customer service call, your stove. Your dishwasher may talk to the energy grid and decide, oh, it's time for me to run or not. There are so many applications that it will just fade into the background. That is hopefully, or maybe hopefully not, the future, because if it becomes so cheap that it's essentially ubiquitous, or free at the incremental level, it may just become something that is in everything.
A
Okay, but let's say this happens. AI has taken over every aspect of our lives. What is the worst-case scenario here when it comes to AI energy consumption? What could happen that we should be worried about?
B
So I think the worst-case scenario is that we keep building out this data center infrastructure and these models with no regard for how to make them more efficient long term and how the power consumption is going to affect everyone else. We're already seeing data centers around the world basically reduce the accessibility of fresh water, destabilize the grid, and actually increase utility rates in some places, just because they require so much electricity. And ratepayers are often the ones stuck paying for that infrastructure in the long run. So if we don't do this well, and think far ahead about not only how to be more efficient but how to site and power all of these data centers, we're going to end up with a lot of collateral damage for people who really were not consulted on this and are going to have to pay the price.
A
Well, let's take a break there. And after the break, I want to chat with you about the other emissions in our digital lives and how we should be managing them. We'll be right back.
C
To first responders, every second counts. Firefighters, paramedics, law enforcement officers, and 911 operators need to communicate instantly. And that's why FirstNet exists. FirstNet is the only network built with and for first responders. It's a unique public-private partnership with the FirstNet Authority, making sure America's first responders always have priority access, whether in the heart of a big city, a rural county, or a remote territory. During natural disasters, major storms, or large-scale events, first responders know they can count on FirstNet to keep them connected even when other networks are congested or unavailable. FirstNet is more than a network. It's a commitment to first responders, a promise that America's public safety always comes first. FirstNet, built with AT&T. Learn more at firstnet.com. Public safety first.
D
Think about why you listen to podcasts. It's like having a friend who makes you think or can help you wind down, right? Well, the Washington Post has a lot of people you can turn to at any hour. You can read the most important and interesting stories. We can help you cook something delicious, give you advice on a tricky friendship, or rave about a movie or book that you shouldn't miss. When you become a Washington Post subscriber, you have a companion for whatever part of your day needs it most. Get it all for just $4 every four weeks. That's for an entire year. After that, it's just $12 every four weeks. Cancel anytime. Go to washingtonpost.com/subscribe. That's washingtonpost.com/subscribe.
A
So I'm wondering, AI is everywhere now. Even if you don't realize you're using it, you might be passively using it. So should we be worried about our use of AI?
B
Well, it's still really early days, and I think while AI consumes more energy than a basic search, and the two have begun to merge, there have been such incredible gains in such a short time, and they're still accelerating. On the most basic level, AI algorithms today consume less than 1% of the energy that they required in 2008. And if you think about that, plus how fast things are improving, I think it's just too early to say where we're going to end up. But I do know that right now AI remains a really tiny part of our digital footprint.
A
Okay, so that's interesting. So do we have a sense of how much it's using? Like, AI is still small when compared to things like what? Email? Google?
B
Well, yes. Compared to other digital activities, it's a little bit more. But if you compare it to a lot of the other things we do in our lives, it really doesn't show up. For example, TV viewing: that's more than 100 times more energy for the average American than the eight or so standard search queries or AI image queries they do. And then Internet use on your computer, because you have a screen on and you're using the Internet, you're also consuming quite a bit more energy. Digital storage, video streaming, those are all going to be a bigger digital impact than asking a few text questions. I will say, if you're using AI to create long-form video, that's a different story. But for most people it just doesn't show up.
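A rough sketch of why TV time dwarfs a handful of text queries; the TV wattage, daily viewing hours, and treating the "eight or so" queries as a daily figure are all illustrative assumptions, not numbers from the episode.

```python
# Rough comparison: daily energy for AI text queries vs. daily TV viewing.
# Assumptions (illustrative only): a ~100 W television watched ~3 hours a day,
# and the episode's "eight or so" queries taken as a per-day figure.
queries_per_day = 8
wh_per_query = 0.3             # watt-hours per typical AI text query
tv_watts = 100                 # assumed TV power draw
tv_hours_per_day = 3           # assumed daily viewing time

query_wh = queries_per_day * wh_per_query      # ~2.4 Wh per day
tv_wh = tv_watts * tv_hours_per_day            # ~300 Wh per day
print(f"TV uses roughly {tv_wh / query_wh:.0f}x the energy of the queries")  # on the order of 100x
```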
A
I'm curious then, Michael. We've heard so much about how AI is draining our energy, but I don't hear that a lot when it comes to rotting on my couch and watching eight episodes of a TV show.
B
Yeah, I think we are very focused on what's new, and it is true that AI in aggregate is going to use a lot of electricity. It's expected to consume about 8% of the total electricity in the United States by 2030, compared to about 3% today. But the driver of that is not you asking very simple text questions. So, sure, think about it, but don't worry about that piece of it.
A
Because there are other things that we do in our lives, like watching TV, that are taking up more energy, of course.
B
But beyond TV, there are bigger things, for example, your commute. There is nothing you can do with AI, short of maybe re-recording all of the greatest movies of the 20th century, that compares to the average U.S. commute. You would have to run search queries for several thousand years to match the emissions it takes for the average American to get to and from work every year. And that gets back to this idea that there are really three things you should focus on: what you eat, how you move around, and how you heat and cool your home. Those are where you get the biggest bang for your buck when you're thinking about your own life.
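One way to gut-check the "several thousand years" claim is a back-of-the-envelope estimate; every input below except the per-query energy is an assumption chosen for illustration (commute mileage, workdays, tailpipe and grid emission factors), not a figure from the episode.

```python
# Back-of-the-envelope: annual commuting emissions vs. emissions from daily AI queries.
# All inputs are rough, assumed averages for illustration.
commute_miles_per_day = 30     # assumed round-trip commute
workdays_per_year = 250
kg_co2_per_mile = 0.4          # assumed average gasoline car

wh_per_query = 0.3             # per typical AI text query
queries_per_day = 8
kg_co2_per_kwh = 0.4           # assumed average US grid carbon intensity

commute_kg = commute_miles_per_day * workdays_per_year * kg_co2_per_mile     # ~3,000 kg/year
query_kg = queries_per_day * 365 * wh_per_query / 1000 * kg_co2_per_kwh      # well under 1 kg/year
print(f"Commute: ~{commute_kg:.0f} kg CO2/yr; queries: ~{query_kg:.2f} kg CO2/yr")
print(f"You'd need ~{commute_kg / query_kg:.0f} years of queries to match one year of commuting")
```

With these assumptions the ratio lands in the thousands of years, consistent with the claim in the episode.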
A
So what shouldn't I be eating?
B
Well, you know, if you had to cut one thing out, the hamburger is usually the one we go to, just because cattle use so much water: 660 gallons for the average burger, compared to 0.1 gallons for even a thousand ChatGPT responses, not to mention all the energy and methane emissions that come from them. So if we had to pick anything, that would be an easy one.
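Taking the episode's water figures at face value, the ratio is simple to work out:

```python
# Water footprint comparison using the figures cited in the episode.
gallons_per_burger = 660
gallons_per_1000_responses = 0.1

responses_per_burger = gallons_per_burger / gallons_per_1000_responses * 1000
print(f"One burger's water footprint equals about {responses_per_burger:,.0f} ChatGPT responses")  # ~6.6 million
```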
A
I will say this might not be very relatable, but I actually don't eat red meat. So I guess that means I can use all the AI.
B
You go right ahead. Now, about those kiwis.
A
So back to the AI chatbots. What are some simple rules that our listeners can follow if they want to use them more responsibly?
B
Sure. So I think you're better off using either a search engine or some of the simpler AI models for your simple questions. That is, I think, going to be the standard, and you won't even be able to avoid it; when you ask Google now, you're basically getting an AI response. The world is going to move the way it did to the search field, toward the AI chatbot field. You'll be able to choose models and choose companies that I think do this the most responsibly. It's still too early to say exactly who those are and how, but I think it'll become very clear over time. For now, a lot of these companies have sort of blown up their emissions targets because they're just scrambling to get electricity. But I think we're going to see those return, and we'll very quickly see much more efficient models and hopefully clearer data on what matters and what doesn't.
A
Well, Michael, I'd love to have more conversations with you about this, but until then, if I have any more questions, I'm just going to ask my ChatGPT, but I'll be very, very polite.
B
Great. Well, glad to hear that. Good luck.
A
Thanks so much for coming on.
B
Thank you. Glad to be here.
A
Michael Coren is the climate coach advice columnist for The Post. That's it for Post Reports. Thanks for listening. If you're looking for the latest updates on the big news of the day, check out our morning news briefing, The Seven. We bring you the seven stories you need to know about every weekday morning by 7 a.m. You can listen to it wherever you listen to podcasts. Today's show was produced by Renny Siernofsky. It was edited by Ted Muldoon and Rena Flores and mixed by Sean Carter. Thanks to editor Marisa Bellack. I'm Colby Itkowitz. We'll be back tomorrow with more stories from The Washington Post.
E
You listen because you know the power of good journalism, and the Washington Post is there for you 24/7. When you become a Washington Post subscriber, you get exclusive reporting you can't find anywhere else. You also get sharp advice columns, delicious recipes, TV and music reviews, and so much more. Right now, you can get all of that for just $4 every four weeks. That's for an entire year. After that, it's just $12 every four weeks. And you can cancel anytime. Add to your knowledge and discover all the Post has to offer. Go to washingtonpost.com/subscribe. That's washingtonpost.com/subscribe.
Date: October 6, 2025
Host: Colby Itkowitz
Guest: Michael Coren, Climate Coach at The Washington Post
This episode explores the environmental impact of AI chatbots like ChatGPT, particularly their energy and water consumption. Colby Itkowitz speaks with Michael Coren, The Post's climate coach, to unpack how much energy these tools really use, how their environmental footprint stacks up against other everyday digital activities, and what meaningful steps individuals can take to reduce their own digital carbon footprint.
While AI chatbots like ChatGPT are more energy-intensive than basic search or email, their real-world environmental impact currently pales in comparison to activities like TV viewing, commuting, or eating a hamburger. As AI efficiency improves, its applications will grow, and its share of total electricity usage will rise. But for now, Michael Coren emphasizes focusing on life’s bigger contributors to climate impact — our diets, travel habits, and energy use at home — rather than sweating over every ChatGPT question.
For listeners who want actionable takeaways: use a regular search engine or a smaller model for simple questions, don't sweat individual chatbot queries, and focus instead on the bigger levers of what you eat, how you get around, and how you heat and cool your home.