Podcast Summary: Post Reports
Episode: "Talking to ChatGPT drains energy. These other things are worse."
Date: October 6, 2025
Host: Colby Ekowitz
Guest: Michael Coren, Climate Coach at The Washington Post
Episode Overview
This episode explores the environmental impact of AI chatbots like ChatGPT, particularly their energy and water consumption. Colby Ekowitz speaks with Michael Coren, The Post’s climate coach, to unpack how much energy these tools really use, how their environmental footprint stacks up against other everyday digital activities, and what meaningful steps individuals can take to reduce their own digital carbon footprint.
Key Discussion Points & Insights
1. Why Are People Worried About AI’s Energy Use?
- AI chatbots require more energy than basic internet functions.
- “Early estimates on how much energy and water these AI models use have been pretty significant. So reportedly about 10 times more electricity per query than a basic Google search.”
— Michael Coren [01:47]
- Panic over energy use in digital life isn’t new — previous fears about streaming (e.g., Netflix) were often exaggerated.
2. How Does ChatGPT (and AI) Use Energy?
- Every question to ChatGPT runs on energy-intensive data centers and chips (GPUs).
- “All these questions are being processed by GPUs ... running algorithms in massive data centers ... every step ... consumes electricity, which obviously requires power plants to run, and then they need to be cooled as well, and that requires water.”
— Michael Coren [02:43]
- AI models must first be trained (extremely energy-intensive), and then answer queries (inference), which consumes less energy per model than training but adds up across billions of requests.
- Example: Generating one AI image uses roughly as much energy as charging a smartphone, and even a simple text question uses several times more energy than a regular web search.
3. Quantifying AI Energy Consumption
- Typical AI query now: about 0.3 watt-hours (roughly enough to power an LED bulb for 2 minutes).
- Google’s Gemini is slightly lower at 0.24 watt-hours per query.
— Michael Coren [03:32]
- Not all models are equal:
- Newer, smaller models (e.g., DeepSeek, small language models) use much less energy while performing specialized tasks.
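The per-query figures above can be sanity-checked with quick arithmetic. A sketch, assuming a typical ~9 W LED bulb (the episode quotes the "2 minutes" equivalence but not the bulb wattage):

```python
# Back-of-the-envelope check of the per-query energy figures quoted above.
AI_QUERY_WH = 0.3        # watt-hours per typical AI query (from the episode)
GEMINI_QUERY_WH = 0.24   # Google's reported figure for Gemini
LED_BULB_W = 9.0         # assumed wattage of a typical LED bulb (not in the episode)

# Minutes an LED bulb could run on one AI query's worth of energy:
led_minutes = AI_QUERY_WH / LED_BULB_W * 60
print(f"One {AI_QUERY_WH} Wh query ≈ {led_minutes:.1f} minutes of LED light")

# The "10x a basic Google search" claim implies a basic search uses roughly:
basic_search_wh = AI_QUERY_WH / 10
print(f"Implied basic-search energy ≈ {basic_search_wh:.2f} Wh")
```

At 9 W, 0.3 Wh works out to exactly 2 minutes of light, matching the comparison in the episode.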
4. What Makes AI So Energy-Intensive?
- AI models work less like a search index and more like a brain: rather than just retrieving links, they generate language and ideas from training on much of the internet.
- “When you’re asking a query, it’s not just finding the right link, it’s actually functioning very similar to your brain. And that requires a lot of energy.”
— Michael Coren [04:40]
- Training a single AI model can consume energy on the scale of running a full-size power plant for days.
5. The Push for Energy Efficiency
- Companies are highly motivated to improve efficiency due to direct financial costs.
- Google reported a 10x increase in efficiency for some models over a short period.
— Michael Coren [06:41]
- Strategies include changing query timing, developing smaller/more efficient models, and operational tweaks.
- But there’s a catch: As efficiency increases, use might increase (Jevons Paradox).
- “As we get better and better in models, we’re going to see more and more applications for them and use more and more energy.”
— Michael Coren [07:48]
6. AI’s Future — Where Else Will It Show Up?
- AI will move beyond chatbots; future applications: cars, customer service, smart appliances, and more.
- “It just may become something that isn’t everything.”
— Michael Coren [08:48]
7. Worst-Case Scenarios
- Unchecked AI/data center growth can strain power grids, deplete local water supplies, and raise utility bills for everyone.
- “...we’re already seeing data centers around the world basically reduce the accessibility of fresh water, destabilize the grid, actually increase utility rates in some places...”
— Michael Coren [09:43]
8. How Does AI Compare to Other Digital & Everyday Life Activities?
- AI’s share of personal energy use is currently small.
- “TV viewing, that’s more than 100 times more energy for the average American than the eight or so standard search queries ... or AI image queries they do.”
— Michael Coren [13:32]
- Commuting, diet, home heating/cooling all have far bigger impacts than using AI chatbots.
- “You would have to search queries for about several thousand years to match the emissions it takes for the average American to get to and from work every year.”
— Michael Coren [15:07]
- The “big three” to focus on: what you eat, how you move, and how you heat/cool your home. [15:49]
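The scale comparisons above can be sketched with rough arithmetic. The per-query energy and the "eight or so" daily queries come from the episode; the annualization is a simple multiplication for illustration only:

```python
# Rough scale comparison using the episode's figures.
QUERIES_PER_DAY = 8      # "eight or so" standard search/AI queries per day
WH_PER_QUERY = 0.3       # watt-hours per typical AI query
TV_MULTIPLIER = 100      # TV viewing uses more than 100x that energy

daily_query_wh = QUERIES_PER_DAY * WH_PER_QUERY
yearly_query_kwh = daily_query_wh * 365 / 1000

print(f"Daily query energy: {daily_query_wh:.1f} Wh")
print(f"Yearly query energy: {yearly_query_kwh:.2f} kWh")
print(f"TV viewing: more than {daily_query_wh * TV_MULTIPLIER:.0f} Wh/day")
```

Even annualized, eight daily queries come to under 1 kWh per year, which is why commuting, diet, and home heating/cooling dominate the personal footprint.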
9. Practical Advice for Responsible AI Use
- Use AI for tasks where it’s the best fit; use regular search or simple models for easy questions.
- “You’re better off using either a search engine or some of the simple AI models for your simple questions.”
— Michael Coren [16:36]
- In time, consumers may be able to choose the most energy-efficient AI tools, but the landscape is still evolving.
Notable Quotes & Memorable Moments
- “Saying hello and please and thank you ... costs the company tens of millions of dollars in computing and energy bills.”
— Colby Ekowitz [00:02]
- “It somehow feels rude and wrong not to. ... When the machines eventually take over, will ChatGPT remember that at least I was polite?”
— Colby Ekowitz, on manners with bots [00:30]
- “I would never argue with getting on the good side of our robot overlords.”
— Michael Coren (lighthearted) [00:53]
- “Every dollar they spend on energy is a dollar they can’t recoup ... they are very rapidly trying to reduce the energy consumption.”
— Michael Coren [06:41]
- “There is nothing that you can do with AI short of maybe re-recording all of the greatest movies of the last 20th century that compares to the average US commute.”
— Michael Coren [15:07]
- “If you had to cut one thing out, the hamburger is usually the one we go to. ... 660 gallons for the average burger compared to 0.1 for even a thousand ChatGPT responses.”
— Michael Coren [15:52]
- “I actually don’t eat red meat. So I guess that means I can use all the AI.”
— Colby Ekowitz [16:14]
Timestamps for Important Segments
- 00:02 — Introduction: Manners with ChatGPT and energy usage
- 01:47 — Why AI models use so much energy
- 02:43 — How the internet (and AI specifically) consumes power
- 03:32 — Measuring energy per AI question/query
- 04:40 — Why AI’s architecture is energy-hungry
- 06:41 — Efforts and motivations to improve AI energy efficiency
- 07:48 — Jevons Paradox and possible rebound effects
- 09:43 — Risks of unchecked AI/data center expansion
- 13:32 — AI energy use compared to TV, internet, and other habits
- 15:07 — Emissions from daily commuting vs. digital activities
- 15:52 — The environmental impact of hamburgers vs. AI queries
- 16:14 — Colby’s personal relief: No red meat, bring on the AI
- 16:36 — Everyday rules for responsible AI use
Conclusion
While AI chatbots like ChatGPT are more energy-intensive than basic search or email, their real-world environmental impact currently pales in comparison to activities like TV viewing, commuting, or eating a hamburger. As AI efficiency improves, its applications will grow, and its share of total electricity usage will rise. But for now, Michael Coren emphasizes focusing on life’s bigger contributors to climate impact — our diets, travel habits, and energy use at home — rather than sweating over every ChatGPT question.
For listeners who want actionable takeaways:
- Don’t stress over everyday AI chatbot queries; their individual footprint is small.
- For most environmental impact, look at what you eat, how you travel, and how you heat or cool your home.
- Keep an eye on AI developments; more efficient and environmentally responsible models are on their way.
