Short Wave: Could AI Go Green?
Released May 9, 2025 | Hosts: Emily Kwong and Regina Barber | NPR
Introduction
In the episode titled "Could AI Go Green?" from NPR's Short Wave, hosts Emily Kwong and Regina Barber delve into the environmental challenges posed by the rapid expansion of artificial intelligence (AI). This episode, the second part of a miniseries, explores the significant energy demands of AI technologies, particularly large language models, and investigates innovative strategies aimed at reducing their ecological footprint.
Sasha Luccioni’s Personal Crisis and Mission
The episode opens with the story of Sasha Luccioni, a computer scientist whose mounting climate anxiety led her to leave her role as an AI researcher at Morgan Stanley in 2018.
“[I] essentially was getting more and more climate anxiety. I was really feeling this profound disconnect between my job and my values and the things that I cared about.” — Sasha Luccioni [01:52]
Motivated by her concerns, Sasha joined Hugging Face as the climate lead, dedicating her efforts to developing more sustainable AI models. Her journey underscores a growing movement within the tech community that seeks to balance AI innovation with environmental responsibility.
The Energy Footprint of AI
Emily and Regina highlight alarming projections regarding the energy consumption of data centers, which are the backbone of AI technologies.
By 2028, data centers in the United States alone could consume up to 12% of the nation’s electricity, equating to 580 terawatt hours ([02:32]). To put this into perspective:
“U.S. data centers alone could someday use a Canada size amount of energy.” — Regina Barber [02:46]
This surge is largely driven by the large language models powering virtual assistants like ChatGPT, Microsoft Copilot, and Google Gemini. These models require extensive computational resources, driving up both the electricity the servers draw and the water used to cool them.
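As a quick sanity check, the episode's two figures are mutually consistent: 580 terawatt-hours at a 12% share implies a total U.S. grid of roughly 4,800 TWh per year, and the Canada comparison works because Canada's annual electricity consumption is on the order of 550 to 600 TWh (that Canada figure is our own rough assumption, not from the episode). A minimal sketch of the arithmetic:

```python
# Figures quoted in the episode.
projected_dc_use_twh = 580   # projected U.S. data-center consumption by 2028
share_of_us_grid = 0.12      # projected share of total U.S. electricity

# Total U.S. electricity consumption implied by those two projections.
implied_us_total_twh = projected_dc_use_twh / share_of_us_grid
print(f"Implied U.S. total: {implied_us_total_twh:.0f} TWh/year")  # prints ≈ 4833

# Rough external figure (assumption, not from the episode): Canada uses
# roughly 550-600 TWh/year, so 580 TWh is indeed "a Canada-size amount".
canada_low, canada_high = 550, 600
print(canada_low <= projected_dc_use_twh <= canada_high)  # prints True
```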
Tech Giants’ Commitment to Nuclear Energy
In response to the escalating energy demands, major tech companies—Google, Meta, Microsoft, and Amazon—have set ambitious goals to achieve net-zero carbon emissions, with most aiming for 2030 and Amazon targeting 2040.
The discussion turns to three primary strategies these companies are employing to power their data centers sustainably:
- Nuclear Power Expansion
The tech giants are investing in nuclear energy to help expand global nuclear capacity by 2050; Microsoft, for instance, plans to revive the Three Mile Island nuclear plant in Pennsylvania. This shift marks a transformation in which AI companies are stepping into roles traditionally held by energy firms, striving to secure a more sustainable energy supply.
Innovations in Data Center Cooling
Beyond energy sourcing, improving the efficiency of cooling systems in data centers is crucial. The episode explores two main cooling innovations:
- Free Air Cooling Systems
This method involves designing data centers to utilize ambient cool air from the environment, eliminating the need for energy-intensive chillers.
- Liquid Cooling Technologies
Liquid cooling uses synthetic fluids, often circulated through cold plates that sit directly on the hottest chips, to absorb and carry away heat far more efficiently than blowing air through an entire machine. David Craig, the retired CEO of the liquid-cooling company Iceotope, captures the industry's habit of pushing limits:

“We are always that kid who does touch the very hot ring on the cooker when our mum said don't. We are always the people who touch the wet paint.” — David Craig [15:26]

Liquid cooling can reduce energy usage by up to 40% compared to air cooling and can eliminate water consumption, making it a pivotal innovation for green data centers.
The episode also touches on district heating, in which excess heat from data centers is redirected to warm nearby communities. Google's data center in Hamina, Finland, for example, supplies heat to 2,000 residents ([11:36]).
The Push for Smaller AI Models
While cleaner energy sources and better cooling are essential, another critical strategy discussed is the development of smaller, more efficient AI models. Sasha Luccioni advocates matching the model to the task:
“Nowadays, more and more, I think companies especially are like, for our intents and purposes we want to do X, like whatever, summarize PDFs, but you don't need a general purpose model for that.” — Sasha Luccioni [13:39]
Smaller models require fewer computational resources and therefore less energy. Companies like DeepSeek are pioneering this approach, building models with fewer active parameters and lighter training requirements:
“The idea behind a mixture of experts is you don't need a single huge model with a trillion parameters to answer every possible question under the sun.” — Benjamin Lee [13:10]
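The mixture-of-experts idea Lee describes can be sketched in a few lines. This is a toy illustration, not any real model's code: the "experts" and the keyword router below are invented stand-ins (a real router is a small learned network), but the structure shows why only a fraction of a system's total capacity does work for any given query.

```python
# Toy mixture-of-experts routing sketch. Each "expert" stands in for a
# small specialized model; the router picks one, so only that expert's
# parameters are exercised for a given request.

def make_expert(name):
    # Stand-in for a small task-specific model.
    return lambda text: f"{name} handled: {text}"

# Hypothetical expert pool; the names are illustrative only.
EXPERTS = {
    "summarize": make_expert("summarizer"),
    "translate": make_expert("translator"),
    "code": make_expert("coder"),
}

def route(text):
    # A real router is learned; keyword matching is a crude stand-in.
    for keyword, expert in EXPERTS.items():
        if keyword in text.lower():
            return expert(text)
    return EXPERTS["summarize"](text)  # default expert as a fallback

print(route("Please summarize this PDF"))  # -> summarizer handled: Please summarize this PDF
```

The payoff in real systems is that compute per query scales with the size of the expert actually chosen, not with the sum of all experts.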
Large general-purpose models still dominate, but the efficiency push is spreading: tech giants like Meta and Microsoft have released smaller variants of their own models.
Towards an Industry-Wide Energy Rating for AI Models
Sasha Luccioni proposes the creation of an industry-wide energy rating system for AI models, akin to the Energy Star ratings for appliances. However, she notes resistance from tech companies:
“There's like such a blanket ban on any kind of transparency because it could either like make you look bad, open you up for whatever legal action, or just kind of give people a sneak peek behind the curtain.” — Sasha Luccioni [14:48]
This lack of transparency hampers efforts to assess and improve the environmental impact of AI technologies systematically.
Conclusion: Balancing AI Advancement with Environmental Responsibility
Emily Kwong emphasizes the urgency of addressing AI's environmental impact, likening the current AI revolution to the discovery of electricity but with a known climate cost.
“Google CEO Sundar Pichai compared it to the discovery of electricity. Except unlike the people during the Industrial Revolution, we know that this has a climate cost.” — Emily Kwong [16:20]
Regina Barber adds a touch of humor while highlighting the impracticality of halting technological progress entirely:
“Where would we get our cat videos in the mail?” — Regina Barber [15:57]
The episode concludes on a hopeful note, suggesting that with continued innovation and conscious efforts, it is possible to steer AI development towards a more sustainable and environmentally friendly future.
Key Takeaways
- AI’s Growing Energy Demand: Large language models are driving up data centers' energy consumption, projected to reach up to 12% of U.S. electricity use by 2028.
- Tech Companies’ Strategies: Major tech firms are investing in nuclear energy and developing more efficient cooling systems to mitigate AI’s environmental impact.
- Innovation in Cooling Technologies: Liquid cooling and free air cooling systems offer substantial energy savings and reduce water usage in data centers.
- Advocacy for Smaller AI Models: Reducing the size and complexity of AI models can lower energy consumption without sacrificing functionality.
- Need for Transparency: An energy rating system for AI models could drive industry-wide improvements, though resistance from tech companies poses a significant challenge.
Notable Quotes with Timestamps
- “Essentially was getting more and more climate anxiety...” — Sasha Luccioni [01:52]
- “U.S. data centers alone could someday use a Canada size amount of energy.” — Regina Barber [02:46]
- “Now you're just cooling the surface...” — Benjamin Lee [09:42]
- “The idea behind a mixture of experts is you don't need a single huge model...” — Benjamin Lee [13:10]
- “Nowadays, more and more, I think companies...” — Sasha Luccioni [13:39]
- “There's like such a blanket ban on any kind of transparency...” — Sasha Luccioni [14:48]
- “We are always that kid who does touch the very hot ring...” — David Craig [15:26]
- “Where would we get our cat videos in the mail?” — Regina Barber [15:57]
- “Google CEO Sundar Pichai compared it to the discovery of electricity...” — Emily Kwong [16:20]
This episode of Short Wave provides a thorough examination of the environmental challenges posed by AI and the multifaceted approaches being explored to make AI technologies more sustainable. Through engaging storytelling and expert insights, Emily Kwong and Regina Barber illuminate the path toward a greener AI future.
