Consider This from NPR: AI and the Environment
Release Date: March 30, 2025
Host: Emily Kwong
Introduction
In the episode titled "AI and the Environment," NPR's Emily Kwong examines the growing environmental footprint of artificial intelligence (AI). As AI technologies advance at an unprecedented pace, concerns over energy and water consumption, as well as carbon emissions, have come to the forefront. The episode explores the challenges and solutions associated with making AI more sustainable, featuring insights from experts and industry leaders.
The Personal Impact: Sasha Luccioni's Journey
Emily Kwong opens the discussion with the story of Sasha Luccioni, an AI researcher who joined Morgan Stanley in 2018. Despite her excitement about working in AI, Luccioni grappled with "climate anxiety" because of the disconnect between her professional role and her personal values.
Sasha Luccioni (00:10): "I essentially was getting more and more climate anxiety. I was really feeling this profound disconnect between my job and my values and the things that I cared about."
Facing this internal conflict, Luccioni contemplated leaving her job to engage in more environmentally impactful work, such as tree planting. However, encouraged by her partner, she decided to leverage her AI expertise to contribute positively to environmental sustainability.
Sasha Luccioni (00:36): "Maybe you can use that to make a difference in the world."
Luccioni ultimately joined the movement to make AI more sustainable, reflecting a growing trend among professionals seeking to align their careers with environmental stewardship.
AI's Growing Environmental Footprint
Since 2022, the AI sector has experienced exponential growth, leading to a significant surge in energy consumption. The construction of data centers—vast infrastructures housing hundreds of thousands of computers—has intensified this impact.
Lawrence Berkeley National Laboratory projects that by 2028, data centers could consume up to 12% of the United States' electricity.
Benjamin Lee (01:36): "Under current infrastructure investment plans, you could possibly achieve those net zero goals."
AI's expansion has also intensified concerns about water consumption, which David Craig describes bluntly:
David Craig (01:08): "The amount of water that AI uses is astonishing."
This heightened resource demand has sparked widespread discussions on social media and among environmental advocates about the sustainability of AI technologies.
Big Tech's Climate Commitments and Challenges
The episode examines the climate goals set by major tech companies—Google, Microsoft, Amazon, and Meta—all of whom aim to reach net-zero carbon emissions, most by 2030 and Amazon by 2040. These companies are exploring various strategies to achieve these targets.
One prominent approach is investing in nuclear energy. In a significant move, Amazon, Meta, and Alphabet (Google's parent company) have signed agreements supporting a tripling of global nuclear energy capacity by 2050. Microsoft has committed to purchasing power from the historic Three Mile Island nuclear plant in Pennsylvania, signaling renewed interest in nuclear power.
Emily Kwong (04:37): "Microsoft has committed to buying power from an old nuclear plant on Three Mile Island in Pennsylvania."
However, integrating nuclear energy with AI infrastructure presents unique challenges. Benjamin Lee of the University of Pennsylvania highlights the difficulty of reconciling the rapid innovation pace of Silicon Valley with the conservative, risk-averse nature of the nuclear industry.
The lengthy and costly process of developing nuclear energy infrastructure contrasts sharply with the faster, more cost-effective deployment of renewable energy sources like solar and wind. This discrepancy poses a significant hurdle for big tech companies striving to meet their sustainability goals amidst a rapidly expanding AI landscape.
Innovative Solutions: Enhancing Data Center Efficiency
To mitigate the environmental impact, tech companies are investing in making data centers more energy-efficient. One such innovation is liquid cooling—a method that uses synthetic fluids to absorb and dissipate heat from servers more effectively than traditional air cooling systems.
David Craig, the retired CEO of Iceotope, emphasizes the benefits of liquid cooling:
David Craig (06:25): "With liquid cooling, a lot of the heat stays in the system and computers don't have these massive swings in temperature."
Liquid cooling not only reduces energy consumption but also extends the lifespan of hardware by minimizing thermal stress and mechanical wear from fans. Companies like Iceotope are partnering with industry giants such as Hewlett Packard and Intel to scale this technology, with some Meta data centers expected to implement liquid cooling by 2026.
Despite its advantages, liquid cooling remains an expensive solution, limiting its widespread adoption. Nonetheless, it represents a critical step toward more sustainable AI operations.
Smaller AI Models: A Path to Reduced Energy Consumption
Another promising avenue for reducing AI's environmental footprint is the development of smaller, task-specific language models. Unlike large language models (LLMs) like ChatGPT, which require immense computational resources, smaller models are designed to handle specific tasks with significantly lower energy demands.
Sasha Luccioni (08:57): "Nowadays more and more I think companies especially are like, well actually for our intents and purposes we want to do X, like whatever, summarize PDFs."
Benjamin Lee describes this approach as a "mixture of experts," where a collection of smaller, specialized models collectively perform tasks more efficiently than a single, large model.
Benjamin Lee (08:23): "You don't need a single huge model with a trillion parameters to answer every possible question under the sun."
Benjamin Lee (08:46): "Because each expert is so much smaller, it's going to cost less energy to invoke."
Companies like Meta, Microsoft, and Amazon are experimenting with smaller models to balance AI capabilities with sustainability. Innovations such as DeepSeek's chatbot in China exemplify efforts to develop energy-efficient AI solutions, though some skepticism remains about their scalability and true energy savings.
Towards Industry-Wide Sustainability Standards
Despite these advancements, there is a pressing need for standardized measures to evaluate the energy efficiency of AI models. Sasha Luccioni advocates for an industry-wide score for AI models, akin to the Energy Star ratings for household appliances, to promote transparency and accountability.
Sasha Luccioni (10:26): "We're having a lot of trouble getting buy-in from companies. There's like such a blanket ban on any kind of transparency because it could either like make you look bad, open you up for whatever legal action, or just kind of give people a sneak peek behind the curtain."
However, resistance from tech companies, concerned about reputational risks and proprietary information, has hindered progress toward such standardized evaluations.
The Future of AI and Environmental Sustainability
As the AI revolution continues to unfold, the episode poses a critical question: Do we truly need the vast computing power that AI demands, given its potential to undermine climate goals? While some, like David Craig, express skepticism about humanity's willingness to change behaviors for sustainability, others remain hopeful.
David Craig (11:00): "We're always that kid who does touch the very hot ring on the cooker when her mum said, don't you know we are always the people who touch the wet paint sign and stuff, right?"
Google CEO Sundar Pichai draws a parallel between AI and the discovery of electricity, emphasizing that unlike past technological revolutions, we are now aware of AI's environmental costs and have the opportunity to steer its development responsibly.
Emily Kwong (11:23): "Google CEO Sundar Pichai compared AI to the discovery of electricity. Except unlike the people during the Industrial Revolution, we know AI has a big climate cost, and there's still time to adjust how and how much of it we use."
The episode concludes with a call to action for both the tech industry and consumers to prioritize sustainability in the ongoing AI expansion, underscoring the delicate balance between technological advancement and environmental preservation.
Produced by: Avery Keatley and Megan Lim
Audio Engineering: Ted Mebane
Edited by: Adam Raney, Sarah Robbins, and Rebecca Ramirez
Executive Producer: Sami Yenigun
For more science reporting, tune in to Emily Kwong's co-hosted podcast, Short Wave.
