Society

AI’s hidden costs are accelerating the climate crisis

While AI has helped make progress in addressing serious climate issues, it appears its harms may outweigh its benefits
Matt Palmer/Unsplash

Every time you ask your virtual assistant to play a song, suggest a restaurant, or answer a question, you are connecting to a massive network of artificial intelligence systems. Few people recognize that these seemingly innocuous actions are part of a larger environmental problem: AI is quietly accelerating climate change.

While AI has helped make progress in addressing serious climate issues by regulating heating and cooling in buildings and predicting natural disasters, it appears its harms may outweigh its benefits. AI systems rely on enormous data centers that run around the clock and require vast quantities of electricity, much of which is still sourced from fossil fuels. While artificial intelligence promises advances in productivity, its environmental impact is often overlooked in the pursuit of technological development. 

The process of training AI models is one of the most energy-intensive steps of the AI lifecycle. Large-scale models such as OpenAI’s GPT-3 require immense computational power, with thousands of processors operating for days or even weeks to churn through massive datasets; by one estimate, training GPT-3 alone consumed around 1,300 megawatt-hours (MWh) of electricity, and training a single large deep learning model has been estimated to emit as much carbon as five cars over their entire lifetimes. As the competitive race to build ever-larger and more complicated models continues, these energy demands will only rise. Every new training run raises the environmental cost, complicating the claims of AI-driven efficiency and innovation.
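For a rough sense of scale, that 1,300 MWh can be translated into household terms with simple arithmetic (a back-of-envelope sketch, assuming an average U.S. household uses about 30 kWh of electricity per day):

```python
# Back-of-envelope: GPT-3's reported training energy in household terms.
# Assumptions: ~1,300 MWh total training energy; ~30 kWh/day per U.S. home.
TRAINING_ENERGY_KWH = 1_300 * 1_000  # 1,300 MWh expressed in kWh
HOME_DAILY_KWH = 30                  # average U.S. household, per day

home_days = TRAINING_ENERGY_KWH / HOME_DAILY_KWH  # ~43,333 home-days
home_years = home_days / 365                      # ~119 homes for a full year
print(f"{home_days:,.0f} home-days (~{home_years:.0f} homes for a year)")
```

In other words, one training run uses roughly as much electricity as a hundred-odd homes draw in a year.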

AI uses huge data centers to store, process, and transfer the data that drives its applications. In 2022, data centers consumed around 1–2% of global electricity, a share that is expected to rise as AI adoption grows. The environmental impact is particularly severe in regions whose grids rely on coal or other fossil fuels: coal-heavy grids supply data centers in parts of the United States and China, greatly increasing the carbon footprint of the AI systems they host. Without a shift to renewable energy, AI’s dependence on fossil fuels may significantly increase greenhouse gas emissions.

Beyond training, AI systems use substantial amounts of energy during real-time operation. Virtual assistants like Siri and Alexa, chatbots, and AI recommendation engines depend on cloud servers to manage millions of user interactions daily. Each individual query might appear trivial, but the combined requests of millions, even billions, of users quickly accumulate. A single Google search uses roughly 0.0003 kilowatt-hours (kWh) of energy. About 8.5 billion searches happen daily, consuming roughly 2.55 million kWh of energy each day. That’s enough to power around 85,000 U.S. homes for a day, based on an average consumption of 30 kWh per household.
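The arithmetic behind those figures is easy to check (a sketch using only the numbers cited in the paragraph above):

```python
# Verifying the search-energy back-of-envelope from the paragraph above.
KWH_PER_SEARCH = 0.0003    # energy per Google search, in kWh
SEARCHES_PER_DAY = 8.5e9   # ~8.5 billion searches daily
HOME_DAILY_KWH = 30        # average U.S. household consumption per day

daily_kwh = KWH_PER_SEARCH * SEARCHES_PER_DAY  # ~2.55 million kWh/day
homes_powered = daily_kwh / HOME_DAILY_KWH     # ~85,000 homes for a day
print(f"{daily_kwh:,.0f} kWh/day, enough for {homes_powered:,.0f} homes")
```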

As AI integrates into daily life, its energy consumption increasingly threatens sustainability and climate goals.

The far-reaching environmental toll

The hardware that drives AI, from specialized processors to data center infrastructure, has its own set of environmental effects. From resource extraction to disposal, the AI hardware lifecycle adds to pollution, habitat damage, and an escalating e-waste problem.

Despite advancements in renewable energy, AI still has a significant environmental impact from hardware production, which requires energy-intensive mining of rare earth metals. In 2020, global e-waste reached 53.6 million metric tonnes, but only 17.4% was recycled. While this figure includes all electronic waste, AI-related hardware contributes to this growing issue.

AI development requires specialized hardware, such as Graphics Processing Units (GPUs), Tensor Processing Units (TPUs), and other processors, to meet the immense computing needs of machine learning. Manufacturing these components depends on mining critical minerals such as lithium, cobalt, and nickel, along with rare earth elements. Extracting and refining these resources is energy-intensive and environmentally damaging, often causing deforestation, water pollution, and high carbon emissions, which makes hardware manufacturing one of AI’s most carbon-heavy activities. In 2021, the global semiconductor industry emitted around 76.5 million metric tons of CO₂ equivalent, with roughly 80% of those emissions coming from electricity used in manufacturing, according to a study published on HAL Science and analysis by the Boston Consulting Group. That is comparable to the yearly carbon footprint of countries such as Chile or Finland.

As newer, more advanced hardware becomes available, older components rapidly become obsolete, generating a continual stream of discarded electronics, most of which wind up in landfills or are poorly recycled. AI-related hardware, such as GPUs and data center equipment, is especially hazardous because of its complexity and toxic contents, including heavy metals and non-biodegradable polymers. According to a study published in the journal Nature Computational Science, the rapid adoption of large language models (LLMs) is expected to produce 2.5 million tonnes of e-waste by 2030. This waste comes not just from the servers and hardware used to train and deploy LLMs, but also from the devices and infrastructure that depend on them, such as smartphones, smart speakers, and other AI-enabled gadgets.

Improper e-waste disposal harms both the environment and human health. Discarded devices frequently end up in landfills, where rainwater can leach toxic substances such as lead and mercury into the soil, eventually contaminating rivers or groundwater. These pollutants taint water sources and harm fish and wildlife: mercury builds up in fish, which larger animals and humans then consume, spreading through the food chain, while lead contamination in soil damages plants and animals, disrupting ecosystems and lowering biodiversity. Mercury poisoning in fish has been linked to declines in fish-eating birds, such as ospreys, in polluted regions.

Experts caution that the swift rise of AI, especially generative AI applications, may greatly boost electronic waste. Estimates reported in Scientific American suggest that generative AI could contribute 1.2 to 5 million metric tons of e-waste each year by 2030. This projection raises significant concerns for environmental researchers and organizations regarding sustainability and the ethical disposal of electronic components.

The double-edged sword of AI in the climate crisis

Artificial intelligence plays a crucial role in fighting climate change, but that promise is shadowed by a key contradiction: the technology designed to address environmental problems also contributes to them through energy-intensive processes. On the positive side, AI has made real progress on some of the most serious climate challenges. One of its most important applications is optimizing renewable energy systems: AI algorithms can balance supply and demand across the grid by forecasting fluctuations in solar and wind power generation, improving efficiency and reliability. Google’s DeepMind, for example, has used AI to boost the value of wind energy by predicting wind patterns and output in advance.

In addition to renewable energy, AI is helping improve energy efficiency in buildings. Smart AI-powered systems evaluate energy usage patterns and regulate heating, cooling, and lighting in real time, decreasing waste and emissions. Cities all over the world are using these technologies to develop more sustainable neighbourhoods.

AI also plays a crucial role in disaster prediction and mitigation. Machine learning models use massive amounts of data to predict the risk of natural disasters such as hurricanes and floods. These technologies assist governments and communities in preparing for such occurrences by giving early warnings and precise risk assessments, thereby saving lives and lowering the economic and environmental costs.

However, AI’s environmental cost cannot be ignored. The same technology that enhances renewable energy grids and improves efficiency is itself a significant energy consumer. Training and operating AI models require vast computational resources, which often depend on electricity derived from fossil fuels. This creates a paradox: we rely on energy-intensive AI to solve energy-related problems, potentially undermining its benefits.

The balance between AI’s advantages and its environmental impact is complex and context-dependent. In renewable energy and resource management, AI’s efficiency gains could offset its carbon footprint, but the energy demands of building and running these systems are growing quickly. Data centers, which are crucial for AI operations, consume around 1–2% of global electricity, and in some places, such as Ireland, they account for more than 20% of total electricity consumption. AI alone is expected to make up 19% of data center power demand by 2028. Shifting AI to renewable energy is therefore essential; otherwise, the balance between its benefits and its environmental costs could tip decisively in the wrong direction.

Building a greener future for AI

Artificial intelligence can change our world, but its environmental impact must be tackled for that growth to be sustainable. That means moving to energy-efficient data centers powered by renewable energy, developing more efficient algorithms, designing AI models that need less computational power, and implementing policies that support sustainable practices. Success will require collaboration: industries driving innovation, governments providing supportive regulation, and consumers adopting eco-friendly habits, such as minimizing unnecessary AI use.

Transitioning data centers to renewable energy sources is one of the most effective ways to lessen AI’s environmental impact. Tech giants such as Google, Amazon, and Microsoft are at the forefront, aiming to run their operations fully on renewable energy. Since 2017, Google has matched its annual electricity use with renewable energy purchases, and it plans to operate all its data centers on carbon-free energy around the clock by 2030. To get there, Google is turning to wind, solar, and geothermal power: it has partnered with Fervo Energy to develop geothermal plants, and it co-locates data centers with solar and wind farms to ensure clean energy availability. Microsoft, meanwhile, is investing in carbon-negative initiatives and renewable energy for its AI infrastructure. It plans to become carbon negative by 2030, with all its data centers running on 100% renewable energy. (As of 2023, Microsoft had reached 70% renewable energy usage.)

And finally, reducing AI’s negative effects on the environment also comes down to us individually. Teaching users about the environmental effects of their tech use is crucial for a sustainable future. Small changes, such as cutting back on virtual assistant queries or turning off energy-heavy features when not needed, can lower the demand on data centers. A single person making fewer queries to a virtual assistant may seem small, but when multiplied by millions of users, the energy savings can be significant.

Sannah Chawdhry is a journalist based in Canada. She holds a degree in journalism from Mount Royal University in Calgary.
