MIT News has taken a closer look at the environmental impact of generative AI in a two-part series. The first article dives into why this technology consumes so many resources, while the second will explore how experts are working to lessen its carbon footprint.
Generative AI has taken the world by storm. It promises everything from boosting productivity to revolutionizing scientific research. However, the rapid rise of this technology comes with serious environmental concerns that are still difficult to fully understand and address.
Training generative AI models, like OpenAI’s GPT-4, requires massive computational power, which translates into high electricity consumption. This increased energy use not only raises carbon dioxide emissions but also puts strain on our electric grid. The impact doesn’t stop with training; using and refining these models daily continues to consume large amounts of energy.
Beyond electricity, significant water resources are needed to cool the servers that power these AI systems. This can put a strain on local water supplies and harm nearby ecosystems. The demand for more powerful computing hardware further complicates matters, as producing and transporting this hardware also carries environmental consequences.
Elsa A. Olivetti, a professor at MIT, emphasizes that the environmental effects of generative AI are broader than just electricity use; they extend to system-wide consequences across the technology's life cycle, from hardware manufacturing to everyday deployment. Olivetti contributed to a 2024 paper titled “The Climate and Sustainability Implications of Generative AI,” which examines both the positive and negative effects of this technology.
One significant contributor to environmental issues is data centers, the facilities that train and run the deep learning models behind popular tools like ChatGPT and DALL-E. These centers house numerous servers and require substantial electricity to operate, and their numbers are growing along with generative AI. A generative AI training cluster can consume seven or eight times more energy than a typical computing workload, raising energy demands considerably.
For instance, the power demand of data centers in North America jumped from 2,688 megawatts at the end of 2022 to 5,341 megawatts by the end of 2023. Globally, data centers consumed around 460 terawatt-hours of electricity in 2022, making them the 11th-largest electricity consumer in the world. By 2026, consumption is projected to rise far enough to place data centers among the top five.
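The North American figures above imply that data-center power demand nearly doubled in a single year. A quick sanity check of that growth rate, using only the numbers cited in this article:

```python
# Year-over-year growth in North American data-center power demand,
# from the figures cited above (end of 2022 vs. end of 2023).
END_2022_MW = 2_688
END_2023_MW = 5_341

growth = END_2023_MW / END_2022_MW
print(f"{growth:.2f}x")  # roughly 1.99x, i.e. close to doubling in one year
```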
The rapid construction of new data centers often relies heavily on fossil fuel-based power. For example, training a model like OpenAI’s GPT-3 requires immense amounts of electricity—around 1,287 megawatt hours, which could power about 120 average U.S. homes for a year.
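The "120 homes" comparison can be checked with back-of-the-envelope arithmetic. The household figure below is an assumption: U.S. Energy Information Administration estimates put average annual household consumption in the 10,000 to 11,000 kWh range, and 10,715 kWh is a commonly cited value.

```python
# Sanity check of the GPT-3 training comparison cited above.
TRAINING_KWH = 1_287_000   # 1,287 megawatt-hours expressed in kWh
HOME_ANNUAL_KWH = 10_715   # assumed average annual U.S. household consumption

homes_powered = TRAINING_KWH / HOME_ANNUAL_KWH
print(round(homes_powered))  # about 120 homes for a year
```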
Even after training, the energy demands don’t go away. Each time someone uses a generative AI model, such as asking ChatGPT to summarize a document, the query consumes energy, an estimated five times more than a typical web search. Many users never consider these impacts; they rely on the convenience of AI tools without awareness of their environmental costs.
Olivetti notes that while the energy consumed in training AI models is often highlighted, the water usage of data centers deserves attention as well. For every kilowatt-hour a data center consumes, roughly two liters of water are needed for cooling. This strains local ecosystems and underscores that data centers, despite the “cloud computing” label, exist in the physical world with real consequences.
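Combining the two-liters-per-kWh cooling figure with the GPT-3 training energy cited earlier gives a rough sense of scale. Applying the ratio to a single training run is an illustrative assumption, not a measured value, since actual water use varies by facility and climate.

```python
# Rough cooling-water estimate: ~2 liters per kWh applied to the
# 1,287 MWh GPT-3 training figure cited earlier in the article.
LITERS_PER_KWH = 2
TRAINING_KWH = 1_287_000  # 1,287 MWh in kWh

water_liters = LITERS_PER_KWH * TRAINING_KWH
print(f"{water_liters:,} liters")  # about 2.6 million liters of cooling water
```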
The hardware inside data centers carries indirect environmental impacts of its own. Manufacturing powerful processors like GPUs is complex and energy-intensive, leaving a larger carbon footprint than that of simpler chips, and the raw materials for these components are extracted through mining processes that can harm the environment.
As demand for powerful GPUs grows, so do the ecological consequences. In 2023, the major manufacturers shipped nearly 3.85 million GPUs to data centers, a significant increase from previous years. The current growth trend is unsustainable, and experts believe a shift is necessary.
To navigate these challenges, experts advocate for responsible development of generative AI. This includes understanding the environmental and societal costs, and evaluating the perceived benefits against those impacts. We need to approach the advancement of AI thoughtfully, considering the wider implications of its rapid evolution.