In part two of our series on the environmental impact of generative artificial intelligence (AI), we focus on how experts are working to lower this technology’s carbon footprint.
Generative AI’s energy consumption is set to rise sharply over the next decade. A report from the International Energy Agency (IEA) predicts that global electricity demand from data centers will more than double by 2030, reaching around 945 terawatt-hours, slightly more than Japan’s total electricity consumption today. Goldman Sachs estimates that fossil fuels will meet about 60 percent of this rising demand, potentially increasing global carbon emissions by about 220 million tons. For comparison, driving a gas-powered car for 5,000 miles produces about one ton of carbon dioxide.
While these numbers are alarming, researchers at MIT and worldwide are actively exploring ways to reduce AI’s carbon footprint. They consider both “operational carbon” (emissions from running data centers) and “embodied carbon” (emissions from building them). According to Vijay Gadepally, a senior scientist at MIT Lincoln Laboratory, operational carbon gets most of the attention, while the emissions tied to constructing data centers are often overlooked. Building these facilities requires large amounts of steel and concrete, both of which carry significant carbon footprints. Companies like Meta and Google are even exploring more sustainable building materials to mitigate these emissions.
Data centers are massive. The largest in the world, for instance, spans around 10 million square feet, and data centers in general have power densities far greater than those of typical office buildings. “The operational side is only part of the story,” Gadepally says, emphasizing the need to address embodied carbon as well.
To lower operational emissions, Gadepally draws a parallel with energy-saving tips at home: simple actions add up. Just as turning off unneeded lights saves electricity in a house, capping the power drawn by GPUs in a data center can cut energy use significantly with minimal impact on performance.
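In practice, such power caps can be applied through NVIDIA’s management library or the equivalent nvidia-smi command. Here is a minimal sketch using the pynvml Python bindings; the 80 percent figure is an illustrative assumption, not a number from the research, and changing limits typically requires administrator privileges.

```python
# Minimal sketch: cap a GPU's power draw via NVML (all values in milliwatts).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Query the allowed power-limit range and the card's factory default.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)

# Cap at 80% of the default (illustrative), clamped to the allowed range.
target_mw = max(min_mw, int(default_mw * 0.8))
pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs root

print(f"Power limit: {target_mw / 1000:.0f} W (default {default_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()
```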
Research shows that a modest reduction in GPU power can keep throughput high while easing cooling demands. Switching to less energy-intensive hardware can also yield substantial savings. For example, studies estimate that demanding AI training jobs could get by with fewer GPUs by computing at reduced numerical precision, with little loss of accuracy.
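As an illustration of the reduced-precision idea (a general sketch, not the specific setup from these studies), PyTorch’s automatic mixed precision runs most of the arithmetic in 16-bit floats while keeping a 32-bit safety net for numerically sensitive steps:

```python
# Minimal sketch: mixed-precision training with PyTorch AMP.
# The model, data, and hyperparameters are placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scaler = torch.amp.GradScaler("cuda")  # rescales gradients to avoid fp16 underflow
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    x = torch.randn(64, 512, device="cuda")        # stand-in for a real batch
    y = torch.randint(0, 10, (64,), device="cuda")
    optimizer.zero_grad(set_to_none=True)
    with torch.amp.autocast("cuda"):               # forward pass in 16-bit floats
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

The same arithmetic at half the precision means less energy per operation and more work per GPU, which is where the “fewer GPUs” savings come from.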
Recent innovations also focus on stopping AI training early. Roughly half of the electricity spent training a model can go toward eking out the last few percentage points of accuracy. Stopping earlier can save substantial energy, particularly for applications where those final gains aren’t critical.
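A sketch of what early stopping looks like in code, where the training and evaluation functions are hypothetical stand-ins for a real pipeline:

```python
# Minimal sketch: stop training once validation accuracy stops improving
# meaningfully, skipping the energy-hungry final epochs.
def train_with_early_stopping(model, max_epochs=100, patience=3, min_delta=0.002):
    best_acc = 0.0
    epochs_without_gain = 0
    for epoch in range(max_epochs):
        train_one_epoch(model)          # hypothetical: one pass over the data
        acc = evaluate(model)           # hypothetical: validation accuracy
        if acc - best_acc > min_delta:  # count only "meaningful" improvements
            best_acc = acc
            epochs_without_gain = 0
        else:
            epochs_without_gain += 1
        if epochs_without_gain >= patience:
            print(f"Stopping at epoch {epoch}: best accuracy {best_acc:.1%}")
            break
    return model
```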
Innovation in hardware is another key lever. Although efficiency gains in general-purpose chips have slowed for over a decade, GPUs are still improving in energy efficiency by an estimated 50 to 60 percent per year. Neil Thompson of MIT points out that this trend allows performance to keep rising while holding energy use down. His research also shows that improvements in model architecture and algorithms roughly double effective efficiency every eight to nine months. He calls these algorithmic savings “negaflops”: computing operations that, thanks to better algorithms, no longer need to be performed at all.
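One well-known source of negaflops is pruning: removing weights that contribute little, so the corresponding multiplications never need to happen. Here is a minimal sketch using PyTorch’s built-in pruning utilities; the 30 percent sparsity level is an illustrative assumption.

```python
# Minimal sketch: prune the smallest 30% of a layer's weights by magnitude.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(1024, 1024)
prune.l1_unstructured(layer, name="weight", amount=0.3)  # zero smallest 30%
prune.remove(layer, "weight")                            # make it permanent

sparsity = (layer.weight == 0).float().mean().item()
print(f"{sparsity:.0%} of weights are now zero")
```

Note that turning zeroed weights into real energy savings requires sparse-aware kernels or structured pruning; a dense matrix multiply will still compute the zeros.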
To maximize energy savings, experts highlight that not all electricity is created equal. The carbon emissions from one kilowatt-hour vary with the time of day and the season, depending on what is generating power on the grid at that moment. By scheduling flexible AI workloads for times when renewable energy is plentiful, data centers can substantially lower their carbon footprints.
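A minimal sketch of that scheduling idea, where `get_carbon_intensity` is a hypothetical stand-in for a real grid-data source and the threshold is illustrative:

```python
# Minimal sketch: delay a flexible job until the grid is comparatively clean.
import time

def get_carbon_intensity() -> float:
    """Hypothetical: current grid carbon intensity in gCO2 per kWh."""
    raise NotImplementedError("wire up a real grid-data source here")

def run_when_grid_is_clean(job, threshold=200.0, poll_seconds=900):
    while get_carbon_intensity() > threshold:
        time.sleep(poll_seconds)  # re-check every 15 minutes
    job()  # renewables are plentiful: start the workload now
```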
MIT’s Deepjyoti Deka points to energy storage as a potential game-changer. Battery systems could let data centers bank renewable energy when it is abundant and draw on it during periods of peak demand, reducing reliance on fossil-fuel generation and backup generators.
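In spirit, the dispatch logic is simple; here is a toy sketch with illustrative numbers:

```python
# Toy sketch: charge the battery on renewable surplus, discharge on deficit,
# and fall back to the grid only for whatever remains.
def dispatch(renewable_kw, load_kw, battery_kwh, capacity_kwh, hours=1.0):
    """Return (new battery level in kWh, average kW drawn from the grid)."""
    surplus_kw = renewable_kw - load_kw
    if surplus_kw >= 0:
        # Store the surplus, up to the battery's remaining capacity.
        battery_kwh = min(capacity_kwh, battery_kwh + surplus_kw * hours)
        return battery_kwh, 0.0
    deficit_kwh = -surplus_kw * hours
    from_battery = min(battery_kwh, deficit_kwh)
    return battery_kwh - from_battery, (deficit_kwh - from_battery) / hours

# One sunny hour followed by one cloudy hour:
level, grid = dispatch(renewable_kw=800, load_kw=500, battery_kwh=0, capacity_kwh=1000)
level, grid = dispatch(renewable_kw=100, load_kw=500, battery_kwh=level, capacity_kwh=1000)
print(f"battery: {level:.0f} kWh, grid draw: {grid:.0f} kW")
```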
Researchers are also examining how AI itself can speed the connection of new renewable energy systems to the grid, an approval process that is often lengthy. Accelerating it would help clean-energy deployment keep pace with AI’s rapid growth.
Current efforts are not just about refinement; they take the long view as well. A collaborative project between MIT and Princeton, for instance, aims to guide companies toward locations for new data centers that minimize environmental impact.
At the end of the day, the goal is clear: we must innovate quickly to ensure that AI technologies become less carbon-intensive. As Jennifer Turliuk from MIT puts it, “Every day counts. This is a once-in-a-lifetime opportunity to innovate and make a difference.”
For more insights into the relationship between AI and renewable energy, check out the Net Climate Impact Score framework, which considers the emissions and other environmental costs of AI projects.
By addressing these challenges head-on, we can pave the way for a more sustainable future, minimizing the detrimental effects of generative AI.

