Marissa Loewen began using artificial intelligence (AI) back in 2014 to help manage her tasks. As someone with autism and ADHD, she found it incredibly useful for organizing her thoughts. However, she’s also aware of the environmental impact. “We try to use it mindfully because we know it affects the planet,” she shared.
AI isn’t just a personal tool anymore; it’s woven into daily life through smartphones, search engines, and email. Every interaction carries a hidden cost, chiefly the energy it consumes, much of which still comes from fossil fuels and releases greenhouse gases that contribute to climate change.
As more people rely on AI, demand for data centers grows. These facilities run around the clock to process requests and store information, and many still draw power from fossil fuels, making the shift toward renewable energy harder. Noman Bashir, a fellow at MIT’s Climate and Sustainability Consortium, explains, “We’re building them too quickly to integrate renewable energy,” which pushes carbon emissions higher.
Data centers also generate heat, requiring vast amounts of water for cooling. According to the Environmental and Energy Study Institute, larger centers can use up to 5 million gallons of water daily—enough for a town of about 50,000 people.
Expert analyses shed light on AI’s hidden costs. Sasha Luccioni, AI and Climate Lead at Hugging Face, found that generating a high-definition image with AI consumes about as much energy as charging a smartphone halfway. That surprises many users, who assume a single request costs almost nothing.
Jon Ippolito, a new media professor at the University of Maine, points out that tech companies are making their systems more efficient. But efficiency doesn’t necessarily mean a smaller environmental impact. He cites the Jevons Paradox: as efficiency makes a resource cheaper to use, overall consumption tends to rise.
Exactly how much energy does AI consume? It varies with several factors, including the temperature around the data center and how clean the local grid is. Ippolito built an app to estimate the energy costs of different digital tasks. Asking an AI a simple question, for example, can use 23 times more energy than a basic search engine query. For complex prompts the gap widens dramatically: generating a short video can use up to 15,000 times more.
Despite AI’s growing footprint, traditional digital activities also consume significant energy. Watching an hour of Netflix uses more energy than a complex AI task, which suggests we should rethink our digital habits as a whole, not just our use of AI.
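To put those multipliers in rough perspective, here is a back-of-the-envelope sketch. The 0.3 watt-hour baseline for a conventional search is an assumption chosen purely for illustration, not a figure from Ippolito’s app or this article; only the 23x and 15,000x multipliers come from the comparisons above.

```python
# Rough, illustrative energy estimates for the comparisons above.
# ASSUMPTION: a conventional search uses about 0.3 Wh (an illustrative
# ballpark, not a figure from the article); the multipliers are the
# ones quoted in the text.
SEARCH_WH = 0.3  # assumed watt-hours per basic search query

multipliers = {
    "basic search engine query": 1,
    "simple AI question": 23,
    "short AI-generated video": 15_000,
}

for task, factor in multipliers.items():
    watt_hours = SEARCH_WH * factor
    print(f"{task}: ~{watt_hours:,.1f} Wh")

# Under these assumptions, a simple AI question is on the order of 7 Wh,
# and a short AI-generated video is roughly 4,500 Wh (4.5 kWh).
```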
To lessen the environmental impact, Ippolito recommends limiting AI use: choosing ordinary images over AI-generated ones and switching off AI features when they aren’t needed. Loewen, for her part, consolidates her questions into a single prompt to save energy, and she has built her own AI tools to reduce her reliance on large data centers.
Others, like Luccioni, prefer search engines such as Ecosia, which puts its profits toward planting trees, offsetting some of the carbon footprint of searches. ChatGPT also offers settings under which data is deleted after a few weeks, saving space on servers.
Interestingly, Ippolito notes that AI accounts for only about 15% of the energy consumed by data centers; the majority goes to activities such as social media and cryptocurrency mining. Cutting back time on platforms like TikTok can meaningfully shrink a personal digital footprint.
In short, while AI can enhance our lives, it’s worth being mindful of its environmental consequences. Small changes in how we use technology, AI included, can collectively reduce our impact on the planet.