Unveiling the Hidden Climate Cost of Everyday AI: What You Need to Know

EDMONTON, Alberta — Marissa Loewen started using artificial intelligence (AI) in 2014 to help manage her projects. Living with autism and ADHD, she found it incredibly useful for organizing her thoughts and tasks. “We try to use it responsibly because we know it impacts the environment,” she shared.

Today, AI is everywhere: in our smartphones, search engines, and even our email. Every time we use it, however, we draw on electricity that is still generated largely from fossil fuels, adding to climate change.

Data centers, which power AI applications, demand immense energy to operate. As their numbers grow, so do the energy needs. Noman Bashir, a climate expert at MIT, noted, “Many new data centers depend on fossil fuels because we can’t integrate renewables fast enough.”

These data centers also need water for cooling. Larger centers consume about 5 million gallons daily on average, enough to supply a town of 50,000 people. The impact isn't obvious to casual users. Sasha Luccioni, of the AI company Hugging Face, explained that generating a single high-definition image can use as much energy as charging a phone, a fact that surprises many people.

Jon Ippolito, a professor at the University of Maine, points out that even as tech companies improve efficiency, AI's environmental footprint may not shrink, thanks to a phenomenon known as the Jevons paradox. "As resources become cheaper, we just use them more," he said. When cars replaced horses, for example, travel distances increased rather than decreased.

Understanding how much AI contributes to global warming is complicated. Ippolito created an app to estimate the environmental impact of various digital tasks. He found that a simple AI query, such as asking for the capital of France, uses 23 times more energy than a traditional Google search without AI features. Complex queries widen the gap: a quirky question about gummy bears used 210 times more energy.

Interestingly, streaming services aren’t innocent either. Watching an hour of Netflix typically consumes more energy than running a complex AI prompt.

Ippolito suggests being mindful of AI's footprint and of digital habits generally. He limits AI use when possible, prefers human-captured images, and runs Google searches with AI features turned off when they aren't needed. Loewen follows a similar strategy, organizing her queries to minimize back-and-forth follow-ups. She even built her own energy-efficient AI model that runs locally, reducing her dependence on data centers.

Luccioni uses Ecosia, a search engine that plants trees using profits and allows users to turn off its AI features. ChatGPT provides a handy option to delete queries after a few weeks, reducing data storage.

While AI does consume energy, Ippolito reminds us that a significant portion of data center energy is used for data collection on social media platforms. He advises limiting time on apps like TikTok and Instagram to lessen our digital energy footprint.

“In any way you can cut out data centers, that’s a win,” he concluded. Making small changes can lead to a larger positive impact on our environment.


