
New report reveals concerning details about Elon Musk's newest AI tech — here's what you need to know

Policymakers are weighing new rules.


The latest version of Elon Musk's Grok AI model is pushing the limits of energy and water usage for large language models, according to a recent piece by Dagens.com. Training the system reportedly required nearly $500 million in resources and consumed enough electricity and water to power a small city, raising new concerns about the ongoing environmental toll of next-generation AI. 

What's happening?

Developed by Musk's company xAI, Grok 4 was trained on its Memphis-based Colossus supercomputer, which ran nonstop for months, according to the report. The effort consumed about 310 gigawatt-hours of electricity, enough to power around 4,000 homes for a year, and released 154,000 tons of carbon pollution, roughly what a commercial jet emits over three years of flights. 

Cooling the data centers also required 754 million liters of water, enough to fill around 300 Olympic-sized swimming pools. 
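As a quick sanity check on the water figure, the reported volume lines up with the standard Olympic pool size (50 m × 25 m × 2 m, or about 2.5 million liters per pool):

```python
# Sanity-check the reported cooling-water figure against the
# standard Olympic pool volume: 50m x 25m x 2m = 2,500 m^3.
OLYMPIC_POOL_LITERS = 2_500_000
cooling_water_liters = 754_000_000  # figure reported for Grok 4 training

pools = cooling_water_liters / OLYMPIC_POOL_LITERS
print(f"{pools:.0f} Olympic-sized pools")  # ~302, matching the "around 300" cited
```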

Total training costs came in around $490 million, underscoring the escalating financial and environmental expense of AI. 

xAI later released Grok 4 Fast, a streamlined version that uses 40% fewer "thinking tokens" and costs 98% less to operate, an effort to reduce both the monetary and environmental impact. 

Why is this resource usage important?

These staggering numbers show how artificial intelligence has become a major — and often hidden — driver of global energy demand. 


Training LLMs requires massive amounts of electricity, much of it coming from dirty fuel-powered grids, which directly contributes to planet-warming pollution. Data centers also draw heavily on local water supplies to stay cool, straining regions already vulnerable to drought. 

At the same time, AI's connection to the environment can be complex. The same technology that's consuming enormous resources can also be part of the solution — AI can optimize renewable energy grids and accelerate sustainability breakthroughs. But without oversight, its growth could outpace global decarbonization efforts. 

What's being done about it?

Google, Microsoft, and OpenAI have all begun investing more in renewable-powered data centers and designing more efficient models. xAI's release of Grok 4 Fast has also shown that energy reductions are possible when optimization is a priority. 

Policymakers continue weighing new transparency rules that would require AI developers to disclose their energy and water use. 

Ultimately, building a cleaner digital future means making AI smarter, including smarter about how lightly it treads on the planet. 


