AI is booming, but it carries an increasingly worrying energy problem.
The boom has a little-discussed cost: AI consumes so much electricity that many experts warn that, without a solution, the industry could become one of the main drivers of global energy-resource depletion.
The Energy Cost of an AI Model
Chatbots like ChatGPT, Gemini, and Claude may seem like software applications, but they require a network of supercomputers around the world to operate. Every time a user asks a question, millions of calculations are performed in data centers, consuming huge amounts of electricity.
A study by MIT Technology Review found that training a large AI model can consume more energy than the average small city uses in a year. Training GPT-4, OpenAI's flagship model, for example, reportedly consumed as much electricity as 175,000 US homes use in a day.
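As a rough sanity check on what such a comparison means in energy terms, assume (this figure is not from the article) that an average US household draws about 30 kWh per day, roughly in line with EIA averages:

```python
# Back-of-envelope check: what does "175,000 US homes in a day" translate to?
# Assumption (not from the article): an average US home uses ~30 kWh/day.
homes = 175_000
kwh_per_home_per_day = 30  # assumed average daily household consumption

total_kwh = homes * kwh_per_home_per_day
total_gwh = total_kwh / 1_000_000  # 1 GWh = 1,000,000 kWh

print(f"{total_gwh:.2f} GWh")  # prints "5.25 GWh"
```

Under that assumption, the claim works out to several gigawatt-hours for a single training run, which is the right order of magnitude for published estimates of frontier-model training.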
The situation is not confined to the US. In the UK, a report in The Guardian suggested that if the AI industry keeps growing at its current rate, it could account for up to 10% of total global electricity consumption by 2030.
At a Stanford conference, a researcher once presented a startling estimate: if OpenAI scaled its systems to serve the entire world's population, ChatGPT alone would require enough electricity to power a medium-sized country like Argentina.
Data centers run as hot as "furnaces"
To put this in perspective, consider the data centers where all this AI computing happens. These massive buildings house hundreds of thousands of servers running around the clock and generating enormous amounts of heat.
At one of the largest data centers in the United States, in Phoenix, Arizona, engineer Michael Green recalls a serious incident: "One day, the cooling system failed. In less than an hour, the server-room temperature skyrocketed to over 50°C. Had it not been resolved promptly, millions of dollars' worth of data and equipment could have been destroyed."
It's not just heat that matters; water is also a factor. AI data centers consume huge amounts of water for cooling, according to a report in The New York Times. A large Google data center in Iowa can use up to 1 billion liters of water per year, equivalent to the water needs of a city of 10,000 people.
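A quick calculation shows the comparison is internally consistent (the per-capita figure below is derived from the article's own numbers, not an independent statistic):

```python
# Sanity check: 1 billion liters/year spread over a city of 10,000 people.
liters_per_year = 1_000_000_000
people = 10_000

liters_per_person_per_day = liters_per_year / people / 365
print(f"{liters_per_person_per_day:.0f} L per person per day")  # prints "274 L per person per day"
```

Roughly 274 liters per person per day is in the ballpark of typical per-capita residential water use in the US, so the "city of 10,000" comparison holds up.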
In many parts of the world, this is causing conflicts over resources. In the Netherlands, the government was forced to intervene after locals protested that a Microsoft data center was drawing water from the public supply to cool its systems.
How do tech companies respond?
Aware of the problem, tech giants are looking for solutions.
Google has pledged to run its data centers on carbon-free energy around the clock by 2030. Amazon is investing billions in wind and solar farms. Microsoft has even tested underwater data centers, where naturally cold seawater reduces the need for cooling.
But are these efforts enough to offset AI’s ever-increasing power consumption?
“Even if we shift to renewable energy, AI will still consume more resources than we can regenerate in the short term,” said Dr. Kate Crawford, an AI expert at the University of Southern California. “It’s not just electricity, it’s water, precious metals, and semiconductors.”
Can AI save itself?
An interesting question arises: Can AI help reduce its own energy consumption?
The answer may be yes. Scientists are developing more efficient AI models that require less energy to run. Emerging approaches, from quantum computing research to dedicated AI chips (such as Google's TPU line or the Neural Engine in Apple's M-series), are helping to reduce power consumption.
A prime example is DeepMind, Google's AI lab, which used AI to optimize data center cooling systems. The result was a 40% reduction in the energy used for cooling, a significant improvement.
However, Dr. Geoffrey Hinton, one of the "fathers" of modern AI, warns: "Improving efficiency is important, but if demand for AI continues to explode as it is now, a 40% savings will not be enough to solve the problem in the long term."
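Hinton's caveat can be made concrete with a rough power usage effectiveness (PUE) calculation. The figures below are illustrative assumptions, not reported numbers: because cooling is only part of a facility's overhead, a 40% cut in cooling energy trims total facility energy by far less than 40%.

```python
# Illustrative (assumed) figures: how much total energy does a 40% cooling cut save?
it_load = 1.0          # IT equipment energy, normalized
cooling = 0.4          # assumed cooling overhead, as a fraction of IT load
other_overhead = 0.1   # assumed non-cooling overhead (power distribution, lighting)

total_before = it_load + cooling + other_overhead              # PUE = 1.50
total_after = it_load + cooling * (1 - 0.40) + other_overhead  # PUE = 1.34

savings = 1 - total_after / total_before
print(f"PUE: {total_before:.2f} -> {total_after:.2f}, total savings: {savings:.1%}")
# prints "PUE: 1.50 -> 1.34, total savings: 10.7%"
```

Under these assumptions, the headline 40% cooling reduction amounts to roughly an 11% cut in total facility energy, which is why efficiency gains alone struggle to keep pace with exploding demand.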
How far will AI develop?
When John, a tech engineer, finished his shift, he closed his laptop and looked out the window. The city was still bright with lights, the streets were still full of cars. He wondered: if AI is really the future of humanity, will it push us into a new energy crisis?
The AI revolution is still unfolding, and the question of how to meet its energy demands remains open. One thing is certain: we are entering an era in which technology will not only change our lives but may also change the way the world is powered.
AI may be the smartest machine ever created by humans, but without control over the energy it consumes, is it really a solution, or just a new problem?