
How Much Electricity Will AI Need By 2030?

The International Energy Agency (IEA) has predicted that the electricity used by data centres worldwide could more than double by 2030, driven largely by artificial intelligence. The IEA’s report says demand could reach around 945 terawatt-hours (TWh) by the end of the decade, slightly more than Japan uses in a year. In the United States, data centres may use more power than all heavy industries combined, including steel and cement.

Electricity use is rising because AI systems rely heavily on data centres to operate. These centres house powerful servers that train and run large AI models. The United States and China are investing heavily in this space, with both countries racing to keep up with the demands of AI.

The IEA’s report adds that data centres could account for more than 20% of electricity demand growth in advanced economies. In Japan, that figure could exceed 50%, and in Malaysia it could reach 20%. These increases mark a sharp change from recent years, when electricity use in many countries had flattened or even declined.

What Are Companies Doing To Avoid Using More Energy?

Some tech companies and researchers are trying to make AI less wasteful. Mosharaf Chowdhury from the University of Michigan said there are ways to reduce electricity use without slowing down computing. His lab created software that calculates how much energy each chip needs, cutting energy use by up to 30%.

Better cooling systems are also helping. In the past, cooling a data centre could take as much energy as running the servers themselves; today, cooling accounts for closer to 10% of the servers’ energy use, said Gareth Williams from Arup, an engineering consultancy. The change is thanks to more precise temperature controls, often themselves powered by AI.

Liquid cooling is another method that has caught attention. Instead of using air conditioning, it circulates a coolant directly through the hardware.

AWS, Amazon’s cloud computing division, recently shared that it had developed its own liquid cooling system for Nvidia chips, allowing it to avoid rebuilding existing data centres, as AWS’s Dave Brown explained in a company video.

Are Newer Chips Helping?

New computer chips tend to be more energy-efficient than older ones, which McKinsey’s Pankaj Sachdeva sees as a reason to be cautiously optimistic. But there are limits, of course.

Yi Ding from Purdue University said that although modern chips can last longer and still perform well, companies are unlikely to encourage people to delay upgrades. That would hurt their business models.

Even with more efficient chips, total electricity use will continue to climb. Ding explained that AI is spreading too fast for these savings to keep up. Though each chip might use less energy, the number of chips being used is growing quickly.

In January, Chinese company DeepSeek unveiled an AI model that used less powerful chips but still performed as well as leading US systems. The team achieved this through more efficient programming and by skipping a power-hungry step in training. The result has raised concerns in the US about China gaining an edge in energy access and efficiency.

Can AI Help With Energy Problems Instead Of Making Them Worse?

Even though AI drives up electricity demand, it also plays a big part in managing the energy system itself. The IEA’s report found that energy companies are now using AI tools to predict demand, manage supply, and protect against cyberattacks. Attacks on energy infrastructure have tripled in the last four years, and AI is becoming an important defence against them.

There are also hopes that AI could speed up work on clean energy. The report says that AI is helping scientists improve technologies like batteries and solar panels. These tools could eventually help manage the emissions caused by running large AI models.
