AI applications such as chatbots, search engines, and translation tools are increasingly used by businesses and consumers. To make such systems possible, thousands of powerful chips operate behind the scenes, delivering enormous computing power. These chips are needed first to train AI models and then to run them repeatedly. It is especially this latter use, known as 'inference', that occurs on a massive scale every day. According to De Vries-Gao, the total energy consumption of AI is therefore significantly underestimated.
In his study, published in the scientific journal Joule, De Vries-Gao highlights a major lack of transparency. Large tech companies such as Google, Microsoft, and OpenAI provide little insight into the energy usage of their AI systems. Google reported in 2022 that AI accounted for 10 to 15 percent of its total energy consumption, but no updated figures have been released since. The European AI Act requires companies to report energy use, but only for the training phase of AI models. Actual use of the models, which accounts for the majority of their energy consumption, falls outside this obligation.
Due to the absence of hard data, De Vries-Gao analyzed the amount of electricity consumed by the AI accelerator modules behind these systems. Major chip manufacturers such as NVIDIA and AMD delivered over five million of these GPU-based AI accelerators in 2023 and 2024. Based on global chip production capacity and knowledge of the modules in use, he estimates the realistic power demand of the modules alone at between 3 and 5.2 gigawatts. For complete AI systems built around these modules, including the cooling required in data centers, total power demand could rise to 9.4 gigawatts. For comparison: that is equivalent to the entire electricity consumption of the Netherlands. With production capacity expected to double in 2025, demand could rise to 23 gigawatts, potentially making AI one of the largest energy consumers in the global digital infrastructure.
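To make the order of magnitude concrete, the sketch below shows how a back-of-envelope estimate of this kind can be put together: module count times average power draw, scaled by a utilization factor, then multiplied by a data-center overhead factor (PUE) to account for cooling and other infrastructure. The per-module wattage, utilization, and PUE values used here are illustrative assumptions for this sketch, not figures taken from the paper.

```python
# Back-of-envelope sketch of an AI power-demand estimate.
# All parameter values below are illustrative assumptions, not figures from the study.

def ai_power_demand_gw(modules: float,
                       watts_per_module: float,
                       utilization: float,
                       pue: float) -> tuple[float, float]:
    """Return (module power, total facility power) in gigawatts.

    modules          -- number of AI accelerator modules in operation (assumed)
    watts_per_module -- average power draw per module in watts (assumed)
    utilization      -- fraction of time the modules run at that draw (assumed)
    pue              -- power usage effectiveness: facility power / IT power (assumed)
    """
    module_gw = modules * watts_per_module * utilization / 1e9
    facility_gw = module_gw * pue  # adds cooling and other data-center overhead
    return module_gw, facility_gw

if __name__ == "__main__":
    # Illustrative inputs: 5 million modules at roughly 1,000 W each, running 80% of
    # the time, in data centers with a PUE of 1.3. None of these come from the paper.
    module_gw, facility_gw = ai_power_demand_gw(5e6, 1_000, 0.8, 1.3)
    print(f"Modules alone: {module_gw:.1f} GW, with facility overhead: {facility_gw:.1f} GW")
```

Because each of these inputs is uncertain, small changes in the assumed wattage, utilization, or overhead shift the result considerably, which is one reason the study reports a range rather than a single number.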
Without better regulation or mandatory reporting, it remains nearly impossible to get a clear picture of AI's actual energy consumption. According to De Vries-Gao, the rapid growth of this technology conflicts with broader societal goals, such as achieving climate targets and reducing overall energy use. He therefore advocates greater transparency, enabling governments to create effective policies that align AI development with sustainability. Without it, AI risks becoming an invisible, uncontrolled source of energy consumption and CO₂ emissions.