Study Claims AI Is An Energy Hog Consuming As Much Electricity As A Country

A researcher out of Amsterdam asserts in a recent study that our love of and growing dependence on all things AI (artificial intelligence) could draw enough electricity to power a country the size of Sweden in less than five years.

Alex de Vries of the VU Amsterdam School of Business and Economics claims in his peer-reviewed paper that the energy impact of large language models (LLMs) begins even before they are publicly deployed. LLMs such as GPT-3, Gopher, and Open Pre-trained Transformer (OPT), to name a few, are reported to have consumed 1,287, 1,066, and 324 MWh respectively during their training phases. Training a model of this class typically involves terabytes of data and on the order of 175 billion parameters.
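To put those training totals in perspective, here is a minimal Python sketch that converts each reported figure into rough household-equivalents. The training values are the ones cited above; the per-household consumption figure (about 10,000 kWh per year) is an illustrative assumption, not a number from the study.

```python
# Rough scale comparison for reported LLM training energy.
# Training figures (MWh) are the values cited in the article; the
# household figure (~10,000 kWh/yr) is an illustrative assumption.

TRAINING_ENERGY_MWH = {
    "GPT-3": 1_287,
    "Gopher": 1_066,
    "OPT": 324,
}

HOUSEHOLD_KWH_PER_YEAR = 10_000  # assumed average annual household usage

for model, mwh in TRAINING_ENERGY_MWH.items():
    kwh = mwh * 1_000                            # 1 MWh = 1,000 kWh
    households = kwh / HOUSEHOLD_KWH_PER_YEAR
    print(f"{model}: {mwh:,} MWh ~ a year of electricity for ~{households:,.0f} households")
```

On those assumptions, GPT-3's reported training run alone works out to roughly the annual electricity use of about 130 households.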

Once an LLM is deployed for public use, it enters what is called the inference phase, where the model generates outputs in response to user prompts, and it is here that De Vries reveals some humbling yet unsurprising power consumption figures. De Vries, who is also the founder of research company Digiconomist, estimates that AI servers could collectively use between 85 and 134 terawatt-hours (TWh) per year by 2027. In four years, those servers would be drawing roughly as much power annually as countries such as Argentina, the Netherlands, or Sweden.
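One way a range like 85 to 134 TWh can arise is from a simple fleet-level back-of-envelope calculation: number of AI servers, times per-server power draw, times hours in a year. The short sketch below illustrates that arithmetic; the fleet size (1.5 million servers) and the per-server power figures (6.5 to 10.2 kW) are assumptions chosen for illustration and are not quoted in the article.

```python
# Back-of-envelope estimate of annual AI server electricity use.
# Assumed inputs (illustrative, not from the article): a fleet of
# 1.5 million AI servers running around the clock, each drawing
# between 6.5 kW and 10.2 kW.

HOURS_PER_YEAR = 8_760
FLEET_SIZE = 1_500_000          # assumed number of deployed AI servers

def annual_twh(power_kw_per_server: float) -> float:
    """Fleet-wide annual consumption in terawatt-hours."""
    kwh = FLEET_SIZE * power_kw_per_server * HOURS_PER_YEAR
    return kwh / 1e9            # 1 TWh = 1e9 kWh

low, high = annual_twh(6.5), annual_twh(10.2)
print(f"Estimated range: {low:.0f} to {high:.0f} TWh per year")
# Prints roughly: Estimated range: 85 to 134 TWh per year
```

The point of the exercise is that country-scale totals fall out of quite ordinary assumptions about how many servers ship and how hard they run.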

With generative AI and LLMs trending toward a possibly permanent place in society, corporations will continue to build more data centers filled with power-hungry AI chips, increasing AI's energy and environmental footprint. Beyond the power issue, the sudden surge in demand for generative AI has sent the global supply chain into a frenzy, driven in part by the necessary but destructive mining of minerals used to manufacture more AI chips, such as NVIDIA's A100 and H100 GPUs.

De Vries ultimately stresses the need for users, tech companies, and AI software/hardware developers to take steps to rein in this potentially massive power dependency. The solution could lie in a combination of more efficient AI-training methods, optimized hardware, repurposed crypto-mining GPUs, and greater reliance on renewable energy sources.