People reflected in a hotel window at the Davos Promenade in Switzerland on January 15. Photo: AP
Opinion
Inside Out
by David Dodwell

Let’s talk about how to power AI, the new energy-guzzling industry

  • It’s not just how we can produce and afford the electricity, but also how we ensure we don’t aggravate global warming and squeeze domestic water supplies

For all the excitement focused on the artificial intelligence (AI) revolution, I can’t help but focus on its precarious foundations.

I would like to buy into the excitement over all the good things AI may bring – from enhanced productivity and efficiency, to previously unimaginable medical treatments, safer traffic management, augmented food production and new effective ways of reducing carbon dioxide emissions.
But I am whipsawed by a troubling list of vulnerabilities, from Nvidia supplying 95 per cent of the graphics processing units (GPUs) needed to power AI learning machines, to the near monopolies of ASML in advanced chipmaking machines and Taiwan Semiconductor Manufacturing Company (TSMC) in advanced chip production.
Put on one side the tripwires in the deepening US-China geopolitical conflict, the challenges of agreeing on regulations over such powerful new technologies, and the quirky “hallucinations” produced by tools like ChatGPT. Also put on one side the arguments on whether the new technologies will end up creating or destroying more jobs, or whether AI will widen inequalities between the rich and poor parts of the world.
The worry I am wrestling with has been much less discussed: AI’s voracious appetite for electricity. Not just how we can produce and afford it, but also how we ensure we don’t aggravate global warming and squeeze domestic water supplies as we build the facilities that power the AI revolution.
As OpenAI founder Sam Altman said in January: “We still don’t appreciate the energy needs of this technology.” All we know is that machine learning is extremely energy-intensive. According to The Verge, the training of a large language model like GPT-3 is estimated to require 1,300 megawatt-hours – enough to power 130 US homes for a year or stream 1.6 million hours of Netflix. In 2022, Google said machine learning accounted for 15 per cent of its energy bill over the previous three years.

Chinese AI-generated cartoon series broadcast on state television
According to The Verge, and despite the Google disclosure, the main reason we don’t appreciate the electricity problem is simple: “The organisations best placed to produce [the electricity] bill – companies like Meta, Microsoft and OpenAI – simply aren’t sharing the relevant information.”

Dutch academic Alex de Vries is a bit more specific. Basing his calculations on Nvidia’s projected sales of the GPUs that power AI servers, he concludes that the AI sector could be consuming 85-134 terawatt-hours (TWh) of electricity a year by 2027 – equivalent to the entire energy consumption of the Netherlands and about 0.5 per cent of global energy demand.

Calling it “a pretty significant number”, de Vries noted it was “enough foundation to give a bit of a warning”. Researcher Sasha Luccioni from AI company Hugging Face shares de Vries’ concerns: “The generative AI revolution comes with a planetary cost that is completely unknown to us.”


Well, not quite “completely”. The International Energy Agency (IEA), focusing on the worldwide explosion of the data centres needed to feed the AI revolution, says the more than 8,000 centres globally (33 per cent in the US, 16 per cent in the EU and 10 per cent in China) consumed 460TWh of energy in 2022, and that could rise to more than 1,000TWh by 2026, matching the annual consumption in Germany.
While a typical cloud data centre covers around 100,000 sq ft, there has been an explosion of “hyperscale” data centres as large as 1-2 million sq ft each, capable of supporting large-scale users such as Amazon with its store interfaces, Apple with its TV video services, and cryptocurrency miners.

These data centres not only have an insatiable appetite for electricity but also need huge quantities of water to keep their processors cool. For example, Google, which has data centre facilities in The Dalles in Oregon, consumes 29 per cent of the city’s total water supplies.

The IEA, which expects global electricity demand to rise from 28,000TWh last year to 30,000TWh in 2026, sees data centres (for cryptocurrency mining as well as for AI), electric vehicles and heat pumps as the main future growth drivers of electricity demand.

On the climate change front, it’s not all doom and gloom

The good news is the IEA expects much of the new demand to be met by green, low-emission sources, and that the “hyperscale” data centres will use power much more efficiently than traditional data centres.

But calculating the real future dollar cost of providing enough electricity and then building the AI infrastructure that so many are dreaming about remains frustratingly tough. De Vries notes that if Google were to turn its search engine into something like ChatGPT, its energy use would soar to a level equivalent to power demand in Ireland. But, he said, “it’s not going to happen like that because Google would also have to invest US$100 billion in hardware to make that possible”.

The IEA says power for data centres, cryptocurrencies and AI currently accounts for around 2 per cent of global power demand. If the AI developments of the coming decade were to lift this by a further 2 per cent, my back-of-the-envelope calculations suggest the world’s electricity bills would rise by more than half a trillion dollars.
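The arithmetic behind that back-of-the-envelope figure can be sketched as follows. The 30,000TWh global demand figure comes from the IEA projection cited above; the US$0.10 per kWh blended average electricity price and the decade-long horizon are my own assumptions, not the column’s:

```python
# Back-of-the-envelope check of the half-a-trillion-dollar estimate.
# Assumptions (mine, not the IEA's): an average blended electricity
# price of US$0.10/kWh, and the extra 2% of demand sustained for a decade.
GLOBAL_DEMAND_TWH = 30_000        # IEA projected global demand, 2026
EXTRA_SHARE = 0.02                # a further 2 percentage points from AI
PRICE_PER_KWH_USD = 0.10          # assumed blended average price
KWH_PER_TWH = 1e9                 # 1 TWh = 1 billion kWh

extra_twh_per_year = GLOBAL_DEMAND_TWH * EXTRA_SHARE              # 600 TWh/yr
cost_per_year = extra_twh_per_year * KWH_PER_TWH * PRICE_PER_KWH_USD
cost_per_decade = cost_per_year * 10

print(f"Extra demand: {extra_twh_per_year:.0f} TWh per year")
print(f"Annual cost: ${cost_per_year / 1e9:.0f} billion")
print(f"Decade cost: ${cost_per_decade / 1e12:.2f} trillion")
```

Under those assumptions, the extra 600TWh a year costs about US$60 billion annually, or roughly US$0.6 trillion over a decade – consistent with “more than half a trillion dollars”.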

All this might just be manageable, but the vulnerabilities and mind-numbing costs intrinsic to the development of AI demand careful attention. We must hope that Jesse Dodge, research scientist at Seattle’s Allen Institute for Artificial Intelligence, is right: “AI is an accelerant for everything. It will make whatever you’re developing go faster.”

David Dodwell is CEO of the trade policy and international relations consultancy Strategic Access, focused on developments and challenges facing the Asia-Pacific over the past four decades
