Problem for the Grid - Elnätet

"LLM," or large language model: that's the technology behind ChatGPT.

LLMs read documents, then calculate, letter by letter and word by word, how bits of information usually go together.

This requires many millions of computations.
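To get a rough feel for that scale, here is a back-of-envelope sketch in Python. The model size, the roughly two floating-point operations per parameter per generated token, and the answer length are all assumptions chosen for illustration, not figures from the article.

```python
# Back-of-envelope estimate of the arithmetic behind one LLM answer.
# The parameter count, the ~2 floating-point operations per parameter per
# generated token, and the answer length are assumed, illustrative values.

PARAMS = 70e9          # assumed model size: 70 billion parameters
FLOPS_PER_PARAM = 2    # rough rule of thumb for inference, per token
TOKENS = 500           # assumed length of one answer, in tokens

total_flops = PARAMS * FLOPS_PER_PARAM * TOKENS
print(f"~{total_flops:.1e} floating-point operations per answer")
# prints ~7.0e+13 -- tens of trillions of operations for a single reply
```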

Note also that the power isn’t just for the microchips. Running so many calculations so quickly generates heat, and disposing of that heat requires cooling systems that consume yet more power (and sometimes massive amounts of water, another far-from-cheap resource).
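One way to see how cooling adds to the load is the data center industry's PUE (power usage effectiveness) ratio: total facility power divided by the power drawn by the IT equipment alone. The figures below are assumed, typical-order values for illustration, not data from the article.

```python
# Illustrative: how cooling and other overhead multiply a data center's draw.
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# The 100 MW IT load and the 1.4 PUE are assumed, typical-order values.

it_load_mw = 100.0      # assumed power drawn by the chips and servers alone
pue = 1.4               # assumed ratio; cooling and overhead make up the rest

total_mw = it_load_mw * pue
overhead_mw = total_mw - it_load_mw
print(f"Total facility demand: {total_mw:.0f} MW "
      f"({overhead_mw:.0f} MW of it for cooling and other overhead)")
# prints: Total facility demand: 140 MW (40 MW of it for cooling and other overhead)
```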

You begin to see the challenge. AI generates enormous new energy demand on top of everything else. This is pure growth, not a replacement for demand that is going away, at least not yet.

It’s hard to get to Net Zero when we keep inventing new technologies that consume ever larger amounts of energy. The goalposts keep moving.

We are also shifting some energy demand from direct burning (gasoline, heating oil) to electric vehicles and electric heating systems. The total amount of energy consumed may not change much, but it puts more strain on the electric grid.
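As a rough illustration of that shift, the sketch below converts one gasoline car's driving into new electricity demand; the annual distance and the EV consumption figure are assumptions, not numbers from the article.

```python
# Illustrative only: switching one gasoline car to an EV moves its energy
# demand onto the grid. The distance and consumption figures are assumptions.

km_per_year = 15_000        # assumed annual driving distance per vehicle
kwh_per_km = 0.18           # assumed EV consumption, incl. charging losses

new_grid_demand_kwh = km_per_year * kwh_per_km
print(f"~{new_grid_demand_kwh:,.0f} kWh/year of electricity demand "
      "that used to be met by gasoline at the pump")
# prints ~2,700 kWh/year per car -- modest alone, significant across a fleet
```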

US electricity demand growth has tripled compared with a decade ago.



AI and crypto aren’t causing all of this demand growth, but they are a substantial part of it. This chart shows past and projected data center energy usage, compiled by McKinsey last year.



John Mauldin, 22 March 2024


