Time we had a thread on this – accepting that even discussing it uses/misuses energy!
From an article by Pat Kane at the weekend. © The National
The most glaring for me is the model of artificial intelligence at the heart of this boosterism, which is particularly electricity-hungry – and which may itself not last long. The reasoning and creativity shown by chatbots and agents are what excite governments and corporations about this technology (though maybe not so much excitement among the service and knowledge workers they’ll replace).
But these AIs are enabled by training their software on vast knowledge archives, in which the artificial entity seeks the plausible patterns that answer your query. (These AIs are called LLMs, short for “large language models”.) Mining and mapping this material entails a massive and constant whirring of servers and algorithms. Thus the exponential demand for more electricity, powering the “compute” these large language models require.
Yet what if that LLM model changed profoundly? Consider this: while our current AIs take many millions of watts to emulate the mental powers of a PhD student, a human brain performs the same tasks, at the same level, on about 20 watts of energy. (Biological evolution isn’t too shabby, sometimes…)
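A back-of-envelope sketch of the gap the article describes. The 20-watt brain figure is the article's; the 10 MW datacentre draw is an illustrative assumption standing in for "many millions of watts":

```python
# Back-of-envelope power comparison, brain vs datacentre-scale AI.
brain_watts = 20               # article's figure for a human brain
datacentre_watts = 10_000_000  # assumed 10 MW sustained draw (illustrative)

ratio = datacentre_watts / brain_watts
print(f"Power ratio: {ratio:,.0f}x")  # prints "Power ratio: 500,000x"
```

Even allowing the assumption to be off by an order of magnitude either way, the efficiency gap remains five to six orders of magnitude.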
As it happens, there is a developing model of AI, called “neuromorphic” computing, which aims to directly mimic the way that information processing and memory are smoothly integrated in the human brain. Conventional intelligent software has to shuttle data between processor and memory, incurring a giant energy cost. (It’s called “the von Neumann bottleneck”, if you want to nerd out.)
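A toy illustration of that bottleneck: in a matrix-vector multiply (the workhorse operation of LLM inference), each weight fetched from memory is used for only a couple of arithmetic operations, so the energy of moving data can dwarf the energy of computing on it. The matrix size and per-operation energy figures below are rule-of-thumb assumptions, not measurements:

```python
# Toy arithmetic-intensity estimate for y = W @ x. All figures are
# illustrative assumptions, chosen only to show orders of magnitude.
n = 4096                  # assumed n x n weight matrix
bytes_moved = n * n * 2   # each fp16 weight (2 bytes) read once from memory
flops = 2 * n * n         # one multiply + one add per weight

intensity = flops / bytes_moved
print(f"FLOPs per byte moved: {intensity:.0f}")  # prints "1"

# Assumed rule-of-thumb energy costs (orders of magnitude only):
PJ_PER_FLOP = 1           # ~1 pJ per fp16 multiply-add operation
PJ_PER_DRAM_BYTE = 100    # ~100 pJ to fetch one byte from off-chip DRAM

compute_energy = flops * PJ_PER_FLOP
memory_energy = bytes_moved * PJ_PER_DRAM_BYTE
print(f"Memory/compute energy: {memory_energy / compute_energy:.0f}x")
```

Under these assumptions, moving the weights costs roughly a hundred times more energy than the arithmetic itself – which is the cost neuromorphic designs try to eliminate by co-locating memory and computation.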
What if a burst of discovery and application brought AI models much closer to the energy-efficiency levels of the human brain?
The futuristic obsessions we see with heavyweight EVs and electric planes are only part of the story.
E-bikes, of course, combine our own natural power, so are ‘sort of’ OK? (My bias, of course.)