With all the attention it has had, you’d be forgiven for thinking ChatGPT marks the pinnacle of AI development.
But as experts push the state of the art, the need for more processing grunt is fuelling excitement about ‘neuromorphic computing’ chips that work like brains.
Whereas conventional computing involves the representation of data as 0s and 1s – which can be stored in any medium whose state can be switched between on and off – neuromorphic computers are designed around the concepts of neurons and synapses.
In biology, these two elements carry electrical signals within the brain, which processes stimuli both from nerves that monitor the body's internal state and from the senses that enable interaction with the outside world.
When you touch a pin, for example, nerves trigger electrical impulses that are sent to the brain and processed as pain. Push harder, and the impulses are larger and perceived as more painful.
Anaesthetics cause numbness by temporarily impeding the function of these nerves so that pain signals are not transmitted as usual, while neurodegenerative conditions such as Parkinson's disease arise when the degradation of neurons impairs the transmission of signals to and from the brain.
Last year, researchers announced they had built on these concepts to produce an electronic skin that can let robots ‘feel’ pain – an example of neuromorphic computing that highlights how its structures can detect a range of values, not the simple on or off of conventional digital systems.
This is possible because neuromorphic systems use not bits but ‘spikes’ – the harder you prick your finger with the pin, the larger the spike produced – that allow them to respond more like biological elements. And that, in turn, has researchers wondering how this architecture could be a game-changer for AI.
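The spiking behaviour described above can be sketched in a few lines of code. This is an illustrative toy, not any vendor's API: a leaky integrate-and-fire (LIF) neuron, the basic unit most neuromorphic chips emulate, with made-up threshold and leak values. A stronger input current charges the membrane faster, so the neuron fires more often, mirroring how a harder pin-prick produces a larger response.

```python
def lif_spike_count(input_current, steps=100, threshold=1.0, leak=0.95):
    """Count spikes a leaky integrate-and-fire neuron emits over a fixed
    window when driven by a constant input current (arbitrary units)."""
    membrane = 0.0
    spikes = 0
    for _ in range(steps):
        membrane = membrane * leak + input_current  # integrate, with leakage
        if membrane >= threshold:                   # fire on crossing threshold
            spikes += 1
            membrane = 0.0                          # reset after each spike
    return spikes

gentle = lif_spike_count(0.06)  # light touch
firm = lif_spike_count(0.10)    # harder press
assert firm > gentle            # harder input -> higher spike rate
```

The information is carried in the spike rate rather than in a stored binary value, which is why neuromorphic hardware can represent a continuum of stimulus strengths rather than a simple on or off.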
Building the biological processor
Neuronal and synaptic structures have been well explored by science, but the idea that they might offer a different way of processing information within IT systems has turned neuromorphic computing into a real and significant field of research.
Years ago, IBM researchers used silicon to emulate those two basic architectural elements, packing one million neurons and 256 million synapses into a neuromorphic chip called TrueNorth that has proven to be extremely efficient in applications such as computer vision.
Brains run on far less power than conventional computers, and brain-inspired designs inherit some of that frugality: even back then, TrueNorth could process 1,200 to 2,600 images per second while drawing just 25 to 275 milliwatts of power.
That would allow a single TrueNorth chip to process real-time video feeds from 50 to 100 cameras while running for days on the power stored in a single smartphone battery – an efficiency that is impossible using conventional computer processors.
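A back-of-envelope calculation shows why the battery claim is plausible. The figures below are assumptions for illustration: a typical smartphone battery of roughly 3,000 mAh at 3.7 V (about 11 Wh), combined with the 25 to 275 milliwatt range quoted for TrueNorth above.

```python
# Assumed battery capacity: 3,000 mAh at 3.7 V ~ 11.1 Wh
battery_wh = 3.0 * 3.7

def runtime_days(power_mw):
    """Days a battery of battery_wh watt-hours lasts at a constant draw."""
    hours = battery_wh / (power_mw / 1000.0)  # Wh divided by W gives hours
    return hours / 24.0

worst = runtime_days(275)  # chip at its maximum quoted power
best = runtime_days(25)    # chip at its minimum quoted power
```

At the chip's maximum quoted draw the battery lasts roughly a day and a half; at the minimum it stretches to well over two weeks, consistent with the "running for days" claim.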
The release of the chip “provides a palpable proof-of-concept that the efficiency of brain-inspired computing can,” IBM Research fellow and chief scientist for brain-inspired computing Dharmendra Modha said at the time, “be merged with the effectiveness of deep learning, paving the path towards a new generation of cognitive computing spanning mobile, cloud, and supercomputers.”
Fast forward to 2023: ChatGPT has stormed the world of computing, and vendors are falling over themselves to link their systems to OpenAI's back-end GPT-3.5 and GPT-4 large language models (LLMs).
Those LLMs run on massive banks of computers – ChatGPT was trained on a bank of 10,000 Nvidia graphics processing units (GPUs) alone, according to reports that suggest the system will need to scale up rapidly as demand continues to explode.
Yet that scaling has other implications, both in the systems' significant cost and in their expanding power consumption.
If neuromorphic computing can deliver high-speed processing at a fraction of the power consumption, as IBM long ago demonstrated, it could well be a game-changer as millions of businesses integrate generative AI platforms into the core of their compute environments.
More AI power at lower energy consumption
Rebuilding massive computing systems along neuromorphic lines is a major environmental imperative, according to a 2021 report from the UK-based eFutures network, which flagged neuromorphic computing's potential to wind back the computing world's insatiable appetite for electricity.
“Moving to a lower-power way of computing, inspired by the human brain, is key to making computing greener and more sustainable,” noted Professor Roger Woods, whose work in the Queen’s University Belfast School of Electronics, Electrical Engineering and Computer Science focuses on areas such as embedded image processing, security, and AI.
“It could help to lower greenhouse emissions and avoid water shortages in future,” he said, noting that the human brain “consuming 20 watts can outperform a supercomputer consuming many kilowatts…. [understanding why] is paramount to ensuring that computing is capable of functioning properly in a modern, high-tech world.”
Spurred on by the significant demands of the AI explosion, scientists are intensifying their interest in neuromorphic computing, with efforts such as the EU's Human Brain Project tapping two brain-inspired platforms: the spiking supercomputer SpiNNaker, and a neuromorphic system called BrainScaleS-2 that includes 50 million synapses and 200,000 “biologically realistic neurons” which, the project's authors note, “do not execute pre-programmed code” but evolve “according to the physical properties of the electronic devices.”
Projects such as NimbleAI, an EU-funded consortium of neuromorphic researchers across 19 member organisations, are focused on improving the energy efficiency and performance of neuromorphic chips with an eye on making them upgradeable as AI technologies improve – benefiting applications such as autonomous vehicles, which rely heavily on AI to quickly process and react to objects around them.
Around the time ChatGPT was welcoming its first users, Intel was also advancing its latest effort in neuromorphic computing: a chip called Loihi 2 that Queensland University of Technology robotics researchers, among others, are already using to build world-aware computing architectures and robotic devices.