Neuromorphic chips 'up to 16 times more energy efficient' for deep learning
Graz University findings will become more significant as more AI work is done
New research shows that neuromorphic chips are much more energy efficient at running large deep learning networks than non-neuromorphic hardware.
This may become important as AI adoption increases.
The study was carried out by the Institute of Theoretical Computer Science at the Graz University of Technology (TU Graz) in Austria using Intel's Loihi 2 silicon, a second-generation experimental neuromorphic chip announced by Intel Labs last year that has about a million artificial neurons.
Their research paper, "A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware," published in Nature Machine Intelligence, claims that the Intel chips are up to 16 times more energy efficient at deep learning tasks than non-neuromorphic hardware performing the same work. The hardware tested consisted of 32 Loihi chips.
While it may not seem surprising that specialized hardware would be more efficient at deep learning tasks, TU Graz claims this is the first time it has been demonstrated experimentally.
According to TU Graz, this matters because deep learning models of this kind are the subject of worldwide artificial intelligence research with a view to real-world deployment. However, the energy consumption of the hardware required to run the models is a major obstacle on the path to broader application of such systems.
This is also pointed out in another paper – "Brain-inspired computing needs a master plan," published in Nature – in which the authors note that "the astonishing achievements of high-end AI systems such as DeepMind's AlphaGo and AlphaZero require thousands of parallel processing units, each of which can consume around 200 watts."
"Our system is four to 16 times more energy-efficient than other AI models on conventional hardware," said Philipp Plank, a doctoral student at TU Graz's Institute of Theoretical Computer Science, who added that further efficiency gains are likely with the next generation of Loihi hardware.
In the TU Graz report, which was funded by Intel and The Human Brain Project, the researchers worked with algorithms that involve temporal processes. One example given is the system answering questions about a previously told story or grasping the relationships between objects or people from the context.
In this respect, the model was mimicking human short-term memory, or at least a presumed memory mechanism thought to be employed in the human brain. The researchers linked two types of deep learning networks – feedback neural networks responsible for short-term memories, and a feed-forward network – to determine which of the relationships found are important for solving the task at hand.
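The division of labour described above – a feedback (recurrent) network whose hidden state carries short-term memory, feeding a feed-forward network that reads out what matters for the task – can be sketched in a few lines of NumPy. All dimensions, weights, and the `run` helper below are illustrative assumptions, not details from the paper, which implemented spiking networks on Loihi hardware:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: input features, recurrent units, output classes
n_in, n_rec, n_out = 4, 8, 2

# Feedback (recurrent) network: its hidden state is the short-term memory
W_in = rng.normal(scale=0.5, size=(n_rec, n_in))
W_rec = rng.normal(scale=0.1, size=(n_rec, n_rec))

# Feed-forward readout: picks out which remembered features solve the task
W_out = rng.normal(scale=0.5, size=(n_out, n_rec))

def run(sequence):
    """Process a sequence of input vectors. The recurrent state h
    accumulates context over time; the feed-forward layer then
    maps the final memory state to an output."""
    h = np.zeros(n_rec)
    for x in sequence:
        # New state depends on the current input AND the previous state,
        # so earlier inputs influence later outputs (short-term memory)
        h = np.tanh(W_in @ x + W_rec @ h)
    return W_out @ h

seq = [rng.normal(size=n_in) for _ in range(5)]
out = run(seq)
```

On neuromorphic hardware the same idea is realised with spiking neurons, where activity is sparse and event-driven, which is where the energy savings come from.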
Mike Davies, director of Intel's Neuromorphic Computing Lab, said that neuromorphic hardware like the Loihi chips is well suited to the fast, sparse, and unpredictable patterns of network activity observed in the brain and needed for the most energy-efficient AI applications.
"Our work with TU Graz provides more evidence that neuromorphic technology can improve the energy efficiency of today's deep learning workloads by re-thinking their implementation from the perspective of biology," he said.
Alan Priestley, Gartner vice president for Emerging Technologies & Trends, agreed that neuromorphic chips have the potential to be widely adopted, in part thanks to their low power requirements.
"Given the challenges the current AI chip designs have in delivering the necessary performance within reasonable power envelopes, new architectures such as neuromorphic computing will be required, and we are already seeing a number of startups developing neuromorphic chip designs for extremely low power endpoint designs – including being integrated onto sensor modules and into event-based cameras," he told us.
According to Intel, its neuromorphic chip technology could at some point be integrated into a CPU to add energy-efficient AI processing to systems, or access to neuromorphic chips may be made available as a cloud service. ®