Neuromorphic chips 'up to 16 times more energy efficient' for deep learning

Graz University findings will become more significant as more AI work is done


Neuromorphic chips have been endorsed in research showing that they are much more energy efficient at operating large deep learning networks than non-neuromorphic hardware.

This may become important as AI adoption increases.

The study was carried out by the Institute of Theoretical Computer Science at the Graz University of Technology (TU Graz) in Austria using Intel's Loihi 2 silicon, a second-generation experimental neuromorphic chip announced by Intel Labs last year that has about a million artificial neurons.

Their research paper, "A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware," published in Nature Machine Intelligence, claims that the Intel chips are up to 16 times more energy efficient in deep learning tasks than performing the same task on non-neuromorphic hardware. The hardware tested consisted of 32 Loihi chips.

While it may not seem surprising that specialized hardware would be more efficient at deep learning tasks, TU Graz claims this is the first time it has been demonstrated experimentally.

According to TU Graz, this is important because deep learning models like these are the subject of artificial intelligence research worldwide, with a view to deployment in real-world applications. However, the energy consumption of the hardware required to run the models is a major obstacle on the path to broader application of such systems.

This concern is echoed in another paper – "Brain-inspired computing needs a master plan," published in Nature – in which the authors note that "the astonishing achievements of high-end AI systems such as DeepMind's AlphaGo and AlphaZero require thousands of parallel processing units, each of which can consume around 200 watts."

"Our system is four to 16 times more energy-efficient than other AI models on conventional hardware," said Philipp Plank, a doctoral student at TU Graz's Institute of Theoretical Computer Science, who added that further efficiency gains are likely with the next generation of Loihi hardware.

In the TU Graz report, which was funded by Intel and The Human Brain Project, the researchers worked with algorithms that involve temporal processes. One example given is a system answering questions about a previously told story, or inferring the relationships between objects or people from context.

In this respect, the model mimicked human short-term memory, or at least a memory mechanism thought to be employed in the human brain. The researchers linked two types of deep learning networks: a feedback (recurrent) neural network responsible for the short-term memory, and a feed-forward network that determines which of the relationships found are important for solving the task at hand.
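
To make that two-part architecture concrete, here is a minimal sketch in a conventional deep learning framework (PyTorch) – not the spiking implementation the researchers ran on Loihi, and with the layer sizes, names, and question-answering setup all illustrative rather than taken from the paper:

    # Minimal sketch (illustrative, not the TU Graz code): a recurrent
    # "feedback" network carries short-term memory across a story, and a
    # feed-forward readout decides which remembered relationships matter
    # for answering the question.
    import torch
    import torch.nn as nn

    class MemoryThenReadout(nn.Module):
        def __init__(self, vocab_size=1000, embed_dim=64,
                     hidden_dim=128, num_answers=20):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)
            # Feedback (recurrent) part: maintains short-term memory.
            self.memory = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Feed-forward part: maps the remembered state to an answer.
            self.readout = nn.Sequential(
                nn.Linear(hidden_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, num_answers),
            )

        def forward(self, tokens):
            x = self.embed(tokens)          # (batch, seq, embed_dim)
            _, (h, _) = self.memory(x)      # h: final hidden state
            return self.readout(h[-1])      # answer logits

    model = MemoryThenReadout()
    story_and_question = torch.randint(0, 1000, (1, 50))  # toy token IDs
    logits = model(story_and_question)

On neuromorphic hardware the recurrent memory is realized with spiking neurons rather than a conventional LSTM cell, but the division of labor – a feedback stage that remembers, a feed-forward stage that decides – is the same.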

Mike Davies, director of Intel's Neuromorphic Computing Lab, said that neuromorphic hardware like the Loihi chips is well suited for the fast, sparse, and unpredictable patterns of network activity observed in the brain and needed for the most energy efficient AI applications.

"Our work with TU Graz provides more evidence that neuromorphic technology can improve the energy efficiency of today's deep learning workloads by re-thinking their implementation from the perspective of biology," he said.

Alan Priestley, Gartner vice president for Emerging Technologies & Trends, agreed that neuromorphic chips have the potential to be widely adopted, in part thanks to their low power requirements.

"Given the challenges the current AI chips designs have in delivering the necessary performance within reasonable power envelopes, new architectures such as neuromorphic computing will be required and we are already seeing a number of startups developing neuromorphic chip designs for extremely low power endpoint designs – including being integrated onto sensor modules and into event based cameras," he told us.

According to Intel, its neuromorphic chip technology could at some point be integrated into a CPU to add energy-efficient AI processing to systems, or access to neuromorphic chips may be made available as a cloud service. ®
