Energy Efficiency of Neuromorphic Hardware Practically Proven – HPCwire

May 24, 2022 | Energy

May 24, 2022 — Neuromorphic technology is more energy efficient for large deep learning networks than other comparable AI systems. This has been shown by experiments conducted in a collaboration between researchers working in the Human Brain Project (HBP) at TU Graz and Intel, using a new Intel chip whose artificial neurons work similarly to those in the brain.
Smart machines and intelligent computers that can autonomously recognize and infer objects and the relationships between them are the subject of worldwide artificial intelligence (AI) research. Energy consumption is a major obstacle on the path to broader application of such AI methods. It is hoped that neuromorphic technology will provide a push in the right direction. Neuromorphic technology is modeled on the human brain, which is the world champion in energy efficiency: to process information, its hundred billion neurons consume only about 20 watts, not much more than an average energy-saving light bulb.
A close-up shows an Intel Nahuku board, each of which contains eight to 32 Intel Loihi neuromorphic research chips. Intel’s latest neuromorphic computing system, Pohoiki Springs, was unveiled in March 2020. It is made up of 24 Nahuku boards with 32 chips each, integrating a total of 768 Loihi chips. Credit: Tim Herman/Intel Corporation
A research team from HBP partner TU Graz and Intel has now demonstrated experimentally for the first time that a large neural network on neuromorphic hardware consumes considerably less energy than on non-neuromorphic hardware. The results have been published in Nature Machine Intelligence. The group focused on algorithms that work with temporal processes. For example, the system had to answer questions about a previously told story and grasp the relationships between objects or people from the context. The hardware tested consisted of 32 Loihi chips (Loihi is Intel's neuromorphic research chip). "Our system is two to three times more economical here than other AI models," says Philipp Plank, a doctoral student at TU Graz's Institute of Theoretical Computer Science and an employee at Intel.
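The energy advantage comes from spiking computation: on chips like Loihi, neurons stay silent most of the time and exchange information only through sparse, discrete spikes rather than dense floating-point activations. The article does not specify the exact neuron model used in the study, but a minimal leaky integrate-and-fire (LIF) neuron, the standard building block of such spiking networks, can be sketched as follows; all parameter values here are illustrative, not taken from the paper:

```python
import math

def simulate_lif(spikes_in, tau=20.0, v_th=1.0, dt=1.0, w=0.5):
    """Simulate one leaky integrate-and-fire neuron (illustrative model).

    spikes_in : binary input spike train (1 = input spike at that step)
    tau       : membrane time constant in ms (controls the leak)
    v_th      : firing threshold for the membrane potential
    dt        : simulation time step in ms
    w         : synaptic weight of the input connection
    Returns the binary output spike train.
    """
    decay = math.exp(-dt / tau)   # fraction of potential kept per step
    v = 0.0                       # membrane potential
    spikes_out = []
    for s in spikes_in:
        v = decay * v + w * s     # leak, then integrate the input spike
        if v >= v_th:             # threshold crossing -> output spike
            spikes_out.append(1)
            v = 0.0               # reset after firing
        else:
            spikes_out.append(0)  # no spike, no event to transmit
    return spikes_out

# A steady input stream makes the neuron fire periodically; with no
# input it stays silent, which is where the energy savings come from.
out = simulate_lif([1] * 20)
```

Because a silent neuron produces no events, no spikes need to be routed or processed for it, whereas a conventional accelerator must still compute every activation in every layer on every step.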
Plank holds out …

