Could Neuromorphic Computing Slash AI Energy Needs?

30th July 2024 | AI-TAIBOM

Artificial intelligence is revolutionising many aspects of our lives, but its growing power comes at a massive energy cost. Data centres housing AI systems are consuming increasing amounts of electricity, raising concerns about sustainability. According to the New York Times [1], by 2027 AI servers could use between 85 and 134 terawatt-hours of electricity annually – roughly the annual consumption of the whole of Sweden.

At Google, greenhouse gas emissions increased 48% between 2019 and 2023 [2], despite the company pursuing an aggressive net zero policy. This surge is blamed on rising data centre power consumption, and Google cites the growing intensity of AI compute as a factor that will make the situation even more challenging.

Loihi and NorthPole to the rescue?

Neuromorphic computing, a concept pioneered in the late 1980s, is a different approach inspired by human neuron behaviour and may provide a solution to the energy use issue of conventional AI. Traditional computers process information in a mainly linear, step-by-step fashion and must constantly move data from memory to the compute function and back. This is completely different to the way the brain operates, which is massively parallel, and event driven. Neuromorphic systems aim to mimic the brain's structure and function and present exciting potential to slash AI's energy consumption.

The Energy Efficient Brain

The human brain is a computational marvel. With its network of billions of neurons, it can perform complex tasks like recognising faces and understanding language, all while consuming a mere 20 watts of power – about the same as a light bulb.

Research Shows Dramatic Reductions in Power Consumption

The promise of neuromorphic computing isn't just theoretical. Studies have shown significant reductions in energy use compared to traditional hardware. Research by Intel Labs and TU Graz demonstrated that running large neural networks on a neuromorphic chip called Loihi could achieve a 4- to 16-fold reduction in energy consumption [3], whilst IBM's neural/traditional fusion chip NorthPole is 25 times more energy efficient and 22 times faster than Nvidia's H100 GPU [4].

How Neuromorphic Computing Achieves Efficiency

Neuromorphic systems achieve their efficiency through several key features:

  • Event-driven processing and spike communications: Unlike traditional computers that constantly move data in a synchronous manner, neuromorphic systems only activate when they detect a change in the data. Communication in neuromorphic neural networks is via sparse trains of spiking electrical signals, which more closely mimic the way biological neurons interact. In Spiking Neural Networks (SNNs), incoming spikes are weighted and integrated over time as inputs to the activation function that determines the neuron's output [5]. This asynchronous behaviour and sparse signalling can significantly reduce power consumption.
  • In-memory computing: Some neuromorphic systems allow computing to be performed within the device memory itself, eliminating the need for data to be constantly shuttled back and forth to a processing unit (the von Neumann bottleneck), which is time and energy inefficient.
  • Exploiting inherent parallelism: The brain processes information in parallel across a vast network of neurons. Neuromorphic systems can mimic this parallelism, allowing for faster and more efficient computation.
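The spike integration described above can be sketched with a minimal leaky integrate-and-fire (LIF) neuron, the simplest common SNN neuron model. This is an illustrative sketch only – the parameter values (decay constant, threshold) and function names are invented for the example, not taken from Loihi, NorthPole, or the cited research:

```python
import numpy as np

def lif_neuron(spike_train, weights, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire sketch: weight and integrate incoming
    spikes over time; emit an output spike whenever the membrane
    potential crosses the threshold, then reset."""
    v = 0.0
    out_spikes = []
    for t, spikes in enumerate(spike_train):
        # Leak: the potential decays towards rest between events
        v *= np.exp(-dt / tau)
        # Event-driven update: only channels that actually spiked contribute
        v += float(np.dot(weights, spikes))
        if v >= v_thresh:
            out_spikes.append(t)
            v = v_reset
    return out_spikes

# Three input channels with mostly-zero (sparse) spike trains
weights = np.array([0.6, 0.3, 0.5])
spike_train = [
    np.array([1, 0, 0]),   # t=0: only channel 0 fires
    np.array([0, 0, 0]),   # t=1: no events, so almost no work is done
    np.array([1, 1, 1]),   # t=2: all channels fire and the threshold is crossed
]
print(lif_neuron(spike_train, weights))  # → [2]
```

The sparsity is the point: at time steps with no incoming spikes, an event-driven implementation does essentially nothing, which is where the energy savings over a synchronous, always-on pipeline come from.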

Future Capabilities

Neuromorphic computing is still in its early stages, and challenges exist, notably around the speed of learning and backpropagation schemes for discontinuous functions in neural networks, but the potential for dramatic reductions in energy consumption is undeniable. As this technology matures, we can expect to see a new generation of AI systems that are not only powerful but also energy efficient, paving the way for more autonomous applications in low-power systems. Indeed, Yole Group predicts [6] that neuromorphic AI will represent 18% of the computing and sensing AI market by 2035, with a value of around $20bn.

References:

[1] A.I. Could Soon Need as Much Electricity as an Entire Country - The New York Times (nytimes.com)

[2] Google 2024 Environmental Report - Google Sustainability (gstatic.com)

[3] Energy Efficiency of Neuromorphic Hardware Practically Proven

[4] IBM Research Shows Off New NorthPole Neural Accelerator (forbes.com)

[5] Exploring Neuromorphic Computing Based on Spiking Neural Networks: Algorithms to Hardware (acm.org)

[6] Yole Développement - Neuromorphic Computing and Sensing 2021 - Flyer (yolegroup.com)
