The High Performance, Low Power Promise of Neuromorphic Computing

Before I turn to its wider benefits and potential applications, let me put that energy saving in perspective. Conventional computing technology is based on the so-called von Neumann architecture, where data processing and transfer are carried out intensively and continuously. Next-generation computers are expected to operate at the exascale, with 10^18 calculations per second. But the downside is power consumption.

Data computation and transfer are responsible for a large part of this consumption, and the rapid development of machine learning and AI neural network models is adding even more demand. As much as 10 megawatts of power could be used for some AI learning algorithms on an exascale computer. Data-centric computing requires a hardware system revolution. The performance of the computing system, in particular the energy efficiency, sets the fundamental limit of AI/ML capability. As for neuromorphic computing? It has the potential to achieve HPC and yet consumes 1/1000th of the energy.

The neuromorphic approach uses silicon artificial neurons to form a spiking neural network (SNN) that performs event-triggered computation. There is a key difference between an SNN and other networks, such as the convolutional neural network (CNN): spiking neurons process input information only after receipt of an incoming spike signal. In effect, spiking neural networks attempt to make artificial neurons behave more like real neurons.

The process does not work in discrete time steps. Instead, events arrive over a time series and build up signals within the neurons. These signals accumulate inside the neurons until a threshold is passed, at which point a computation operation is triggered.
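The accumulate-to-threshold behaviour described above can be sketched with a minimal leaky integrate-and-fire (LIF) neuron model. This is an illustrative sketch only; the parameter values and class design are assumptions for the example, not taken from any particular neuromorphic chip.

```python
import numpy as np

class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron (illustrative parameters)."""

    def __init__(self, tau_m=20.0, v_threshold=1.0, v_reset=0.0):
        self.tau_m = tau_m              # membrane time constant (ms)
        self.v_threshold = v_threshold  # firing threshold
        self.v_reset = v_reset          # potential after a spike
        self.v = v_reset                # current membrane potential

    def receive(self, weight, dt_since_last=1.0):
        """Process one incoming spike event; return True if the neuron fires."""
        # Passive leak since the previous event -- no work is done while idle
        self.v *= np.exp(-dt_since_last / self.tau_m)
        # Integrate the incoming spike's weighted contribution
        self.v += weight
        if self.v >= self.v_threshold:
            self.v = self.v_reset       # fire and reset
            return True
        return False

neuron = LIFNeuron()
spikes = [0.4, 0.4, 0.4]                # three weighted input events
outputs = [neuron.receive(w) for w in spikes]
print(outputs)                          # the third event pushes v past threshold
```

Note that computation happens only inside `receive`, i.e. when an event arrives; between events the neuron's state simply decays, which is what makes the hardware equivalent so power-frugal.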

Ultra-low power operation can be achieved thanks to SNNs being effectively in an ‘off’ mode most of the time and only kicking into action when a change, or ‘event’, is detected.

Once in action, it can achieve fast computation without running an energy-hungry fast clock by triggering a huge number of parallel operations (equivalent to thousands of CPUs working in parallel). As a result, it consumes only a fraction of the power a CPU or GPU needs for the same workload.
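The contrast with clock-driven operation can be illustrated with an event-driven simulation loop: work is scheduled only when a spike event exists, rather than on every clock tick. The network, timings and neuron IDs below are invented for illustration; in hardware, events destined for independent neurons would be processed in parallel.

```python
import heapq

# Event queue of (time_ms, neuron_id) spike events -- illustrative data.
events = [(0.0, "n1"), (0.5, "n2"), (0.7, "n1")]
heapq.heapify(events)

processed = []
while events:
    t, nid = heapq.heappop(events)  # wake only when the next event is due
    processed.append((t, nid))      # ...per-neuron state update would go here

print(processed)
```

Between events the loop (and, by analogy, the silicon) does nothing at all, which is where the energy saving over a free-running clock comes from.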

This is why the future of neuromorphic computing is well suited to edge AI – implementing low-power AI on end devices without connecting to the cloud. This is especially so for TinyML applications that tend to focus on battery-operated sensors, IoT devices and so on.

Next-generation neuromorphic systems are expected to have intrinsic capabilities to learn and deal with complex data just as our brains do. They have the potential to process large amounts of digital information with much lower power consumption than conventional processors.

In the medium term, hybrid traditional computers with neuromorphic chips could vastly improve performance over conventional machines. In the longer term, fully neuromorphic computers will be fundamentally different and designed for specific applications, from natural language processing to autonomous driving.

When it comes to design, instead of the conventional architecture of partitioning chips into processor and memory, the computer may be built with silicon ‘neurons’ performing both functions.

Building extensive ‘many-to-many’ neuron connectivity will allow an efficient pipeline for signal interaction and facilitate massive parallel operation. There is a trend to develop ever-increasing amounts of electronic neurons, synapses and so on in a single chip.

The design approaches of neuromorphic processor chips broadly follow one of several distinct paths. The ASIC-based digital neuromorphic chip offers highly optimised computation performance tailored for application requirements. For AI applications, it can potentially perform both inference and real-time learning.

The FPGA-based chip is similar to ASIC-based digital design but also offers portability and reconfigurability. Due to its highly reconfigurable nature and parallel speed, FPGA is considered to be a suitable platform for mimicking, to some degree, the natural plasticity of biological neural networks.

Analogue neuromorphic chips, which include so-called ‘in-memory-computing’, have the potential to achieve the lowest power consumption. They are mainly suited to machine learning inference rather than real-time learning.
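The core operation that in-memory computing accelerates is the weighted sum: weights are stored as conductances in a crossbar array, inputs are applied as voltages, and the output currents are the matrix-vector product by Ohm's and Kirchhoff's laws, with no data movement between separate memory and processor. A numerical sketch of that operation, with illustrative values:

```python
import numpy as np

# Weights stored in place as conductances G (illustrative values, siemens)
G = np.array([[0.2, 0.5, 0.1],
              [0.4, 0.3, 0.6]])

# Input activations applied as voltages V (illustrative values, volts)
V = np.array([1.0, 0.5, 2.0])

# Output currents: each output line sums its weighted contributions,
# I = G @ V, computed "in memory" in a single analogue step
I = G @ V
print(I)
```

This is why such chips suit inference, where the weights (conductances) are fixed, better than real-time learning, where they would need frequent reprogramming.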

The photonic integrated circuit (PIC) based neuromorphic chip offers photonic computation that can achieve very high speed at very low power consumption, while mixed-signal NSoC (Neuromorphic System-on-Chip) design combines extremely low power analogue design for ML inference with digital SNN architecture processor for real-time learning.

I expect that neuromorphic computing will generate development opportunities in several technological areas, such as materials, devices, neuromorphic circuits and new neuromorphic algorithms and software development platforms – all crucial elements for the success of neuromorphic computing.

There are countless potential applications. Applying neuromorphic techniques to vision applications represents a large market opportunity for many different sectors, including smart vision sensors and gesture control applications in smart homes, offices and factories.

Another use case is neuromorphic computing for myoelectric prosthetics control. Myoelectric prosthetics assist people with reduced mobility by sensing and processing muscle spikes. However, inefficiencies must be addressed to enhance the user experience, such as increasing the granularity of movement classification and reducing the computational resources needed, thereby lowering energy consumption.

Low-power edge computing represents a key area of high commercial potential. As IoT applications in smart homes, offices, industries and cities proliferate, there is an increasing need for more intelligence on the edge as control is moved from data centres to local devices. Applications such as autonomous robots, wearable healthcare systems, security and IoT all share the common characteristics of battery-operated, ultra-low power, standalone operation.

One potential application that I find particularly fascinating is that of “Parametric Insurance”. With global attention increasingly turning to climate-related issues, this unconventional form of ‘disaster insurance’ is playing an increasingly significant role. It is a product that offers pre-specified pay-outs based on a trigger event – and can help to provide protection when standard policies are harder to get.

For me, the connection to neuromorphic computing is clear. Parametric insurance can be tied to a catastrophe (CAT) bond for events such as hurricanes, earthquakes and so on. Neuromorphic-powered edge computing has a big role to play, as it would allow for very granular and sophisticated risk analysis, adjudication, and payment settlement. All would be at the edge, with an associated low cost.

About the author

Dr Aidong Xu, Head of Semiconductor Capability, Cambridge Consultants

Aidong has over 30 years of experience across diverse industries, including with some of the leading semiconductor companies. He has managed large internationally based engineering teams and brought innovative industry-leading products into the global market that have achieved rapid and sustained business growth. Aidong holds a PhD in power electronics and power semiconductors.

David Howell

Dave Howell is a freelance journalist and writer. His work has appeared across the national press and in industry-leading magazines and websites. He specialises in technology and business. Read more about Dave on his website: Nexus Publishing. https://www.nexuspublishing.co.uk.
