IBM Creates Programming Model For Brain-Like Computing


Super smart IBM researchers are creating a high-level language that will let anyone build brain-like applications

IBM has unveiled a whole new programming paradigm as it bids to become the leader in cognitive computing.

Big Blue researchers believe for brain-like computing to become a reality, new programming languages and architectures are required. IBM is out to create them, in a project funded by the US Defense Advanced Research Projects Agency (DARPA), which wins our award for Acronym of the Week: it is called SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics).

The plan is to create “a network of neurosynaptic processor cores, which, like the brain, is modular, parallel, distributed, fault-tolerant, event-driven, and scalable,” said Dr Dharmendra Modha, principal investigator and senior manager at IBM Research.

Having already shown off a network of billions of “neurosynaptic” processor cores running on IBM’s largest supercomputer, Sequoia, it has now unveiled the “corelet model” of programming.

Corelet Programming

“It’s a high-level description of a software program that is based on re-usable building blocks of code – the corelets,” Modha explained, in a blog post.

“Each corelet represents a method for getting something done using the combination of computation (neuron), memory (synapses), and communication (axons) on individual neurosynaptic processor cores along with inter-core connectivity.”

On their own, corelets perform small, simple tasks, but they can be combined to carry out more complex functions.
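IBM has not published the Corelet Language itself here, but the composition idea is easy to picture. The hypothetical Python sketch below treats a corelet as a black box that hides its internal neurons and synapses and exposes only named inputs and outputs; the class and corelet names are invented for illustration, not taken from IBM’s toolkit.

```python
# Hypothetical sketch of corelet-style composition -- not IBM's actual
# Corelet Language, just an illustration of "reusable building blocks"
# that hide their internals and expose only inputs and outputs.
from dataclasses import dataclass, field


@dataclass
class Corelet:
    """A black-box unit: internal neurons and synapses stay hidden;
    only named input axons and output neurons are exposed."""
    name: str
    inputs: list[str]                            # exposed input axons
    outputs: list[str]                           # exposed output neurons
    wiring: dict = field(default_factory=dict)   # output -> (corelet, input)

    def connect(self, other: "Corelet", mapping: dict[str, str]) -> None:
        """Wire this corelet's outputs to another corelet's inputs."""
        for out, inp in mapping.items():
            assert out in self.outputs and inp in other.inputs
            self.wiring[out] = (other.name, inp)


# Two small corelets composed into a larger function:
edge_detector = Corelet("edges", inputs=["pixels"], outputs=["edge_spikes"])
classifier = Corelet("classify", inputs=["features"], outputs=["label_spikes"])
edge_detector.connect(classifier, {"edge_spikes": "features"})
```

The point of the pattern is that composed corelets can themselves be treated as a single, larger building block and reused elsewhere.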

Modha hopes the new programming model will have the same impact that FORTRAN had on high-level programming, making it easy for anyone to create “sophisticated cognitive applications”.

IBM has also produced a software simulator of a cognitive computing architecture, comprising a network of neurosynaptic cores. “We have demonstrated the simulation of the new architecture at unprecedented scale of 10¹⁴ synapses,” Modha noted.
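For a sense of that scale, the back-of-the-envelope calculation below works out roughly how many cores such a simulation implies. The 256 × 256 crossbar per core is an assumption based on IBM’s published neurosynaptic core design, not a figure from this article.

```python
# Rough scale check (illustrative; the 256x256 crossbar per core is an
# assumption about IBM's core design, not a figure quoted in the article).
SYNAPSES_PER_CORE = 256 * 256        # one crossbar: 65,536 synapses
TARGET_SYNAPSES = 10 ** 14           # scale reported for the simulation

cores_needed = TARGET_SYNAPSES / SYNAPSES_PER_CORE
print(f"{cores_needed:,.0f} cores")  # roughly 1.5 billion cores
```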

And it has created a “neuron model”, made up of equations that can carry out “a wide variety of computational functions and neural codes and can qualitatively replicate the 20 biologically-relevant behaviours of a dynamical neuron model”, Modha said.
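Modha’s post does not reproduce the equations, but the flavour of such a model can be sketched. The snippet below implements a generic leaky integrate-and-fire neuron, one of the simplest dynamical neuron models; it is a stand-in for, not a copy of, IBM’s formulation.

```python
# Generic leaky integrate-and-fire neuron -- a minimal stand-in for the
# kind of dynamical neuron model described, NOT IBM's published equations.
def lif_step(v, input_current, leak=0.9, threshold=1.0, v_reset=0.0):
    """Advance the membrane potential v by one time step.

    Returns (new_v, spiked): the neuron fires and resets when the
    leaky-integrated input crosses the threshold.
    """
    v = leak * v + input_current      # integrate the input with a leak
    if v >= threshold:                # fire and reset
        return v_reset, True
    return v, False


# Drive the neuron with a constant input and collect its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, fired = lif_step(v, input_current=0.15)
    spikes.append(fired)
print(spikes)  # periodic True entries mark the spikes
```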

The research team produced a library of software designs that exploit this kind of massively parallel computing.

The ultimate goal is to create a chip system with 10 billion neurons and 100 trillion synapses that consumes just one kilowatt of power. That sounds like a lot, but it is still only one tenth of the human brain’s 100 billion neurons, and the synapse count sits at the low end of the brain’s range: the human brain starts out with around a quadrillion synapses, declining in adults to between 100 and 500 trillion.
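Spelling the comparison out with the figures quoted above:

```python
# Comparison using only the figures quoted in the paragraph above.
chip_neurons = 10e9             # target: 10 billion neurons
chip_synapses = 100e12          # target: 100 trillion synapses

brain_neurons = 100e9           # human brain: ~100 billion neurons
adult_synapses_low = 100e12     # adult brain: 100-500 trillion synapses
starting_synapses = 1e15        # ~a quadrillion synapses to start with

print(chip_neurons / brain_neurons)        # 0.1 -> one tenth of the brain
print(chip_synapses / adult_synapses_low)  # 1.0 -> low end of the adult range
print(chip_synapses / starting_synapses)   # 0.1 of the starting count
```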
