WASHINGTON, March 24, 2020 -- During the late 1980s and 1990s, Carver Mead and colleagues combined basic research in neuroscience with elegant analog circuit design in electronic engineering. This pioneering work on neuromorphic electronic circuits inspired researchers in Germany and Switzerland to explore the possibility of reproducing the physics of real neural circuits by using the physics of silicon.

The field of “brain-mimicking” neuromorphic electronics shows great potential not only for basic research but also for commercial exploitation of always-on edge computing and “internet of things” applications.

In Applied Physics Letters, from AIP Publishing, Elisabetta Chicca of Bielefeld University and Giacomo Indiveri of the University of Zurich and ETH Zurich present their work on understanding how biological neural processing systems carry out computation, along with a recipe for reproducing these computing principles in mixed-signal analog/digital electronics and novel materials.

One of the most distinctive computational features of neural networks is learning, so Chicca and Indiveri are particularly interested in reproducing the adaptive and plastic properties of real synapses. They used both standard complementary metal-oxide-semiconductor (CMOS) electronic circuits and advanced nanoscale memory technologies, such as memristive devices, to build intelligent systems that can learn.
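To give a flavor of the kind of adaptive behavior such synapse circuits aim to capture, the sketch below models a single plastic synapse in software: a bounded, conductance-like weight is nudged up or down depending on the relative timing of pre- and postsynaptic spikes, a simple spike-timing-dependent plasticity (STDP) rule. This is an illustrative analogy only, not the authors' circuit or device model; the parameter values and the helper name stdp_update are assumptions made for the example.

```python
# Minimal software sketch of a plastic synapse with a bounded,
# conductance-like weight updated by a simple spike-timing-dependent
# plasticity (STDP) rule. Illustrative only; not the circuit or device
# model described in the paper.

import math

W_MIN, W_MAX = 0.0, 1.0       # device-like bounds on the weight (assumed)
A_PLUS, A_MINUS = 0.05, 0.06  # potentiation/depression amplitudes (assumed)
TAU = 20e-3                   # STDP time constant in seconds (assumed)

def stdp_update(w, t_pre, t_post):
    """Return the new weight after one pre/post spike pairing.

    If the presynaptic spike precedes the postsynaptic spike, the weight
    is potentiated; otherwise it is depressed. The result is clipped to
    [W_MIN, W_MAX], mimicking the limited conductance range of a
    physical memory device.
    """
    dt = t_post - t_pre
    if dt > 0:
        dw = A_PLUS * math.exp(-dt / TAU)
    else:
        dw = -A_MINUS * math.exp(dt / TAU)
    return min(W_MAX, max(W_MIN, w + dw))

# Example: repeated "pre before post" pairings gradually strengthen the synapse.
w = 0.5
for _ in range(10):
    w = stdp_update(w, t_pre=0.000, t_post=0.005)
print(f"weight after 10 causal pairings: {w:.3f}")
```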

This work is significant because it can lead to a better understanding of how to implement sophisticated signal processing in extremely low-power, compact devices.

Their key finding is that the apparent disadvantages of these low-power computing technologies, chiefly low precision, high sensitivity to noise, and high variability, can actually be exploited to perform robust and efficient computation, much as the brain uses highly variable and noisy neurons to implement robust behavior.
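As a loose software analogy for that finding, the sketch below shows how pooling the responses of many coarse, noisy "units" recovers a reliable estimate of an encoded value even though each individual unit is imprecise. It is not a model from the paper; the quantization depth, noise level, and population size are arbitrary assumptions chosen for illustration.

```python
# Sketch: a population of noisy, low-precision "neurons" can still encode
# a value reliably when their responses are pooled. Illustrative analogy
# for the robustness claim; not a model from the paper.

import random

def noisy_unit(x, levels=8, noise=0.2):
    """Encode x in [0, 1] with coarse quantization plus additive noise,
    standing in for a low-precision, variable analog device."""
    noisy = x + random.gauss(0.0, noise)
    quantized = round(noisy * (levels - 1)) / (levels - 1)
    return min(1.0, max(0.0, quantized))

random.seed(0)
target = 0.63
single = noisy_unit(target)
population = [noisy_unit(target) for _ in range(1000)]
pooled = sum(population) / len(population)

print(f"single noisy unit : {single:.3f}")
print(f"population average: {pooled:.3f}  (target {target})")
```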

The researchers said it is surprising to see the field of memory technologies, typically concerned with bit-precise, high-density devices, now looking to animal brains as a source of inspiration for how to build adaptive and robust neural processing systems. This shift is very much in line with the basic research agenda that Mead and colleagues were following more than 30 years ago.

“The electronic neural processing systems that we build are not intended to compete with the powerful and accurate artificial intelligence systems that run on power-hungry large computer clusters for natural language processing or high-resolution image recognition and classification,” said Chicca.

In contrast, their systems “offer promising solutions for those applications that require compact and very low-power (submilliwatt) real-time processing with short latencies,” Indiveri said.

He said examples of such applications fall within “the ‘extreme-edge computing’ domain, which requires a small amount of artificial intelligence to extract information from live or streaming sensory signals, such as for bio-signal processing in wearable devices, brain-machine interfaces and always-on environmental monitoring.”

###

The article, “A recipe for creating ideal hybrid memristive-CMOS neuromorphic computing systems,” is authored by Elisabetta Chicca and Giacomo Indiveri. It will appear in Applied Physics Letters, March 24, 2020 (DOI: 10.1063/1.5142089). After that date, it can be accessed at https://aip.scitation.org/doi/10.1063/1.5142089.

ABOUT THE JOURNAL

Applied Physics Letters features rapid reports on significant discoveries in applied physics. The journal covers new experimental and theoretical research on applications of physics phenomena related to all branches of science, engineering, and modern technology. See https://aip.scitation.org/journal/apl.

###
