http://www.eetimes.com/document.asp?doc_id=1328806

Russians Report Memristors

LAKE WALES, Fla.–The perceptron invented by Frank Rosenblatt in 1957 and popularized in Marvin Minsky and Seymour Papert’s 1969 book Perceptrons: An Introduction to Computational Geometry may no longer be just theory.

Russian and Italian scientists led by Vyacheslav Demin of the Moscow Institute of Physics and Technology and the National Research Center Kurchatov Institute (Moscow) have described such a perceptron in detail in a paper titled “Hardware elementary perceptron based on polyaniline memristive devices.”

Marvin Minsky passed away on Sunday (January 24).

The most exciting element of their perceptron is the polymer memristor constructed from organic polyaniline (PANI), a highly conductive polymer that has been used as the active electronic component in experimental non-volatile memories. It was created by Demin’s team, which also included scientists from the Kurchatov Institute (Moscow), the Moscow Institute of Physics and Technology (MIPT), the University of Parma (Italy), Moscow State University, and Saint Petersburg State University. The experiments were performed at the Complex for Nano-, Bio-, Information, Cognitive and Socio-Humanitarian Sciences and Technologies (NBICS) at the Kurchatov Institute.

The Russian memristor is fabricated from polyaniline and has already been shown capable of realizing Rosenblatt’s elementary perceptron.
(Source: Moscow Institute of Physics and Technology)

“The physical realization of an elementary perceptron demonstrates the ability to form the hardware-based neuromorphic networks with the use of organic memristive devices,” said Demin’s team in their abstract. “The results provide a great promise toward new approaches for very compact, low-volatile and high-performance neurochips that could be made for a huge number of intellectual products and applications.”

The scientists aim to use their polymeric memristors in multi-layer perceptrons, also called deep-learning neural networks or simply neuromorphic networks, in applications ranging from machine vision, hearing, and other sensory modes to intelligent control systems and autonomous robots.
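Minsky and Papert’s book famously proved that a single-layer perceptron cannot compute functions that are not linearly separable, XOR being the classic example; overcoming that limit is precisely what the multi-layer extension is for. As a rough software illustration only (this is not the team’s code, and the NumPy implementation, layer sizes, and learning rate are assumptions made for the sketch), a two-layer network trained with back-propagation learns XOR:

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

# Illustrative two-layer network: 2 inputs, 4 hidden units, 1 output.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)   # hidden layer
    y = sigmoid(h @ W2 + b2)   # output layer
    # Back-propagate the error signal through both layers
    d_out = (y - t) * y * (1 - y)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)

print(y.round(2).ravel())  # should approach [0. 1. 1. 0.]

In hardware, those weight updates would become conductance changes applied to memristive synapses rather than floating-point arithmetic.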

How it works
Polyaniline memristors have been demonstrated before, both individually and in non-volatile memories, but the Russian and Italian scientists claim theirs is the first implementation to be formed into a genuine analog neural network, namely a single-layer perceptron. Their memristors were fabricated at the millimeter scale for convenience, using a polyaniline solution, a glass substrate, and chromium electrodes, but the researchers claim that within five years they could be manufactured at 10 nanometers, rivaling silicon chips.

The memristor prototype is still quite large (the coin is about half the size of a U.S. penny) but the researchers say it can be downsized to 10 nanometers.
(Source: Moscow Institute of Physics and Technology)

When characterizing the polyaniline memristors, the researchers found that the devices have a natural built-in hysteresis, a very desirable quality for digital non-volatile memories. For analog applications, the hysteresis proved mild enough for the memristors to operate in the middle analog range of the total hysteresis curve. As a result, the polyaniline memristors were able to emulate the function of the brain’s synapses between its neurons: they become more conductive the more they are used, and their conductance decays toward zero when current flows in the opposite direction.
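In software terms, such a synapse can be pictured as a bounded conductance nudged upward by forward current pulses and downward by reverse ones. The minimal Python sketch below illustrates only that qualitative behavior; the class name, conductance window, and step size are invented for the example and are not measured PANI device parameters.

class MemristiveSynapse:
    """Toy model of a memristive synapse: conductance (the synaptic
    weight) rises with use and decays under reverse current, clipped
    to the analog window between g_min and g_max."""

    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, step=0.05):
        self.g = g          # present conductance
        self.g_min = g_min  # fully depressed state
        self.g_max = g_max  # fully potentiated state
        self.step = step    # conductance change per pulse

    def pulse(self, forward=True):
        # Forward current potentiates; reverse current depresses.
        self.g += self.step if forward else -self.step
        self.g = max(self.g_min, min(self.g_max, self.g))
        return self.g

syn = MemristiveSynapse()
for _ in range(5):
    syn.pulse(forward=True)    # repeated use -> more conductive
print(round(syn.g, 2))         # 0.75
syn.pulse(forward=False)       # reverse current -> conductance decays
print(round(syn.g, 2))         # 0.7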

To prove their point, the researchers trained the perceptron to learn the digital NAND and NOR functions, as well as other linearly separable operations, invariant pattern recognition, and linear approximation. A standard back-propagating error-correction algorithm, which feeds error signals backwards to adjust the conductivity of the memristive synapses, allowed the neural network to learn. That gives the researchers hope that future multi-layer versions will be able to execute deep-learning tasks at much higher speed than they can be simulated on digital computers today.
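A software analog of the single-layer experiment is straightforward to sketch. The Python below trains a two-input perceptron on NAND with the classic error-correction rule, in which the fed-back error signal adjusts each weight, here standing in for a memristor’s conductance; the learning rate, initial weights, and epoch count are illustrative choices, not values from the paper.

def train_perceptron(samples, epochs=20, lr=0.2):
    w = [0.0, 0.0]  # synaptic weights (memristor conductances)
    b = 0.0         # bias (firing threshold)
    for _ in range(epochs):
        for x, target in samples:
            y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - y  # error signal fed back to the synapses
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# NAND is linearly separable, so a single layer suffices.
nand = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w, b = train_perceptron(nand)
for x, target in nand:
    y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
    print(f"{x} -> {y} (expected {target})")

NOR works the same way with its own truth table, while XOR would fail here, which is exactly why the team’s planned multi-layer networks matter.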

Next, besides downsizing to the nanoscale, the researchers also intend to implement multi-layer deep-learning neural networks using the third dimension: stacking network layers vertically into 3-D structures.

— R. Colin Johnson, Advanced Technology Editor, EE Times
