These Scientists are Making a New Brain with Faster Neurons



MIT researchers have created new artificial “neurons” and “synapses” that are far faster than their biological counterparts

Researchers from the Massachusetts Institute of Technology (MIT) have created new artificial “neurons” and “synapses” that exist within a new field of artificial intelligence called analog deep learning. Instead of using transistors like in digital processors, analog deep learning uses programmable resistors to “create a network of analog artificial ‘neurons’ and ‘synapses’” that can exceed the performance of a digital neural network, while using a fraction of the energy.

The MIT team built its artificial neurons and synapses using a new inorganic material in the fabrication process, making the resulting devices one million times faster than previous iterations, which is also roughly one million times faster than the synapses found in the human brain. The new material is compatible with existing silicon fabrication techniques, meaning it can be used to create nanometer-scale devices and potentially integrate the technology with existing computing hardware to facilitate deep-learning applications.

“Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car, this is a spacecraft,” said lead author and MIT postdoc Murat Onen.

“The speed certainly was surprising. Normally, we would not apply such extreme fields across devices, to not turn them into ash. But instead, protons ended up shuttling at immense speeds across the device stack, specifically a million times faster compared to what we had before. And this movement doesn’t damage anything, thanks to the small size and low mass of protons. It is almost like teleporting. The nanosecond timescale means we are close to the ballistic or even quantum tunneling regime for the proton, under such an extreme field,” said senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering.

The abstract of the team’s paper summarizes the work: “Nanoscale ionic programmable resistors for analog deep learning are 1000 times smaller than biological cells, but it is not yet clear how much faster they can be relative to neurons and synapses. Scaling analyses of ionic transport and charge-transfer reaction rates point to operation in the nonlinear regime, where extreme electric fields are present within the solid electrolyte and its interfaces. In this work, we generated silicon-compatible nanoscale protonic programmable resistors with highly desirable characteristics under extreme electric fields. This operation regime enabled controlled shuttling and intercalation of protons in nanoseconds at room temperature in an energy-efficient manner. The devices showed symmetric, linear, and reversible modulation characteristics with many conductance states covering a 20× dynamic range. Thus, the space-time-energy performance of the all-solid-state artificial synapses can greatly exceed that of their biological counterparts.”


Accelerating Deep Learning

Analog deep learning is faster and more energy-efficient than its digital counterpart for two main reasons. First, computation is performed in memory, so enormous loads of data are not transferred back and forth from memory to a processor. Second, analog processors conduct operations in parallel: if the matrix size expands, an analog processor doesn’t need more time to complete the new operations, because all of the computation occurs simultaneously.

The key element of MIT’s new analog processor technology is known as a protonic programmable resistor. These resistors, which are measured in nanometers (one nanometer is one-billionth of a meter), are arranged in an array, like a chessboard.
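The article does not include code, but the idea behind such an array can be sketched numerically. In a resistor crossbar, each device’s conductance acts as a weight; applying input voltages to the columns produces row currents that are, by Ohm’s and Kirchhoff’s laws, a matrix-vector product. A digital simulation has to sum the products one by one, whereas the analog array settles all rows at once, which is why larger matrices cost no extra time. The values and variable names below are illustrative, not from the paper:

```python
import numpy as np

# Illustrative sketch: a crossbar of programmable resistors computing a
# matrix-vector product. G[i][j] is the conductance (weight) of the
# resistor at row i, column j; V[j] is the voltage applied to column j.
# The current collected on row i is I[i] = sum_j G[i][j] * V[j].
# In analog hardware every row settles simultaneously; here we simulate
# the same sum digitally.
G = np.array([[1.0e-6, 2.0e-6],
              [3.0e-6, 0.5e-6]])   # conductances in siemens (the "weights")
V = np.array([0.2, 0.1])           # input voltages in volts

I = G @ V                          # row currents: the layer's output
print(I)                           # [4.0e-07 6.5e-07] amperes
```

The same picture explains the in-memory advantage mentioned above: the weights never leave the array, because the array itself is both the memory and the multiplier.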

In the human brain, learning happens due to the strengthening and weakening of connections between neurons, called synapses. Deep neural networks have long adopted this strategy, where the network weights are programmed through training algorithms. In the case of this new processor, increasing and decreasing the electrical conductance of protonic resistors enables analog machine learning.
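As a rough sketch of what “increasing and decreasing the electrical conductance” means for training, each synaptic weight can be modeled as a conductance nudged up or down by discrete programming pulses, clipped to a finite dynamic range (the paper reports many linear, symmetric conductance states spanning a roughly 20× range). The function name, step count, and units below are hypothetical illustrations:

```python
import numpy as np

# Hypothetical model of analog weight programming: a synapse's weight is
# a conductance adjusted by discrete pulses within a finite range.
G_MIN, G_MAX = 1.0, 20.0           # 20x dynamic range (arbitrary units)
STEP = (G_MAX - G_MIN) / 100       # ~100 discrete conductance states

def program(g, n_pulses):
    """Apply n_pulses to conductance g: positive pulses potentiate
    (strengthen) the synapse, negative pulses depress (weaken) it."""
    return float(np.clip(g + n_pulses * STEP, G_MIN, G_MAX))

g = 10.0
g = program(g, +5)    # strengthen the connection
g = program(g, -2)    # weaken it slightly
print(g)              # ~10.57
```

A training algorithm would choose the sign and number of pulses for each device from the network’s error signal, just as a digital optimizer chooses weight updates.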
