ANALOG DEEP LEARNING PAVES THE WAY FOR ENERGY-EFFICIENT AND FASTER COMPUTING

Abstract: Analog deep learning, a new branch of artificial intelligence, promises faster computation with lower energy use. As researchers push the limits of machine learning, the time, effort, and money needed to train ever more complex neural network models are soaring.
Just as transistors are the essential components of digital processors, programmable resistors are the fundamental building blocks of analog deep learning. By repeating arrays of programmable resistors in intricate layers, researchers have built a network of analog artificial “neurons” and “synapses” that performs computations much like a digital neural network. This network can then be trained to carry out complex AI tasks such as image recognition and natural language processing.
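To make this concrete, here is a minimal sketch, in Python, of the core operation such a resistor array performs: input voltages applied to the rows of a crossbar of conductances produce output currents on the columns, computing a matrix-vector product in a single physical step (Ohm's law summed by Kirchhoff's current law). The array size and conductance values are illustrative assumptions, and the model is idealized (noise-free, perfectly linear), not a description of the MIT devices themselves.

```python
import numpy as np

def crossbar_mvm(conductances, voltages):
    """Idealized analog crossbar: the output current on each column is
    the sum of (conductance * input voltage) over the rows -- i.e. a
    matrix-vector product computed in one physical step."""
    # I_j = sum_i V[i] * G[i, j]  (Ohm's law summed by Kirchhoff's law)
    return voltages @ conductances

# Hypothetical 4x3 array of programmable conductances (in siemens)
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # each entry is one resistor
V = np.array([0.1, 0.2, 0.0, 0.3])         # input voltages (volts)

I = crossbar_mvm(G, V)                     # column currents (amperes)
print(I)  # one "layer" of an analog network, computed in parallel
```

In a real analog accelerator, each entry of G would be the programmed conductance of one physical resistor, so the multiply-accumulate happens in the analog domain rather than in digital arithmetic units.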

WHAT IS THE GOAL OF THE RESEARCH TEAM?

An interdisciplinary MIT research team set out to increase the speed of a type of artificial analog synapse they had previously developed. By using a practical inorganic material in the fabrication process, they made their devices run a million times faster than earlier versions, which is also about a million times faster than the synapses in the human brain.

This inorganic material also makes the resistor extremely energy efficient. And unlike the materials used in the earlier version of their device, the new material is compatible with silicon fabrication techniques. This change has made it possible to fabricate devices at the nanometer scale and could open the door to their integration into commercial computing hardware for deep-learning applications.

images/20220802_2-1.jpg

▲Image source: dataconomy

Just as transistors are the essential components of digital processors, programmable resistors are the fundamental building blocks of analog deep learning

“With that key insight and the very powerful nanofabrication techniques we have at MIT.nano, we have been able to put these pieces together and demonstrate that these devices are intrinsically very fast and operate with reasonable voltages. This work has really put these devices at a point where they now look really promising for future applications,” explained senior author Jesús A. del Alamo, the Donner Professor in MIT’s Department of Electrical Engineering and Computer Science (EECS).

“The working mechanism of the device is the electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Because we are working with very thin devices, we could accelerate the motion of this ion by using a strong electric field and push these ionic devices to the nanosecond operation regime,” stated senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.

“The action potential in biological cells rises and falls with a timescale of milliseconds since the voltage difference of about 0.1 volt is constrained by the stability of water. Here we apply up to 10 volts across a special solid glass film of nanoscale thickness that conducts protons without permanently damaging it. And the stronger the field, the faster the ionic devices,” explained senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and materials science and engineering professor.
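A rough, back-of-the-envelope way to see the quoted scaling is to model proton motion as thermally activated hopping whose energy barrier is lowered by the applied field, so the hop rate, and hence the device speed, grows exponentially with voltage. The sketch below implements that textbook approximation; the film thickness, migration barrier, hop distance, and attempt frequency are all placeholder values chosen for illustration and are not taken from the paper.

```python
import numpy as np

# Toy model: field-accelerated proton hopping in a thin oxide film.
# Hop rate follows an Arrhenius form with a field-lowered barrier;
# all constants below are illustrative placeholders.
K_B = 8.617e-5          # Boltzmann constant (eV/K)
T = 300.0               # temperature (K)
BARRIER = 0.5           # assumed migration barrier (eV)
HOP_DIST = 0.5e-9       # assumed hop distance (m)
ATTEMPT = 1e13          # assumed attempt frequency (Hz)

def crossing_time(voltage, thickness):
    """Time for a proton to drift across the film at the given bias."""
    field = voltage / thickness                      # V/m
    delta = field * HOP_DIST / 2.0                   # barrier tilt (eV)
    rate = ATTEMPT * np.exp(-(BARRIER - delta) / (K_B * T))
    velocity = rate * HOP_DIST                       # m/s
    return thickness / velocity                      # s

film = 10e-9  # assumed ~10 nm film
for v in (0.1, 1.0, 10.0):
    print(f"{v:5.1f} V -> {crossing_time(v, film):.2e} s")
```

With these placeholder numbers, the crossing time drops from a fraction of a millisecond at 0.1 V to tens of nanoseconds at 10 V, which is the qualitative point of the quote: the stronger the field, the faster the ionic device.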

Thanks to these programmable resistors, a neural network can be trained far faster and more cheaply than ever before. This could accelerate the process by which scientists develop deep learning models, which could then be applied to tasks such as fraud detection, self-driving cars, or medical image analysis.
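Part of why training maps well onto such hardware is that the stochastic-gradient update for a single layer is a rank-1 outer product, which a crossbar can in principle apply to every resistor at once with coincident row and column pulses. The following sketch simulates that idea for one linear layer; the data, learning rate, and perfectly linear, noise-free conductance updates are simplifying assumptions for illustration.

```python
import numpy as np

# Minimal sketch of in-situ analog training: the gradient update for one
# layer is an outer product, which a crossbar could apply to all
# resistors in parallel. Simulated here with idealized updates.
rng = np.random.default_rng(1)
G = rng.normal(scale=0.1, size=(4, 3))   # conductance-encoded weights
lr = 0.05                                 # assumed learning rate

x = rng.normal(size=4)                    # input voltages
target = np.array([0.0, 1.0, 0.0])        # desired output

for step in range(200):
    y = x @ G                             # forward pass: analog MVM
    err = y - target                      # error at the output columns
    G -= lr * np.outer(x, err)            # parallel outer-product update

print(np.round(x @ G, 3))                 # approaches the target
```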

images/20220802_2-2.jpg

▲Image source: dataconomy

Thanks to these programmable resistors, a neural network can be trained far faster and more cheaply than ever before

“Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car; this is a spacecraft,” added lead author and MIT postdoc Murat Onen.

The research, published in Science, is titled “Nanosecond Protonic Programmable Resistors for Analog Deep Learning.” Its co-authors include Frances M. Ross, the Ellen Swallow Richards Professor in the Department of Materials Science and Engineering; postdocs Nicolas Emond and Baoming Wang; and Difei Zhang, an EECS graduate student.

Reposted from: dataconomy.com
