
Radio-Frequency Multiply-And-Accumulate Operations with Spintronic Synapses

Posted by Nathan Leroux
Publication date: 2020
Research field: Physics
Paper language: English
Authors: N. Leroux





Exploiting the physics of nanoelectronic devices is a major lead for implementing compact, fast, and energy-efficient artificial intelligence. In this work, we propose an original road in this direction, where assemblies of spintronic resonators used as artificial synapses can classify analogue radio-frequency signals directly, without digitalization. The resonators convert the radio-frequency input signals into direct voltages through the spin-diode effect. In the process, they multiply the input signals by a synaptic weight, which depends on their resonance frequency. We demonstrate through physical simulations, with parameters extracted from experimental devices, that frequency-multiplexed assemblies of resonators implement the cornerstone operation of artificial neural networks, the Multiply-And-Accumulate (MAC), directly on microwave inputs. The results show that even with a non-ideal, realistic model, the outputs obtained with our architecture remain comparable to those of a traditional MAC operation. Using a conventional machine learning framework augmented with equations describing the physics of spintronic resonators, we train a single-layer neural network to classify radio-frequency signals encoding 8x8-pixel handwritten digit pictures. The spintronic neural network recognizes the digits with an accuracy of 99.96%, equivalent to purely software neural networks. This MAC implementation offers a promising solution for fast, low-power radio-frequency classification applications, and a new building block for spintronic deep neural networks.
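To make the frequency-multiplexing idea concrete, here is a minimal numerical sketch (not the authors' simulation code): each input is encoded as the power of an RF tone on its own carrier frequency, each resonator rectifies the tone nearest its resonance with a Lorentzian-shaped sensitivity, and the series-connected chain sums the resulting DC voltages. The pure Lorentzian line shape, the linewidth, and all names and values below are illustrative assumptions; real spin-diode responses also contain antisymmetric components, and cross-talk between frequency channels is neglected here.

```python
import numpy as np

def lorentzian(detuning_ghz, linewidth_ghz=0.1):
    """Normalized Lorentzian line shape; peaks at zero detuning."""
    return linewidth_ghz**2 / (detuning_ghz**2 + linewidth_ghz**2)

def spin_diode_mac(powers, tone_freqs_ghz, res_freqs_ghz, sensitivity=1.0):
    """DC voltage of a series chain of resonators driven by multiplexed tones.

    powers         : RF input powers encoding the inputs x_j (arbitrary units)
    tone_freqs_ghz : carrier frequency of each input tone
    res_freqs_ghz  : resonance frequency of each resonator (sets the weight)
    """
    # Each resonator is assumed to rectify only its matched tone: the weight
    # is set by the detuning between tone and resonance, w_j = s * L(f_j - f_res_j).
    weights = sensitivity * lorentzian(tone_freqs_ghz - res_freqs_ghz)
    return np.dot(weights, powers)  # Multiply-And-Accumulate

# Example: 4 inputs on a 2.0-2.3 GHz frequency comb; detuning the resonators
# away from their tones reduces the corresponding synaptic weights.
tones = np.array([2.0, 2.1, 2.2, 2.3])
resonances = tones + np.array([0.0, 0.05, 0.1, 0.2])  # chosen detunings
x = np.array([1.0, 0.5, 0.8, 0.2])                    # input powers
print(spin_diode_mac(x, tones, resonances))
```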




Read also

Artificial neural networks are a valuable tool for radio-frequency (RF) signal classification in many applications, but digitization of analog signals and the use of general-purpose hardware non-optimized for training make the process slow and energetically costly. Recent theoretical work has proposed to use nano-devices called magnetic tunnel junctions, which exhibit intrinsic RF dynamics, to implement in hardware the Multiply and Accumulate (MAC) operation, a key building block of neural networks, directly using analogue RF signals. In this article, we experimentally demonstrate that a magnetic tunnel junction can perform multiplication of RF powers, with tunable positive and negative synaptic weights. Using two magnetic tunnel junctions connected in series we demonstrate the MAC operation and use it for classification of RF signals. These results open the path to embedded systems capable of analyzing RF signals with neural networks directly after the antenna, at low power cost and high speed.
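As a rough illustration of what the series connection computes (a sketch under assumptions, not the measured device response): each junction rectifies the RF tone near its own resonance into a DC voltage v_j = w_j * P_j, with a weight whose sign and magnitude are assumed tunable (for example via bias conditions), and the series connection adds the DC voltages. All names and values are hypothetical.

```python
def series_mtj_mac(powers, weights):
    """Two-input MAC from two junctions in series: V = w1*P1 + w2*P2.

    powers  : RF powers of the two frequency-multiplexed input tones
    weights : signed rectification efficiencies of the two junctions
    """
    return sum(w * p for w, p in zip(weights, powers))

# One positive and one negative synaptic weight acting on two RF powers:
print(series_mtj_mac([1.0, 0.5], [0.6, -0.4]))  # 0.6*1.0 + (-0.4)*0.5 = 0.4
```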
Carlo Baldassi (2012)
We consider the generalization problem for a perceptron with binary synapses, implementing the Stochastic Belief-Propagation-Inspired (SBPI) learning algorithm which we proposed earlier, and perform a mean-field calculation to obtain a differential equation which describes the behaviour of the device in the limit of a large number of synapses N. We show that the solving time of SBPI is of order N*sqrt(log(N)), while the similar, well-known clipped perceptron (CP) algorithm does not converge to a solution at all in the time frame we considered. The analysis gives some insight into the ongoing process and shows that, in this context, the SBPI algorithm is equivalent to a new, simpler algorithm, which only differs from the CP algorithm by the addition of a stochastic, unsupervised meta-plastic reinforcement process, whose rate of application must be less than sqrt(2/(pi * N)) for the learning to be achieved effectively. The analytical results are confirmed by simulations.
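As a rough illustration of the "CP plus stochastic reinforcement" rule the abstract refers to (a sketch under stated assumptions, not the paper's mean-field analysis): binary weights are the signs of bounded hidden states, errors drive clipped-perceptron updates, and an unsupervised meta-plastic reinforcement nudges each hidden state toward its current sign at a rate chosen below the sqrt(2/(pi*N)) bound. The bound K, the update magnitudes, and the storage (rather than generalization) setting are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, K = 1001, 500, 15                 # synapses, patterns, hidden-state bound
p_s = 0.5 * np.sqrt(2.0 / (np.pi * N))  # reinforcement rate below the bound

xi = rng.choice([-1, 1], size=(P, N))   # random +/-1 input patterns
sigma = rng.choice([-1, 1], size=P)     # random target outputs
h = rng.integers(-K, K + 1, size=N)     # hidden synaptic states

for epoch in range(200):
    errors = 0
    for mu in rng.permutation(P):
        w = np.sign(h) + (h == 0)       # binary weights (ties broken at +1)
        if sigma[mu] * np.dot(w, xi[mu]) <= 0:              # misclassified
            errors += 1
            h = np.clip(h + 2 * sigma[mu] * xi[mu], -K, K)  # clipped CP step
        # stochastic, unsupervised meta-plastic reinforcement at rate p_s
        mask = rng.random(N) < p_s
        h = np.clip(h + mask * np.sign(h), -K, K)
    if errors == 0:
        print(f"learned all {P} patterns after {epoch + 1} epochs")
        break
```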
Two neurons coupled by unreliable synapses are modeled by leaky integrate-and-fire neurons and stochastic on-off synapses. The dynamics is mapped to an iterated function system. Numerical calculations yield a multifractal distribution of interspike intervals. The Hausdorff, entropy, and correlation dimensions are calculated as a function of synaptic strength and transmission probability.
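A minimal simulation sketch of this model class (illustrative parameters, not the paper's; the excitatory sign of the coupling is an assumption): two leaky integrate-and-fire neurons, each of whose spikes is transmitted to the other only with probability p, with the interspike intervals of one neuron collected for analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
tau, v_th, v_reset = 1.0, 1.0, 0.0   # membrane time constant, threshold, reset
I, g, p = 1.2, 0.3, 0.7              # drive, synaptic strength, transmission prob.
dt, steps = 1e-3, 200_000

v = np.array([0.0, 0.5])             # membrane potentials, desynchronized start
last_spike, isis = None, []

for t in range(steps):
    v += dt * (-v + I) / tau          # leaky integration toward the drive I
    for i in (0, 1):
        if v[i] >= v_th:
            v[i] = v_reset
            if rng.random() < p:      # unreliable synapse: kick the partner
                v[1 - i] += g
            if i == 0:                # record interspike intervals of neuron 0
                if last_spike is not None:
                    isis.append((t - last_spike) * dt)
                last_spike = t

isis = np.array(isis)
print(f"{isis.size} intervals, mean {isis.mean():.3f}, std {isis.std():.3f}")
```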
We report a transition from asynchronous to oscillatory behaviour in balanced inhibitory networks for class I and II neurons with instantaneous synapses. Collective oscillations emerge for sufficiently connected networks. Their origin is understood in terms of a recently developed mean-field model, whose stable solution is a focus. Microscopic irregular firings, due to balance, trigger sustained oscillations by exciting the relaxation dynamics towards the macroscopic focus. The same mechanism induces in balanced excitatory-inhibitory networks quasi-periodic collective oscillations.
We show that discrete synaptic weights can be efficiently used for learning in large scale neural systems, and lead to unanticipated computational performance. We focus on the representative case of learning random patterns with binary synapses in single-layer networks. The standard statistical analysis shows that this problem is exponentially dominated by isolated solutions that are extremely hard to find algorithmically. Here, we introduce a novel method that allows us to find analytical evidence for the existence of subdominant and extremely dense regions of solutions. Numerical experiments confirm these findings. We also show that the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic configurations are robust to perturbations and generalize better than typical solutions. These outcomes extend to synapses with multiple states and to deeper neural architectures. The large deviation measure also suggests how to design novel algorithmic schemes for optimization based on local entropy maximization.