Study of the robustness of neural networks based on spintronic neurons


Abstract

Spintronic technology is emerging as a promising direction for the hardware implementation of the neurons and synapses of neuromorphic architectures. In particular, a single spintronic device can implement the nonlinear activation function of a neuron. Here, we propose how to implement spintronic neurons with sigmoidal and ReLU-like activation functions. We then perform a numerical experiment showing that neural networks built from spintronic neurons remain robust when each neuron has a slightly different activation function, emulating device-to-device variations in a possible hardware implementation of the network. Specifically, we consider a vanilla neural network trained to recognize the categories of the Modified National Institute of Standards and Technology (MNIST) database, and we obtain an average accuracy of 98.87% on the test dataset, very close to the 98.89% obtained in the ideal case where all neurons share the same sigmoid activation function. Similar results are obtained with neurons having a ReLU-like activation function.
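
A minimal sketch of the robustness experiment described above, assuming a single-hidden-layer vanilla network: each hidden neuron applies its own sigmoid whose slope and offset are drawn with a small random spread around the ideal values, emulating device-to-device variation. The layer sizes, the 1% spread, and the parameter names (a, b) are illustrative assumptions, not values taken from the paper, and random weights stand in for the trained ones.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 784, 100, 10        # MNIST-sized input/output; hidden width assumed

# Per-neuron activation parameters: slope a_i and offset b_i drawn around the
# ideal sigmoid (a = 1, b = 0) with a small spread mimicking device variability.
a = rng.normal(loc=1.0, scale=0.01, size=N_HID)
b = rng.normal(loc=0.0, scale=0.01, size=N_HID)

def per_neuron_sigmoid(z):
    """Sigmoid where each hidden neuron i computes sigma(a_i * z_i + b_i)."""
    return 1.0 / (1.0 + np.exp(-(a * z + b)))

# Vanilla forward pass; the weights here are random stand-ins for trained ones.
W1 = rng.normal(scale=0.05, size=(N_IN, N_HID))
W2 = rng.normal(scale=0.05, size=(N_HID, N_OUT))

def forward(x):
    h = per_neuron_sigmoid(x @ W1)        # non-identical neuron responses
    logits = h @ W2
    return np.argmax(logits, axis=1)

# Dummy batch standing in for MNIST test images (28*28 = 784 pixels in [0, 1]).
x_batch = rng.random((32, N_IN))
print(forward(x_batch))
```

The same structure applies to the ReLU-like case by replacing the per-neuron sigmoid with a per-neuron rectifier whose slope varies in the same way.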
