Two neurons coupled by unreliable synapses are modeled as leaky integrate-and-fire units connected by stochastic on-off synapses. The dynamics are mapped onto an iterated function system. Numerical calculations yield a multifractal distribution of interspike intervals. The Hausdorff, entropy, and correlation dimensions are calculated as functions of synaptic strength and transmission probability.
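To illustrate how an iterated function system of this kind produces a fractal set, the following minimal sketch iterates two affine maps chosen at random, mimicking a synapse that transmits with probability p. All parameter values are invented for illustration, not taken from the paper:

```python
import math
import random

# Minimal sketch (parameter values invented, not taken from the paper): the
# state between spikes follows an affine map whose offset depends on whether
# the unreliable synapse transmitted, i.e. an iterated function system whose
# attractor is a fractal set.
a = 0.4                  # contraction factor set by the leak (assumed)
b_on, b_off = 0.4, 0.0   # offsets for a transmitted / failed pulse (assumed)
p = 0.7                  # synaptic transmission probability

random.seed(0)
x, points = 0.0, []
for _ in range(20000):
    x = a * x + (b_on if random.random() < p else b_off)
    points.append(x)

# Crude box-counting estimate of the attractor's fractal dimension; for these
# two non-overlapping maps the exact similarity dimension is
# log(2)/log(1/a) ~ 0.76.
def n_boxes(pts, eps):
    return len({int(v / eps) for v in pts})

dim = math.log(n_boxes(points, 0.001) / n_boxes(points, 0.01)) / math.log(10.0)
print(round(dim, 2))
```

The estimated dimension lies strictly between 0 and 1, the signature of a Cantor-like set of visited states rather than a full interval.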
We consider the generalization problem for a perceptron with binary synapses, implementing the Stochastic Belief-Propagation-Inspired (SBPI) learning algorithm which we proposed earlier, and perform a mean-field calculation to obtain a differential equation describing the behaviour of the device in the limit of a large number of synapses N. We show that the solving time of SBPI is of order N*sqrt(log(N)), while the similar, well-known clipped perceptron (CP) algorithm does not converge to a solution at all in the time frame we considered. The analysis gives some insight into the learning process and shows that, in this context, SBPI is equivalent to a new, simpler algorithm, which differs from CP only by the addition of a stochastic, unsupervised meta-plastic reinforcement process, whose rate of application must be less than sqrt(2/(pi*N)) for learning to be achieved effectively. The analytical results are confirmed by simulations.
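A rough sketch of the simplified algorithm the abstract describes, i.e. the CP rule plus a stochastic, unsupervised reinforcement applied at a rate below sqrt(2/(pi*N)). The implementation details (bounded integer hidden states, reinforcement applied after correct classifications, all sizes) are assumptions for illustration, not taken verbatim from the paper:

```python
import numpy as np

# Sketch (details assumed, not from the paper): clipped perceptron on integer
# hidden states h, weights w = sign(h), plus stochastic meta-plastic
# reinforcement at a rate below the critical value sqrt(2/(pi*N)).
rng = np.random.default_rng(0)
N, P, K = 101, 15, 10                    # synapses, patterns, hidden-state bound
h = rng.integers(-3, 4, size=N)          # integer hidden states
patterns = rng.choice([-1, 1], size=(P, N))
labels = rng.choice([-1, 1], size=P)
rate = 0.5 * np.sqrt(2.0 / (np.pi * N)) # below the critical reinforcement rate

for epoch in range(300):
    errors = 0
    for xi, sigma in zip(patterns, labels):
        w = np.sign(h) + (h == 0)        # clipped binary weights
        if sigma * (w @ xi) <= 0:        # error: standard CP update
            h = np.clip(h + 2 * sigma * xi, -K, K)
            errors += 1
        elif rng.random() < rate:        # correct: occasional reinforcement,
            h = np.clip(h + w, -K, K)    # pushing hidden states away from zero
    if errors == 0:
        break
print(epoch, errors)
```

The reinforcement step only consolidates the current sign of each hidden state; it carries no error signal, which is why the abstract calls it unsupervised.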
Exploiting the physics of nanoelectronic devices is a major lead for implementing compact, fast, and energy-efficient artificial intelligence. In this work, we propose an original road in this direction, where assemblies of spintronic resonators used as artificial synapses can classify analogue radio-frequency signals directly, without digitalization. The resonators convert the radio-frequency input signals into direct voltages through the spin-diode effect. In the process, they multiply the input signals by a synaptic weight, which depends on their resonance frequency. We demonstrate through physical simulations, with parameters extracted from experimental devices, that frequency-multiplexed assemblies of resonators implement the cornerstone operation of artificial neural networks, the Multiply-And-Accumulate (MAC), directly on microwave inputs. The results show that even with a non-ideal, realistic model, the outputs obtained with our architecture remain comparable to those of a traditional MAC operation. Using a conventional machine learning framework augmented with equations describing the physics of spintronic resonators, we train a single-layer neural network to classify radio-frequency signals encoding 8x8-pixel handwritten digit images. The spintronic neural network recognizes the digits with an accuracy of 99.96%, equivalent to purely software neural networks. This MAC implementation offers a promising solution for fast, low-power radio-frequency classification applications, and a new building block for spintronic deep neural networks.
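A toy numerical illustration of the frequency-multiplexed MAC idea: each input is a tone at its own frequency, and each resonator rectifies mainly the tone at its resonance, so the summed DC output approximates a dot product. All frequencies, linewidths, and weights below are invented, and using signed amplitudes is a simplification of the power-rectifying spin-diode response:

```python
import numpy as np

# Toy model (all values invented): resonator k has a Lorentzian spin-diode
# sensitivity centred on its resonance frequency, so it rectifies mostly the
# input tone at that frequency and multiplies it by its synaptic weight.
def lorentzian(f, f_res, df=0.05):
    return 1.0 / (1.0 + ((f - f_res) / df) ** 2)

f_in = np.array([1.0, 1.2, 1.4, 1.6])    # input tone frequencies (GHz, assumed)
x = np.array([0.3, -0.1, 0.8, 0.5])      # input amplitudes (signed, simplified)
f_res = np.array([1.0, 1.2, 1.4, 1.6])   # resonance frequencies of the chain
w = np.array([0.9, -0.4, 0.2, 0.7])      # synaptic weights (assumed)

# Summed rectified voltage: each resonator picks out "its" tone, plus a small
# cross-talk from neighbouring tones in the band.
V = sum(w[k] * sum(x[j] * lorentzian(f_in[j], f_res[k]) for j in range(4))
        for k in range(4))
print(round(V, 3), round(float(w @ x), 3))
```

With a linewidth well below the tone spacing, the cross-talk terms stay small and the rectified sum tracks the ideal MAC result closely.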
Anomalously localized states (ALS) at the critical point of the Anderson transition are studied for the SU(2) model belonging to the two-dimensional symplectic class. We give a quantitative definition of ALS in order to clarify their statistical properties, and present the system-size dependence of the probability of finding ALS at criticality. We find that this probability increases with the system size and that ALS exist with a finite probability even in an infinite critical system, although typical critical states remain multifractal. This implies that ALS should be eliminated from an ensemble of critical states when studying critical properties from distributions of critical quantities. As a demonstration of the effect of ALS on critical properties, we show that the distribution function of the correlation dimension of critical wavefunctions becomes a delta function in the thermodynamic limit only if ALS are eliminated.
We study the recognition capabilities of the Hopfield model with auxiliary hidden layers, which emerge naturally upon a Hubbard-Stratonovich transformation. We show that the recognition capabilities of such a model at zero temperature outperform those of the original Hopfield model, due to a substantial increase of the storage capacity and the lack of a naturally defined basin of attraction. The modified model does not fall abruptly into a regime of complete confusion when the memory load exceeds a sharp threshold.
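For context, a minimal sketch of zero-temperature recall in the original Hopfield model that serves as the baseline here (the hidden-layer variant is not reproduced; the network size, load, and noise level are arbitrary):

```python
import numpy as np

# Baseline sketch: zero-temperature recall in the standard Hopfield model
# with Hebbian couplings, at a load well below the classical capacity.
rng = np.random.default_rng(1)
N, P = 200, 10
xi = rng.choice([-1, 1], size=(P, N))    # stored patterns
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0)                   # Hebbian couplings, no self-coupling

s = xi[0].copy()
flip = rng.choice(N, size=30, replace=False)
s[flip] *= -1                            # corrupt 15% of the cue
for _ in range(20):                      # synchronous zero-temperature updates
    s = np.where(W @ s >= 0, 1, -1)
overlap = float(s @ xi[0]) / N           # recall quality, 1.0 = perfect
print(overlap)
```

At this load (P/N = 0.05, well under the classical capacity of about 0.138 N) the noisy cue relaxes back into the stored pattern; past the capacity threshold the original model collapses into confusion, the abrupt behaviour the modified model avoids.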
Many real-world complex systems have a small-world topology, characterized by high clustering of nodes and short path lengths. It is well known that higher clustering drives localization while shorter path lengths support delocalization of the eigenvectors of networks. Using the multifractal technique, we investigate the localization properties of the eigenvectors of the adjacency matrices of small-world networks constructed with the Watts-Strogatz algorithm. We find that the central part of the eigenvalue spectrum is characterized by strong multifractality, whereas the tail parts of the spectrum have Dq -> 1. Before the onset of the small-world transition, an increase in the number of random connections enhances eigenvector localization, whereas just after the onset the eigenvectors show a gradual decrease in localization. We also verify the existence of a sharp change in the correlation dimension at the localization-delocalization transition.
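A rough sketch of the kind of computation involved (not the paper's exact procedure): build a Watts-Strogatz small-world network, diagonalize its adjacency matrix, and probe eigenvector localization through the inverse participation ratio, the q = 2 moment underlying the correlation dimension D2. Sizes and the rewiring probability are arbitrary:

```python
import numpy as np

# Sketch (not the paper's exact procedure): Watts-Strogatz ring with k
# neighbours per side, each edge rewired with probability p; eigenvector
# localization is probed via the inverse participation ratio
# IPR = sum_i |psi_i|^4, which ranges from 1/N (extended) to 1 (localized).
rng = np.random.default_rng(2)
N, k, p = 500, 2, 0.01

A = np.zeros((N, N))
for i in range(N):
    for d in range(1, k + 1):
        j = (i + d) % N
        if rng.random() < p:             # rewire this edge to a random node
            j = rng.integers(N)
            while j == i or A[i, j]:
                j = rng.integers(N)
        A[i, j] = A[j, i] = 1

vals, vecs = np.linalg.eigh(A)           # eigenvalues in ascending order
ipr = np.sum(vecs ** 4, axis=0)          # one IPR per eigenvector (column)
centre, tail = ipr[N // 2], ipr[-1]      # mid-spectrum vs spectral-edge state
print(round(float(centre), 4), round(float(tail), 4))
```

Scaling the IPR with N (rather than inspecting one size) is what yields the generalized dimension D2 and the multifractal characterization the abstract refers to.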
Johannes Kestler, Wolfgang Kinzel (2006). "Multifractal distribution of spike intervals for two neurons with unreliable synapses".