Neural networks are able to extract information from the timing of spikes. Here we provide new results on the behavior of the simplest neuronal model able to decode information embedded in temporal spike patterns, the so-called tempotron. Using statistical physics techniques we compute the capacity for the case of sparse, time-discretized inputs and discrete synapses, showing that the device saturates the information-theoretic bounds with output spike statistics consistent with those of the inputs. We also derive two simple and highly efficient learning algorithms that learn a number of associations close to the theoretical limit.
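Since this abstract centers on the tempotron's decision and learning rules, a compact illustration may help. The sketch below is a generic tempotron with continuous weights, not the paper's discrete-synapse setting: each afferent fires at most one spike per pattern at a discrete time, the potential is a weighted sum of double-exponential PSP kernels, and an error nudges the weights by the PSPs evaluated at the time of the voltage maximum. All sizes and constants (N, T, TAU, TAU_S, THETA, LR) are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

N     = 50      # afferents
T     = 100     # discrete time bins per pattern
P     = 20      # number of input/output associations to learn
TAU   = 15.0    # membrane time constant (in bins)
TAU_S = 3.75    # synaptic time constant (in bins)
THETA = 1.0     # firing threshold
LR    = 0.01    # learning rate

def kernel(dt):
    """Causal double-exponential PSP kernel (unnormalized)."""
    return np.where(dt >= 0, np.exp(-dt / TAU) - np.exp(-dt / TAU_S), 0.0)

def voltage_trace(w, spike_times):
    """Membrane potential at every bin; spike_times[i] < 0 marks a silent afferent."""
    psp = kernel(np.arange(T)[:, None] - spike_times[None, :])  # (T, N)
    psp[:, spike_times < 0] = 0.0
    return psp @ w

# Sparse patterns: each afferent emits at most one spike, with probability 1/2.
spikes = np.where(rng.random((P, N)) < 0.5, rng.integers(0, T, (P, N)), -1)
labels = rng.integers(0, 2, P)          # desired output: spike (1) or no spike (0)
w = rng.normal(0.0, 0.01, N)

for epoch in range(200):
    errors = 0
    for mu in rng.permutation(P):
        v = voltage_trace(w, spikes[mu])
        t_max = int(np.argmax(v))
        if (v[t_max] > THETA) != bool(labels[mu]):
            errors += 1
            # Nudge the weights by the PSPs evaluated at the voltage maximum.
            dpsp = kernel(t_max - spikes[mu])
            dpsp[spikes[mu] < 0] = 0.0
            w += (LR if labels[mu] else -LR) * dpsp
    if errors == 0:
        print(f"all {P} associations learned after {epoch + 1} epochs")
        break
```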
Recent advances in deep learning and neural networks have led to an increased interest in the application of generative models, in particular restricted Boltzmann machines (RBMs) and variational autoencoders, in statistical and condensed matter physics.
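For concreteness, here is a minimal Bernoulli-Bernoulli RBM trained with one-step contrastive divergence (CD-1). This is the textbook construction, not the specific models or physics applications studied in the paper; the toy data and hyperparameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli restricted Boltzmann machine trained with CD-1."""
    def __init__(self, n_vis, n_hid, lr=0.05):
        self.W = rng.normal(0, 0.01, (n_vis, n_hid))
        self.a = np.zeros(n_vis)   # visible biases
        self.b = np.zeros(n_hid)   # hidden biases
        self.lr = lr

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_step(self, v0):
        """One contrastive-divergence update on a minibatch v0 of shape (batch, n_vis)."""
        ph0, h0 = self.sample_h(v0)
        pv1, v1 = self.sample_v(h0)
        ph1, _  = self.sample_h(v1)
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ ph0 - v1.T @ ph1) / batch
        self.a += self.lr * (v0 - v1).mean(axis=0)
        self.b += self.lr * (ph0 - ph1).mean(axis=0)

# Toy data: noisy copies of two binary prototypes.
protos = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]], dtype=float)
data = protos[rng.integers(0, 2, 500)]
data = np.abs(data - (rng.random(data.shape) < 0.05))  # 5% bit flips

rbm = RBM(n_vis=6, n_hid=4)
for _ in range(2000):
    rbm.cd1_step(data[rng.integers(0, len(data), 32)])

# Reconstruction check on the clean prototypes.
_, h = rbm.sample_h(protos)
pv, _ = rbm.sample_v(h)
print(np.round(pv, 2))
```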
Anticipation is a strategy used by neural fields to compensate for transmission and processing delays during the tracking of dynamical information, and can be achieved by slow, localized, inhibitory feedback mechanisms such as short-term synaptic depression.
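A toy discretization can make the setting concrete. The sketch below simulates a 1-D periodic neural field whose input reports a delayed stimulus position and whose recurrent synapses undergo short-term depression; it simply prints the offset between the activity bump and the true stimulus position. The Mexican-hat kernel, the rectified-linear rate function, and all parameter values are my assumptions and are not tuned to reproduce the paper's anticipation regime.

```python
import numpy as np

L_DOM, NX, DT = 20.0, 256, 0.01   # domain length, grid points, time step
x  = np.linspace(-L_DOM / 2, L_DOM / 2, NX, endpoint=False)
dx = x[1] - x[0]

TAU, TAU_D = 1.0, 5.0             # membrane / depression time constants
BETA       = 0.2                  # depression strength per unit rate
DELAY      = 0.3                  # input transmission delay
V_STIM     = 1.0                  # stimulus speed
T_END      = 8.0

def ring_dist(a, b):
    """Pairwise distances on the periodic domain."""
    d = np.abs(a[:, None] - b[None, :])
    return np.minimum(d, L_DOM - d)

# Mexican-hat recurrent kernel: local excitation, broader inhibition.
D = ring_dist(x, x)
W = (2.0 * np.exp(-D**2 / 2.0) - 1.0 * np.exp(-D**2 / 18.0)) * dx

f = lambda u: np.maximum(u, 0.0)  # rectified-linear rate function

u = np.zeros(NX)                  # field activity
p = np.ones(NX)                   # available synaptic resources (STD variable)

for step in range(int(T_END / DT)):
    t = step * DT
    # The field receives the stimulus position as it was DELAY ago.
    pos  = V_STIM * (t - DELAY)
    stim = 2.0 * np.exp(-ring_dist(x, np.array([pos]))[:, 0]**2 / 0.5)
    rec  = W @ (p * f(u))         # recurrent input, scaled by depression
    u += DT * (-u + rec + stim) / TAU
    p += DT * ((1.0 - p) / TAU_D - BETA * p * f(u))

offset = x[np.argmax(u)] - V_STIM * T_END
print(f"bump peak minus true (undelayed) stimulus position: {offset:+.3f}")
```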
A universal supervised neural network (NN) is constructed to compute the criticalities associated with real experiments studying phase transitions. The validity of the constructed NN is examined by applying it to calculate the criticalities of several …
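The abstract's method belongs to the supervised phase-classification family, so a self-contained toy version is sketched below: a small 2-D Ising model stands in for the data, and a logistic-regression classifier (a stand-in for the paper's NN, whose architecture is not given here) is trained on configurations labeled deep inside each phase; the temperature at which its output crosses 50% estimates the critical point. Lattice size, sweep counts, and the learning rate are arbitrary, and at L = 8 the crossing will be finite-size shifted relative to the infinite-lattice value.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 8                                 # lattice side of the toy 2-D Ising model

def ising_sample(T, sweeps=100):
    """Metropolis-sampled L x L Ising configuration at temperature T."""
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps * L * L):
        a, b = rng.integers(0, L, 2)
        nb = s[(a + 1) % L, b] + s[(a - 1) % L, b] + s[a, (b + 1) % L] + s[a, (b - 1) % L]
        dE = 2 * s[a, b] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[a, b] *= -1
    if s.sum() < 0:                   # fix the global Z2 symmetry for the classifier
        s = -s
    return s.ravel().astype(float)

# Training data labeled deep inside each phase, far from the transition.
X = np.array([ising_sample(T) for T in [1.5] * 40 + [3.5] * 40])
y = np.array([0] * 40 + [1] * 40)     # 0 = ordered, 1 = disordered

# Minimal logistic-regression "network" trained by gradient descent.
w, b = np.zeros(L * L), 0.0
for _ in range(500):
    prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = prob - y
    w -= 0.1 * X.T @ g / len(y)
    b -= 0.1 * g.mean()

# The 50% output crossing along a temperature scan estimates the critical point.
for T in np.arange(1.8, 2.9, 0.2):
    test = np.array([ising_sample(T) for _ in range(10)])
    out = (1.0 / (1.0 + np.exp(-(test @ w + b)))).mean()
    print(f"T = {T:.1f}   mean P(disordered) = {out:.2f}")
```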
Maximum entropy models provide the least constrained probability distributions that reproduce statistical properties of experimental datasets. In this work we characterize the learning dynamics that maximizes the log-likelihood in the case of large b…
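The learning dynamics in question is gradient ascent on the log-likelihood, which for a pairwise (Ising-type) maximum entropy model reduces to moment matching: each parameter moves by the difference between the data moment and the model moment. Below is a minimal sketch using exact enumeration of a small system; the large-size regime analyzed in the paper is not reproduced, and the teacher model, sample size, and learning rate are placeholders.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
N = 5                                  # small enough to enumerate all 2^N states

states = np.array(list(product([0, 1], repeat=N)), dtype=float)   # (2^N, N)

def model_stats(h, J):
    """Exact means and pairwise moments of the pairwise maximum entropy model."""
    E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    return p @ states, states.T @ (states * p[:, None]), p

# Synthetic "data": samples drawn from a random teacher model.
h_true = rng.normal(0.0, 1.0, N)
J_true = rng.normal(0.0, 0.5, (N, N))
J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0.0)
_, _, p_true = model_stats(h_true, J_true)
data = states[rng.choice(len(states), size=5000, p=p_true)]
d_mean, d_corr = data.mean(0), data.T @ data / len(data)

# Log-likelihood ascent: each parameter moves by (data moment - model moment).
h, J = np.zeros(N), np.zeros((N, N))
for step in range(3000):
    m_mean, m_corr, _ = model_stats(h, J)
    h += 0.1 * (d_mean - m_mean)
    dJ = 0.1 * (d_corr - m_corr)
    np.fill_diagonal(dJ, 0.0)          # diagonal moments are handled by h
    J += dJ

m_mean, m_corr, _ = model_stats(h, J)
print("max moment mismatch:",
      max(np.abs(d_mean - m_mean).max(), np.abs(d_corr - m_corr).max()))
```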
Neuronal ensemble inference is a significant problem in the study of biological neural networks. Various methods have been proposed for ensemble inference from experimental data of neuronal activity. Among them, the Bayesian inference approach with generative …
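One common generative model for ensemble inference assumes each neuron belongs to one ensemble and fires with a high rate in time bins where its ensemble is active and a low rate otherwise; Bayesian inference then alternates between sampling memberships and ensemble activities. The Gibbs sampler below is a deliberately reduced version of this idea: the firing rates and the number of ensembles are fixed and known, and the priors are flat, simplifications that published models do not make.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(4)

# Synthetic data: N neurons in K ensembles; a neuron fires with prob LAM1 in a
# time bin where its ensemble is active and with prob LAM0 otherwise.
N, T, K, LAM1, LAM0 = 30, 400, 3, 0.7, 0.05
z_true = rng.integers(0, K, N)                       # true memberships
A_true = (rng.random((K, T)) < 0.3).astype(int)      # true ensemble activities
X = (rng.random((N, T)) < np.where(A_true[z_true], LAM1, LAM0)).astype(int)

def log_bern(x, p):
    return x * np.log(p) + (1 - x) * np.log(1 - p)

# Gibbs sampler over memberships z and activities A, with flat priors and the
# rates LAM1/LAM0 assumed known (both simplifications).
z = rng.integers(0, K, N)
A = rng.integers(0, 2, (K, T))
for sweep in range(50):
    # Resample each ensemble's activity in every bin given the memberships.
    for k in range(K):
        members = X[z == k]
        ll1 = log_bern(members, LAM1).sum(0)
        ll0 = log_bern(members, LAM0).sum(0)
        A[k] = rng.random(T) < 1.0 / (1.0 + np.exp(ll0 - ll1))
    # Resample each neuron's membership given the activities.
    ll = np.stack([log_bern(X, np.where(A[k], LAM1, LAM0)).sum(1)
                   for k in range(K)])               # (K, N)
    p_z = np.exp(ll - ll.max(0))
    p_z /= p_z.sum(0)
    z = np.array([rng.choice(K, p=p_z[:, i]) for i in range(N)])

# Recovery accuracy up to a permutation of ensemble labels.
acc = max(np.mean(np.array(perm)[z] == z_true) for perm in permutations(range(K)))
print(f"membership recovery accuracy: {acc:.2f}")
```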