
Associative memory of phase-coded spatiotemporal patterns in leaky Integrate and Fire networks

Published by Ferdinando Giacco
Publication date: 2012
Research field: Biology
Paper language: English





We study the collective dynamics of a Leaky Integrate and Fire network in which precise relative phase relationships of spikes among neurons are stored, as attractors of the dynamics, and selectively replayed at different time scales. Using an STDP-based learning process, we store several phase-coded spike patterns in the connectivity, and we find that, depending on the excitability of the network, different working regimes are possible, with transient or persistent replay activity induced by a brief signal. We introduce an order parameter to evaluate the similarity between stored and recalled phase-coded patterns, and measure the storage capacity. Modulating the spiking thresholds during replay changes the frequency of the collective oscillation or the number of spikes per cycle while preserving the phase relationships. This allows a coding scheme in which phase, rate and frequency are dissociable. Robustness with respect to noise and heterogeneity of neuron parameters is studied, showing that, since the dynamics is a retrieval process, the neurons preserve stable, precise phase relationships among units and keep a unique oscillation frequency, even in noisy conditions and with heterogeneous internal parameters of the units.
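The threshold modulation described above can be illustrated with a single leaky integrate-and-fire neuron: raising the spiking threshold slows the firing without changing the form of the dynamics. This is a minimal sketch, not the paper's full recurrent network; all parameter names and values (`tau`, `i_ext`, `v_thresh`, etc.) are illustrative assumptions.

```python
import numpy as np

def lif_simulate(i_ext, tau=20.0, v_rest=0.0, v_reset=0.0,
                 v_thresh=1.0, dt=0.1, t_max=200.0):
    """Euler integration of one leaky integrate-and-fire neuron.

    dV/dt = -(V - v_rest)/tau + i_ext; a spike is emitted and V is
    reset to v_reset whenever V crosses v_thresh.
    """
    n_steps = int(t_max / dt)
    v = v_rest
    spike_times = []
    for step in range(n_steps):
        v += dt * (-(v - v_rest) / tau + i_ext)
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# A higher threshold lowers the firing rate while the subthreshold
# trajectory, and hence any stored phase relations, keeps its shape.
fast = lif_simulate(i_ext=0.1, v_thresh=1.0)
slow = lif_simulate(i_ext=0.1, v_thresh=1.5)
```

With the illustrative constant drive above, both neurons fire periodically, the higher-threshold one at a lower rate.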


Read also

We analyse the storage and retrieval capacity of a recurrent neural network of spiking integrate-and-fire neurons. In the model we distinguish between a learning mode, during which the synaptic connections change according to a Spike-Timing Dependent Plasticity (STDP) rule, and a recall mode, in which the connection strengths are no longer plastic. Our findings show the ability of the network to store and recall periodic phase-coded patterns after a small number of neurons has been stimulated. The self-sustained dynamics selectively produces an oscillating spiking activity that matches one of the stored patterns, depending on the initialization of the network.
We study the storage of multiple phase-coded patterns as stable dynamical attractors in recurrent neural networks with sparse connectivity. To determine the synaptic strengths of the existing connections and store the phase-coded patterns, we introduce a learning rule inspired by spike-timing dependent plasticity (STDP). We find that, after learning, the spontaneous dynamics of the network replays one of the stored dynamical patterns, depending on the network initialization. We study the network capacity as a function of topology, and find that a small-world-like topology may be optimal, as a compromise between the high wiring cost of long-range connections and the increase in capacity.
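The STDP rules used in the two abstracts above share the standard exponential pairing window, in which the sign of the weight change depends on the order of pre- and postsynaptic spikes. The sketch below is the textbook window, not the specific learning rule of either paper; the amplitudes and time constants are illustrative assumptions.

```python
import numpy as np

def stdp_weight_change(dt, a_plus=0.1, a_minus=0.12,
                       tau_plus=20.0, tau_minus=20.0):
    """Exponential STDP window.

    dt = t_post - t_pre. Potentiation when the presynaptic spike
    precedes the postsynaptic one (dt > 0), depression otherwise.
    """
    return np.where(dt > 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))
```

Because the window is antisymmetric in sign, repeated presentation of a phase-coded pattern strengthens exactly the connections whose pre-post spike lags match the stored phase differences.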
Statistical properties of spike trains, as well as other neurophysiological data, suggest a number of mathematical models of neurons. These models range from entirely descriptive ones to those deduced from the properties of real neurons. One of them, the diffusion leaky integrate-and-fire neuronal model, which is based on the Ornstein-Uhlenbeck stochastic process restricted by an absorbing barrier, can describe a wide range of neuronal activity in terms of its parameters. These parameters are readily associated with known physiological mechanisms. The other model, the Gamma renewal process, is descriptive, and its parameters only reflect the observed experimental data or assumed theoretical properties. Both of these commonly used models are related here. We show under which conditions the Gamma model is an output of the diffusion Ornstein-Uhlenbeck model. In some cases the Gamma distribution turns out to be unattainable for the employed parameters of the Ornstein-Uhlenbeck process.
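In the diffusion LIF model described above, an interspike interval is the first-passage time of an Ornstein-Uhlenbeck process through the absorbing barrier. A minimal Euler-Maruyama sketch, with all parameter values (`mu`, `tau`, `sigma`, `barrier`) chosen for illustration rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def ou_first_passage(mu=1.5, tau=10.0, sigma=0.5, barrier=1.0,
                     dt=0.01, t_max=1000.0):
    """First-passage time of an Ornstein-Uhlenbeck process through an
    absorbing barrier -- the interspike interval of the diffusion LIF model.

    dV = (mu - V/tau) dt + sigma dW, with V(0) = 0.
    Returns None if the barrier is not reached before t_max.
    """
    v, t = 0.0, 0.0
    while t < t_max:
        v += dt * (mu - v / tau) + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if v >= barrier:
            return t
    return None

# A sample of interspike intervals; their histogram is what one would
# compare against a fitted Gamma density.
isis = [ou_first_passage() for _ in range(200)]
```

Comparing the empirical distribution of `isis` against a fitted Gamma density is the kind of check the abstract's model comparison formalizes.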
The effects of nonlocal and reflecting connectivity are investigated in coupled Leaky Integrate-and-Fire (LIF) elements, which mimic the exchange of electrical signals between neurons. Earlier investigations have demonstrated that nonlocal and hierarchical network connectivity often induces complex synchronization patterns and chimera states in systems of coupled oscillators. In the LIF system we show that, if the elements are nonlocally linked with positive diffusive coupling in a ring architecture, the system splits into a number of alternating domains. Half of these domains contain elements whose potential stays near the threshold, while they are interrupted by active domains, where the elements perform regular LIF oscillations. The active domains move around the ring with a constant velocity, depending on the system parameters. The idea of introducing reflecting nonlocal coupling in LIF networks originates from the signal exchange between neurons residing in the two hemispheres of the brain. We show evidence that this connectivity induces novel complex spatial and temporal structures: for relatively extensive ranges of parameter values, the system splits into two coexisting domains, one in which all elements stay near threshold and one where incoherent states develop with a multileveled mean-phase-velocity distribution.
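The ring setup with positive diffusive nonlocal coupling can be sketched in a few lines: each element is coupled to the mean potential of its 2r nearest neighbours on the ring. This is a toy version under assumed parameters (`n`, `r`, `sigma`, `mu`), not the study's actual system, and it omits the reflecting-coupling variant entirely.

```python
import numpy as np

def coupled_lif_ring(n=60, r=10, sigma=0.3, mu=1.2, v_th=1.0,
                     dt=0.01, n_steps=5000):
    """Ring of LIF elements, each diffusively coupled to its 2r
    nearest neighbours (nonlocal coupling).

    dv_i/dt = mu - v_i + sigma * (mean of neighbours - v_i),
    with fire-and-reset at v_th. Returns the final potentials.
    """
    v = np.random.default_rng(1).uniform(0.0, v_th, n)
    for _ in range(n_steps):
        # nonlocal diffusive term: mean over the 2r ring neighbours
        neigh = sum(np.roll(v, k) for k in range(-r, r + 1) if k != 0) / (2 * r)
        v += dt * (mu - v + sigma * (neigh - v))
        v[v >= v_th] = 0.0  # fire and reset
    return v

potentials = coupled_lif_ring()
```

Tracking which elements fire over time in such a simulation is how the alternating active and near-threshold domains would be observed.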
Spiking neural networks (SNNs) based on the Leaky Integrate and Fire (LIF) model have been applied to energy-efficient temporal and spatiotemporal processing tasks. Thanks to its bio-plausible neuronal dynamics and simplicity, the LIF-SNN benefits from event-driven processing; however, it usually suffers from reduced performance. This may be because in a LIF-SNN the neurons transmit information only via binary spikes. To address this issue, in this work we propose a Leaky Integrate and Analog Fire (LIAF) neuron model, so that analog values can be transmitted among neurons, and build on it a deep network termed LIAF-Net for efficient spatiotemporal processing. In the temporal domain, LIAF follows the traditional LIF dynamics to maintain its temporal processing capability. In the spatial domain, LIAF is able to integrate spatial information through convolutional or fully-connected integration. As a spatiotemporal layer, LIAF can also be used jointly with traditional artificial neural network (ANN) layers. Experimental results indicate that LIAF-Net achieves comparable performance to the Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM) on the bAbI Question Answering (QA) tasks, and achieves state-of-the-art performance on spatiotemporal Dynamic Vision Sensor (DVS) datasets, including MNIST-DVS, CIFAR10-DVS and DVS128 Gesture, with far fewer synaptic weights and much lower computational overhead than traditional networks built with LSTM, GRU, Convolutional LSTM (ConvLSTM) or 3D convolution (Conv3D). Compared with a traditional LIF-SNN, LIAF-Net also shows a dramatic accuracy gain in all these experiments. In conclusion, LIAF-Net provides a framework combining the advantages of both ANNs and SNNs for lightweight and efficient spatiotemporal information processing.
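The LIF-versus-LIAF distinction above comes down to what the neuron transmits: LIF emits a binary spike, while LIAF passes an analog function of the membrane potential. The one-step sketch below is an assumed reading of that description; the leak factor, threshold, and the choices of ReLU activation and hard reset are illustrative, not the paper's exact formulation.

```python
import numpy as np

def liaf_step(v, x, alpha=0.9, v_th=0.5):
    """One time step of a Leaky Integrate and Analog Fire (LIAF) layer.

    The membrane state follows LIF-style leaky integration, but the
    transmitted output is an analog activation of the membrane
    potential rather than a binary spike.
    """
    v = alpha * v + x                  # leaky integration of the input
    out = np.maximum(v - v_th, 0.0)    # analog "fire": ReLU above threshold
    v = np.where(v >= v_th, 0.0, v)    # reset where the threshold is crossed
    return v, out

# One step on three units: the outputs are graded, not 0/1 spikes.
v0 = np.zeros(3)
v1, out = liaf_step(v0, np.array([0.2, 0.6, 1.0]))
```

Because `out` is real-valued, it can feed a standard convolutional or fully-connected layer directly, which is what lets LIAF layers mix with ordinary ANN layers.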