
Multiplexing rhythmic information by spike timing dependent plasticity

Added by Nimrod Sherf
Publication date: 2019
Fields: Biology, Physics
Language: English





Rhythmic activity has been associated with a wide range of cognitive processes. Previous studies have shown that spike-timing-dependent plasticity (STDP) can facilitate the transfer of rhythmic activity downstream along the information-processing pathway. However, STDP is also known to generate strong winner-take-all-like competition between subgroups of correlated synaptic inputs. Consequently, one might expect STDP to induce strong competition between different rhythmicity channels, thus preventing the multiplexing of information across different frequency channels. This study explored whether STDP facilitates the multiplexing of information across multiple frequency channels, and if so, under what conditions. We investigated the STDP dynamics in the framework of a model consisting of two competing subpopulations of neurons that synapse in a feedforward manner onto a single postsynaptic neuron. Each subpopulation was assumed to oscillate independently and in a different frequency band. To investigate the STDP dynamics, a mean-field Fokker-Planck theory was developed in the limit of a slow learning rate. Surprisingly, our theory predicted only limited interactions between the different subgroups. Our analysis further revealed that the interaction between these channels was mediated mainly by the shared component of the mean activity. Next, we generalized these results beyond the simplistic model using numerical simulations. We found that, for a wide range of parameters, the system converged to a solution in which the postsynaptic neuron responded to both rhythms. Nevertheless, all the synaptic weights remained dynamic and did not converge to a fixed point. These findings imply that STDP can support the multiplexing of rhythmic information and demonstrate how functionality can be retained in the face of continuous remodeling of all the synaptic weights.
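As a concrete illustration of the model class described above, here is a minimal numerical sketch: two presynaptic subpopulations whose Poisson firing rates are modulated at different frequencies drive a single leaky-integrator postsynaptic neuron, and the weights evolve under an additive pair-based STDP rule with a slow learning rate. All parameter values (rates, time constants, learning rate) are illustrative assumptions, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions for this sketch, not taken from the paper)
dt, T = 1e-3, 200.0            # time step and total duration, s
N = 60                         # presynaptic neurons per subpopulation
r0 = 10.0                      # mean presynaptic rate, Hz
f_a, f_b = 8.0, 25.0           # rhythm of subpopulation A and B, Hz
tau_m, v_th = 20e-3, 1.0       # postsynaptic membrane time constant and threshold
tau_stdp = 20e-3               # STDP trace time constant, s
lam, alpha = 1e-3, 1.05        # slow learning rate, depression/potentiation bias

w = np.full(2 * N, 0.5)        # synaptic weights, clipped to [0, 1]
pre_trace = np.zeros(2 * N)    # presynaptic eligibility traces
post_trace, v = 0.0, 0.0

for ti in np.arange(0.0, T, dt):
    # Poisson inputs with sinusoidally modulated rates, one rhythm per subpopulation
    rate = r0 * (1.0 + 0.5 * np.concatenate([
        np.full(N, np.sin(2 * np.pi * f_a * ti)),
        np.full(N, np.sin(2 * np.pi * f_b * ti)),
    ]))
    pre_spikes = rng.random(2 * N) < rate * dt

    # Leaky-integrator postsynaptic neuron with threshold and reset
    v += dt / tau_m * (-v) + 0.05 * np.dot(w, pre_spikes)
    post_spike = v >= v_th
    if post_spike:
        v = 0.0

    # Exponentially decaying traces used by the pair-based STDP rule
    pre_trace += -dt / tau_stdp * pre_trace + pre_spikes
    post_trace += -dt / tau_stdp * post_trace + float(post_spike)

    # Additive STDP: potentiation on post spikes, depression on pre spikes
    w += lam * (pre_trace * float(post_spike) - alpha * post_trace * pre_spikes)
    np.clip(w, 0.0, 1.0, out=w)

print("mean weight onto rhythm A inputs:", w[:N].mean())
print("mean weight onto rhythm B inputs:", w[N:].mean())
```

Under this kind of setup, multiplexing corresponds to both subpopulations retaining appreciable mean weights while individual weights keep drifting.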

Related research

Nanoelectronic devices that mimic the functionality of synapses are a crucial requirement for performing cortical simulations of the brain. In this work we propose a ferromagnet/heavy-metal heterostructure that employs spin-orbit torque to implement spike-timing-dependent plasticity. The proposed device offers the advantage of decoupled spike-transmission and programming current paths, leading to reliable operation during online learning. Arranging such devices in a crosspoint architecture can pave the way for ultra-dense neural networks. Simulation studies indicate that the device has the potential to achieve picojoule-level energy consumption (at most 2 pJ per synaptic event), which is comparable to the energy consumption of synaptic events in biological synapses.
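For scale, a back-of-the-envelope estimate with hypothetical programming-pulse values (not taken from the work above) shows how a single write event can land in the quoted picojoule range:

```python
# Hypothetical programming pulse (illustrative values, not from the proposed device):
I = 100e-6   # programming current through the write path, A
V = 0.2      # voltage drop across the write path, V
t = 100e-9   # pulse duration, s

E = V * I * t                                   # energy dissipated per programming event
print(f"{E * 1e12:.1f} pJ per synaptic event")  # -> 2.0 pJ
```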
Spike-timing-dependent plasticity (STDP) is believed to play an important role in learning and the formation of computational function in the brain. The classical model of STDP, which considers the timing between pairs of presynaptic and postsynaptic spikes (p-STDP), cannot reproduce the synaptic weight changes observed in biological experiments that probe either higher-order spike patterns (e.g., triplets and quadruplets of spikes) or the simultaneous effect of the rate and timing of spike pairs on synaptic plasticity. In this paper, we first investigate synaptic weight changes using a p-STDP circuit and show how it fails to reproduce these more complex biological experiments. We then present a new STDP VLSI circuit that operates on the timing among triplets of spikes (t-STDP) and is able to reproduce all the mentioned experimental results. We believe that this circuit improves upon previous designs, since its ability to mimic the outcomes of biological experiments more closely gives it a greater learning capacity; it can therefore play a significant role in future VLSI implementations of neuromorphic systems.
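To make the pair/triplet distinction concrete, the sketch below contrasts a pair-based update with a Pfister-and-Gerstner-style triplet rule, in which additional slow detector traces make potentiation depend on earlier postsynaptic spikes and depression on earlier presynaptic spikes. The time constants and amplitudes are illustrative software values, not parameters of the proposed VLSI circuit.

```python
import numpy as np

def run_stdp(pre_times, post_times, triplet=True,
             tau_plus=16.8e-3, tau_minus=33.7e-3,   # fast pair traces
             tau_x=101e-3, tau_y=125e-3,            # slow triplet traces
             a2p=5e-3, a2m=7e-3, a3p=6e-3, a3m=2.3e-4):
    """Return the total weight change for given pre/post spike trains (times in s)."""
    events = sorted([(t, "pre") for t in pre_times] +
                    [(t, "post") for t in post_times])
    r1 = r2 = o1 = o2 = 0.0     # presynaptic (r) and postsynaptic (o) detector traces
    t_last, dw = 0.0, 0.0
    for t, kind in events:
        decay = t - t_last
        r1 *= np.exp(-decay / tau_plus)
        r2 *= np.exp(-decay / tau_x)
        o1 *= np.exp(-decay / tau_minus)
        o2 *= np.exp(-decay / tau_y)
        if kind == "pre":
            # depression; the triplet term adds dependence on earlier pre spikes via r2
            dw -= o1 * (a2m + (a3m * r2 if triplet else 0.0))
            r1 += 1.0
            r2 += 1.0
        else:
            # potentiation; the triplet term adds dependence on earlier post spikes via o2
            dw += r1 * (a2p + (a3p * o2 if triplet else 0.0))
            o1 += 1.0
            o2 += 1.0
        t_last = t
    return dw

# Repeated pre-post pairings at ~50 Hz: the pair and triplet rules give different totals
pre = [0.0, 0.020, 0.040]
post = [0.005, 0.025, 0.045]
print("p-STDP:", run_stdp(pre, post, triplet=False))
print("t-STDP:", run_stdp(pre, post, triplet=True))
```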
Neural populations exposed to a certain stimulus learn to represent it better. However, the process by which local, self-organized rules achieve this is unclear. We address the question of how a periodic neural input can be learned, and we use the Differential Hebbian Learning framework, coupled with a homeostatic mechanism, to derive two self-consistency equations that lead to increased responses to the same stimulus. Although all our simulations use simple leaky integrate-and-fire neurons and standard spike-timing-dependent plasticity learning rules, our results can easily be interpreted in terms of rates and population codes.
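A minimal rate-based sketch of the rule structure described above, assuming a differential Hebbian update of the form Δw ∝ pre × d(post)/dt and a homeostatic normalization that keeps the summed weight fixed. The study itself uses leaky integrate-and-fire neurons and STDP, so this only illustrates the shape of the learning dynamics, with made-up parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions for this sketch)
dt, T = 1e-3, 50.0
f = 5.0                                   # frequency of the periodic input, Hz
n = 20                                    # number of input channels
phases = rng.uniform(0.0, 2 * np.pi, n)   # each channel carries the rhythm at a different phase
w = rng.uniform(0.0, 1.0, n)
w /= w.sum()                              # homeostatic constraint: total weight fixed at 1
eta = 1e-3                                # learning rate
post_prev = 0.0

for ti in np.arange(0.0, T, dt):
    pre = 0.5 * (1.0 + np.sin(2 * np.pi * f * ti + phases))   # periodic presynaptic rates
    post = float(np.dot(w, pre))                              # linear readout rate
    dpost = (post - post_prev) / dt                           # temporal derivative of the output
    w += eta * dt * pre * dpost        # differential Hebbian update: pre x d(post)/dt
    w = np.clip(w, 0.0, None)          # keep weights non-negative
    w /= w.sum()                       # homeostasis: renormalize the summed weight
    post_prev = post

print("weights after learning:", np.round(w, 3))
```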
Noise is an inherent part of neuronal dynamics, and thus of the brain. It can be observed in neuronal activity at different spatiotemporal scales, including in neuronal membrane potentials, local field potentials, electroencephalography, and magnetoencephalography. A central research topic in contemporary neuroscience is to elucidate the functional role of noise in neuronal information processing. Experimental studies have shown that a suitable level of noise may enhance the detection of weak neuronal signals by means of stochastic resonance. In response, theoretical research, based on the theory of stochastic processes, nonlinear dynamics, and statistical physics, has made great strides in elucidating the mechanism and the many benefits of stochastic resonance in neuronal systems. In this perspective, we review recent research dedicated to neuronal stochastic resonance in biophysical mathematical models. We also explore the regulation of neuronal stochastic resonance, and we outline important open questions and directions for future research. A deeper understanding of neuronal stochastic resonance may afford us new insights into the highly impressive information processing in the brain.
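A minimal sketch of the basic effect, assuming a leaky integrate-and-fire neuron driven by a subthreshold sinusoid plus white noise. The phase-locking-based signal measure and all parameter values are illustrative; a stochastic resonance would appear as this measure peaking at an intermediate noise level.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters (assumptions for this sketch)
dt, T = 1e-3, 200.0
tau_m, v_th = 20e-3, 1.0          # membrane time constant and spike threshold
f_sig, a_sig = 5.0, 0.9           # weak periodic drive: subthreshold after membrane filtering
t = np.arange(0.0, T, dt)
drive = a_sig * np.sin(2 * np.pi * f_sig * t)

def signal_power(sigma):
    """Spike-train power at the signal frequency, normalized by the flat shot-noise floor."""
    v, spikes = 0.0, np.zeros(t.size)
    noise = sigma * np.sqrt(dt) * rng.standard_normal(t.size)
    for i in range(t.size):
        v += dt / tau_m * (-v + drive[i]) + noise[i]
        if v >= v_th:                 # threshold crossing: emit a spike and reset
            spikes[i] = 1.0
            v = 0.0
    n_spikes = spikes.sum()
    if n_spikes == 0:
        return 0.0
    # Fourier component of the spike train at f_sig, relative to the Poisson background
    component = np.dot(spikes, np.exp(-2j * np.pi * f_sig * t))
    return float(np.abs(component) ** 2 / n_spikes)

for sigma in (0.5, 1.0, 2.0, 4.0, 8.0):
    print(f"noise level {sigma:.1f} -> signal power {signal_power(sigma):.1f}")
```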
Biological neurons receive multiple noisy oscillatory signals, and their dynamical response to the superposition of these signals is of fundamental importance for information processing in the brain. Here we study the response of neural systems to a weak envelope-modulation signal produced by superimposing two periodic signals with different frequencies. We show that stochastic resonance occurs at the beat frequency in neural systems at the single-neuron as well as the population level. The performance of this frequency-difference-dependent stochastic resonance is influenced by both the beat frequency and the two forcing frequencies. Compared to a single neuron, a population of neurons is more efficient at detecting the information carried by the weak envelope-modulation signal at the beat frequency. Furthermore, appropriate fine-tuning of the excitation-inhibition balance can further optimize the response of a neural ensemble to the superimposed signal. Our results thus provide insights into the generation and modulation mechanism of frequency-difference-dependent stochastic resonance in neural systems.
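To make the stimulus explicit, the sketch below builds the superposition of two sinusoids and verifies the textbook identity that it equals a carrier at the mean frequency whose amplitude waxes and wanes at the beat frequency |f2 − f1|; the specific frequencies are illustrative, not those used in the study.

```python
import numpy as np

# Illustrative forcing frequencies (not taken from the study)
fs = 1000.0                          # sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
f1, f2 = 45.0, 50.0                  # the two periodic signals
s = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# sin(a) + sin(b) = 2 * sin((a + b)/2) * cos((a - b)/2):
# a carrier at (f1 + f2)/2 Hz whose amplitude waxes and wanes at |f2 - f1| Hz
carrier_with_envelope = 2.0 * np.sin(np.pi * (f1 + f2) * t) * np.cos(np.pi * (f1 - f2) * t)

print("identity holds:", np.allclose(s, carrier_with_envelope))
print("beat (envelope) frequency:", abs(f2 - f1), "Hz")   # the slow rhythm a noisy neuron can detect
```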