
Frequency-difference-dependent stochastic resonance in neural systems

Added by Matjaz Perc
Publication date: 2017
Fields: Biology, Physics
Language: English





Biological neurons receive multiple noisy oscillatory signals, and their dynamical response to the superposition of these signals is of fundamental importance for information processing in the brain. Here we study the response of neural systems to a weak envelope modulation signal formed by the superposition of two periodic signals with different frequencies. We show that stochastic resonance occurs at the beat frequency in neural systems at the single-neuron as well as at the population level. The performance of this frequency-difference-dependent stochastic resonance is influenced by both the beat frequency and the two forcing frequencies. Compared to a single neuron, a population of neurons is more efficient at detecting the information carried by the weak envelope modulation signal at the beat frequency. Furthermore, an appropriate fine-tuning of the excitation-inhibition balance can further optimize the response of a neural ensemble to the superimposed signal. Our results thus introduce frequency-difference-dependent stochastic resonance in neural systems and provide insights into its generation and modulation mechanisms.
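To make the setup concrete, note the trigonometric identity A sin(2π f1 t) + A sin(2π f2 t) = 2A cos(π(f1 − f2)t) sin(π(f1 + f2)t): the envelope of the summed drive varies at the beat frequency |f1 − f2|. The following Python sketch uses a generic noisy leaky integrate-and-fire neuron rather than the specific models of the study, and all parameter values (f1, f2, bias, noise intensity D) are illustrative assumptions; it only shows how one would probe the spike-train power at the beat frequency.

```python
import numpy as np

# Minimal sketch (not the models used in the study): a leaky integrate-and-fire
# neuron driven by a constant subthreshold bias plus two sinusoids and Gaussian
# white noise. The envelope of the summed drive varies at the beat frequency.
rng = np.random.default_rng(0)

dt, T = 0.1e-3, 20.0          # time step and total duration (s)
n = int(T / dt)
t = np.arange(n) * dt

f1, f2 = 50.0, 45.0           # forcing frequencies (Hz), illustrative values
beat = abs(f1 - f2)           # beat frequency: 5 Hz
A, I0, D = 0.3, 0.75, 0.5     # sinusoid amplitude, bias, noise intensity (assumed)
tau = 10e-3                   # membrane time constant (s)

drive = I0 + A * np.sin(2 * np.pi * f1 * t) + A * np.sin(2 * np.pi * f2 * t)

v, spikes = 0.0, np.zeros(n)  # dimensionless membrane potential, threshold 1
for i in range(n):
    v += dt / tau * (-v + drive[i]) + np.sqrt(2 * D * dt) * rng.standard_normal()
    if v >= 1.0:              # threshold crossing: record spike and reset
        spikes[i] = 1.0
        v = 0.0

# Power spectrum of the spike train; frequency-difference-dependent stochastic
# resonance would show up as a pronounced spectral peak at the beat frequency.
spec = np.abs(np.fft.rfft(spikes - spikes.mean()))**2
freqs = np.fft.rfftfreq(n, dt)
print(f"power at {beat:.0f} Hz:", spec[np.argmin(np.abs(freqs - beat))])
```

Sweeping the noise intensity D and reading off the spectral peak at |f1 − f2| would trace out the resonance curve for this toy neuron; the study itself considers both single neurons and populations with tunable excitation-inhibition balance.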



Related research

Rhythmic activity has been associated with a wide range of cognitive processes. Previous studies have shown that spike-timing-dependent plasticity (STDP) can facilitate the transfer of rhythmic activity downstream along the information-processing pathway. However, STDP is also known to generate strong winner-take-all-like competition between subgroups of correlated synaptic inputs. Consequently, one might expect STDP to induce strong competition between different rhythmicity channels, thus preventing the multiplexing of information across different frequency channels. This study explored whether STDP facilitates the multiplexing of information across multiple frequency channels, and if so, under what conditions. We investigated the STDP dynamics in the framework of a model consisting of two competing subpopulations of neurons that synapse in a feedforward manner onto a single postsynaptic neuron. Each subpopulation was assumed to oscillate independently and in a different frequency band. To investigate the STDP dynamics, a mean-field Fokker-Planck theory was developed in the limit of a slow learning rate. Surprisingly, our theory predicted limited interactions between the different subgroups. Our analysis further revealed that the interaction between these channels was mainly mediated by the shared component of the mean activity. Next, we generalized these results beyond the simplistic model using numerical simulations. We found that for a wide range of parameters, the system converged to a solution in which the postsynaptic neuron responded to both rhythms. Nevertheless, all the synaptic weights remained dynamic and did not converge to a fixed point. These findings imply that STDP can support the multiplexing of rhythmic information and demonstrate how functionality can be retained in the face of continuous remodeling of all the synaptic weights.
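A direct, simplified simulation of the setup described above might look as follows: two rate-modulated groups of Poisson inputs converge through plastic synapses onto one integrate-and-fire neuron with a pair-based additive STDP rule. This is a hedged sketch with assumed parameter values, not the paper's mean-field Fokker-Planck treatment, and whether both groups retain appreciable weights depends on the chosen parameters.

```python
import numpy as np

# Hedged sketch: two groups of Poisson inputs, each rate-modulated at its own
# frequency, drive one leaky integrate-and-fire neuron through synapses that
# obey pair-based additive STDP with exponential spike traces.
rng = np.random.default_rng(1)

dt, T = 1e-3, 100.0
n = int(T / dt)
N = 100                                  # inputs per group
f_a, f_b = 8.0, 25.0                     # group oscillation frequencies (Hz), assumed
r0, r1 = 10.0, 8.0                       # mean rate and modulation depth (Hz)

w = np.full(2 * N, 0.5)                  # plastic weights, start homogeneous
w_max = 1.0
A_plus, A_minus = 0.005, 0.00525         # STDP amplitudes (slightly depression-biased)
tau_stdp, tau_m, v_th = 20e-3, 20e-3, 1.0

x_pre, x_post, v = np.zeros(2 * N), 0.0, 0.0
for i in range(n):
    ra = r0 + r1 * np.sin(2 * np.pi * f_a * i * dt)
    rb = r0 + r1 * np.sin(2 * np.pi * f_b * i * dt)
    rates = np.concatenate([np.full(N, ra), np.full(N, rb)])
    pre = rng.random(2 * N) < rates * dt             # Poisson input spikes this step

    v += dt / tau_m * (-v) + 0.03 * np.dot(w, pre)   # membrane driven by weighted input
    post = v >= v_th
    if post:
        v = 0.0

    # exponential pre/post traces used by the pair-based STDP rule
    x_pre *= np.exp(-dt / tau_stdp)
    x_post *= np.exp(-dt / tau_stdp)
    x_pre[pre] += 1.0
    if post:
        x_post += 1.0
        w += A_plus * x_pre                          # potentiation on a post spike
    w[pre] -= A_minus * x_post                       # depression on a pre spike
    np.clip(w, 0.0, w_max, out=w)

print("mean weight, group A:", round(w[:N].mean(), 3))
print("mean weight, group B:", round(w[N:].mean(), 3))
```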
Noise is an inherent part of neuronal dynamics, and thus of the brain. It can be observed in neuronal activity at different spatiotemporal scales, including in neuronal membrane potentials, local field potentials, electroencephalography, and magnetoencephalography. A central research topic in contemporary neuroscience is to elucidate the functional role of noise in neuronal information processing. Experimental studies have shown that a suitable level of noise may enhance the detection of weak neuronal signals by means of stochastic resonance. In response, theoretical research, based on the theory of stochastic processes, nonlinear dynamics, and statistical physics, has made great strides in elucidating the mechanism and the many benefits of stochastic resonance in neuronal systems. In this perspective, we review recent research dedicated to neuronal stochastic resonance in biophysical mathematical models. We also explore the regulation of neuronal stochastic resonance, and we outline important open questions and directions for future research. A deeper understanding of neuronal stochastic resonance may afford us new insights into the highly impressive information processing in the brain.
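The hallmark of stochastic resonance discussed in this perspective, that the response to a weak signal typically peaks at an intermediate noise level, can be illustrated with a minimal threshold model. The sketch below drives a leaky integrate-and-fire neuron with a subthreshold sinusoid and sweeps the noise intensity D; all parameter values are illustrative, not taken from the review.

```python
import numpy as np

# Hedged illustration of the classic stochastic-resonance curve: a leaky
# integrate-and-fire neuron receives a weak (subthreshold) sinusoid plus
# Gaussian white noise, and the spike-train power at the signal frequency
# is computed for a range of noise intensities D.
rng = np.random.default_rng(2)

dt, T = 0.1e-3, 10.0
n = int(T / dt)
t = np.arange(n) * dt
f_s, tau = 10.0, 10e-3
drive = 0.75 + 0.2 * np.sin(2 * np.pi * f_s * t)    # subthreshold (threshold = 1)

def power_at_signal(D):
    """Simulate the noisy neuron and return spike-train power at f_s."""
    v, spikes = 0.0, np.zeros(n)
    noise = np.sqrt(2 * D * dt) * rng.standard_normal(n)
    for i in range(n):
        v += dt / tau * (-v + drive[i]) + noise[i]
        if v >= 1.0:
            spikes[i] = 1.0
            v = 0.0
    spec = np.abs(np.fft.rfft(spikes - spikes.mean()))**2
    freqs = np.fft.rfftfreq(n, dt)
    return spec[np.argmin(np.abs(freqs - f_s))]

for D in (0.05, 0.2, 0.5, 1.0, 2.0, 5.0):
    print(f"D = {D:4.2f}   power at {f_s:.0f} Hz = {power_at_signal(D):.3g}")
```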
The understanding of neural activity patterns is fundamentally linked to an understanding of how the brain's network architecture shapes dynamical processes. Established approaches rely mostly on deviations of a given network from certain classes of random graphs. Hypotheses about the supposed role of prominent topological features (for instance, the roles of modularity, network motifs, or hierarchical network organization) are derived from these deviations. An alternative strategy could be to study deviations of network architectures from regular graphs (rings, lattices) and consider the implications of such deviations for self-organized dynamic patterns on the network. Following this strategy, we draw on the theory of spatiotemporal pattern formation and propose a novel perspective for analyzing dynamics on networks, by evaluating how the self-organized dynamics are confined by network architecture to a small set of permissible collective states. In particular, we discuss the role of prominent topological features of brain connectivity, such as hubs, modules, and hierarchy, in shaping activity patterns. We illustrate the notion of network-guided pattern formation with numerical simulations and outline how it can facilitate the understanding of neural dynamics.
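One way to make the notion of network-guided pattern formation tangible, under strong simplifying assumptions, is to look at linear diffusion on a graph: the slowest-decaying pattern is the Laplacian eigenvector with the smallest nonzero eigenvalue, and on a modular network this mode aligns with the modules. The sketch below builds a hypothetical two-module random graph and checks this alignment; it is an illustration of the general idea, not the simulations used in the study.

```python
import numpy as np

# Illustrative sketch: for linear diffusion dx/dt = -L x on a graph, the
# long-lived pattern is the Fiedler vector of the graph Laplacian L. On a
# two-module network (assumed here) this mode separates the modules, i.e.
# the architecture confines the self-organized pattern.
rng = np.random.default_rng(3)

n_per, n_mod = 30, 2
n = n_per * n_mod
p_in, p_out = 0.3, 0.02                    # dense within, sparse between modules
module = np.repeat(np.arange(n_mod), n_per)

A = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        p = p_in if module[i] == module[j] else p_out
        if rng.random() < p:
            A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A             # graph Laplacian
vals, vecs = np.linalg.eigh(L)
slow = vecs[:, 1]                          # slowest nonzero mode (Fiedler vector)

# If topology guides the pattern, the sign of this slow mode should follow the
# module labels on most nodes.
ref = module[np.argmax(slow)]              # module of the most positive entry
agree = np.mean((slow > 0) == (module == ref))
print(f"fraction of nodes whose slow-mode sign matches their module: {agree:.2f}")
```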
The activity of a sparse network of leaky integrate-and-fire neurons is carefully revisited with reference to a regime of bona fide asynchronous dynamics. The study is preceded by a finite-size scaling analysis, carried out to identify a setup where collective synchronization is negligible. The comparison between quenched and annealed networks reveals the emergence of substantial differences when the coupling strength is increased, via a scenario somewhat reminiscent of a phase transition. For sufficiently strong synaptic coupling, quenched networks exhibit highly bursting neural activity, well reproduced by a self-consistent approach based on the assumption that the input synaptic current is the superposition of independent renewal processes. The distribution of interspike intervals turns out to be relatively long-tailed, a crucial feature required for the self-sustainment of the bursting activity in a regime where neurons operate on average (much) below threshold. A semi-quantitative analogy with Ornstein-Uhlenbeck processes helps validate this interpretation. Finally, an alternative explanation in terms of Poisson processes is offered under the additional assumption of mutual correlations among excitatory and inhibitory spikes.
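A bare-bones version of the kind of model analyzed above, a sparse quenched network of leaky integrate-and-fire neurons with instantaneous pulse coupling, can be set up as follows. The parameters, the suprathreshold external drive, and the use of the coefficient of variation (CV) of interspike intervals as a crude proxy for irregular or bursting activity are all assumptions of this sketch rather than choices made in the paper.

```python
import numpy as np

# Sketch: sparse quenched connectivity (fixed random in-degree K), excitatory
# and inhibitory neurons, delta-pulse coupling of strength g, CV of interspike
# intervals reported as a rough measure of irregularity.
rng = np.random.default_rng(4)

N, K = 400, 20                       # neurons and in-degree (sparse: K << N)
f_inh = 0.2                          # fraction of inhibitory neurons
dt, T = 0.1e-3, 5.0
n = int(T / dt)
tau, v_th, v_re, I_ext = 20e-3, 1.0, 0.0, 1.1

# quenched connectivity: each neuron draws its K presynaptic partners once
pre = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
sign = np.where(rng.random(N) < f_inh, -1.0, 1.0)   # -1 inhibitory, +1 excitatory

def cv_of_isi(g):
    """Simulate the network at coupling strength g and return the ISI CV."""
    v = rng.random(N)
    last = np.full(N, np.nan)
    isi = []
    for i in range(n):
        v += dt / tau * (I_ext - v)
        spiked = np.flatnonzero(v >= v_th)
        if spiked.size:
            v[spiked] = v_re
            # deliver one pulse per connection coming from a spiking neuron
            v += g * (np.isin(pre, spiked) * sign[pre]).sum(axis=1)
            t_now = i * dt
            seen = ~np.isnan(last[spiked])
            isi.extend((t_now - last[spiked][seen]).tolist())
            last[spiked] = t_now
    isi = np.asarray(isi)
    return isi.std() / isi.mean() if isi.size > 1 else np.nan

for g in (0.0, 0.01, 0.05):
    print(f"g = {g:4.2f}   CV of interspike intervals = {cv_of_isi(g):.2f}")
```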
Neurons modeled by the Rulkov map display a variety of dynamic regimes that include tonic spikes and chaotic bursting. Here we study an ensemble of bursting neurons coupled with the Watts-Strogatz small-world topology. We characterize the sequences of bursts using the symbolic method of time-series analysis known as ordinal analysis, which detects nonlinear temporal correlations. We show that the probabilities of the different symbols distinguish different dynamical regimes, which depend on the coupling strength and the network topology. These regimes have different spatio-temporal properties that can be visualized with raster plots.
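The pipeline described above, Rulkov map neurons coupled on a Watts-Strogatz graph followed by ordinal analysis of the burst sequence, can be sketched as below. The parameter values, the diffusive coupling scheme, and the threshold-plus-gap burst detection are assumptions made for illustration; the ordinal step simply counts the order-3 permutation patterns of one neuron's inter-burst intervals.

```python
import numpy as np
import networkx as nx
from itertools import permutations

# Illustrative sketch: chaotic-bursting Rulkov maps coupled diffusively through
# the fast variable on a Watts-Strogatz small-world graph, then ordinal
# (permutation) analysis of neuron 0's inter-burst intervals.
rng = np.random.default_rng(5)

N, k, p = 100, 4, 0.1
G = nx.watts_strogatz_graph(N, k, p, seed=5)
A = nx.to_numpy_array(G)
deg = A.sum(axis=1)

alpha, mu, sigma = 4.3, 0.001, 0.001     # Rulkov map in a chaotic-bursting regime
eps, steps = 0.02, 100_000               # coupling strength and iteration count

x = rng.uniform(-1.5, -0.5, N)           # fast variables
y = np.full(N, -2.9)                     # slow variables
trace = np.zeros(steps)                  # record the fast variable of neuron 0

for i in range(steps):
    coupling = eps / deg * (A @ x - deg * x)        # diffusive coupling on x
    x_new = alpha / (1.0 + x**2) + y + coupling     # fast subsystem
    y = y - mu * (x + 1.0) + mu * sigma             # slow subsystem
    x = x_new
    trace[i] = x[0]

# crude burst detection: upward threshold crossings separated by a long gap
th, gap = -1.0, 100
cross = np.flatnonzero((trace[1:] > th) & (trace[:-1] <= th))
onsets = cross[np.insert(np.diff(cross) > gap, 0, True)] if cross.size else cross
ibi = np.diff(onsets)                    # inter-burst intervals (in map steps)

# ordinal analysis: relative frequencies of the order-3 permutation patterns
D = 3
counts = {perm: 0 for perm in permutations(range(D))}
for j in range(len(ibi) - D + 1):
    counts[tuple(np.argsort(ibi[j:j + D]).tolist())] += 1
total = max(sum(counts.values()), 1)
for perm in sorted(counts):
    print(perm, round(counts[perm] / total, 3))
```

Repeating the analysis across coupling strengths and rewiring probabilities, and complementing it with raster plots of burst onsets across neurons, would be one way to map out the regimes mentioned in the abstract.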