
Statistics of spikes trains, synaptic plasticity and Gibbs distributions

Posted by: Bruno Cessac
Publication date: 2008
Research field: Physics
Paper language: English





We introduce a mathematical framework where the statistics of spikes trains, produced by neural networks evolving under synaptic plasticity, can be analysed.


Read also

This paper addresses two questions in the context of neuronal network dynamics, using methods from dynamical systems theory and statistical physics: (i) how to characterize the statistical properties of sequences of action potentials (spike trains) produced by neuronal networks, and (ii) what the effects of synaptic plasticity on these statistics are. We introduce a framework in which spike trains are associated with a coding of membrane potential trajectories and, in important explicit examples (the so-called gIF models), actually constitute a symbolic coding. On this basis, we use the thermodynamic formalism from ergodic theory to show that Gibbs distributions are natural probability measures for describing the statistics of spike trains, given the empirical averages of prescribed quantities. As a second result, we show that Gibbs distributions arise naturally when considering slow synaptic plasticity rules, where the characteristic time for synapse adaptation is much longer than the characteristic time for the neuron dynamics.
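The connection between prescribed empirical averages and Gibbs distributions can be illustrated with a small maximum-entropy computation. The Python sketch below uses hypothetical target firing rates and pairwise averages (not statistics from the gIF models studied in the paper) and fits a Gibbs distribution of the form P(ω) ∝ exp(Σ_i h_i ω_i + Σ_{i<j} J_ij ω_i ω_j) over binary spike patterns by plain gradient ascent on the log-likelihood.

```python
# Minimal maximum-entropy (Gibbs) fit for binary spike patterns.
# Target averages are hypothetical, not gIF-model statistics from the paper.
import itertools
import numpy as np

n = 3                                     # number of neurons
patterns = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

# Prescribed empirical averages: rates <omega_i> and pairwise products <omega_i omega_j>.
target_rates = np.array([0.2, 0.4, 0.3])
target_pairs = np.array([[0.0, 0.10, 0.05],
                         [0.0, 0.00, 0.15],
                         [0.0, 0.00, 0.00]])   # upper triangle only (i < j)

h = np.zeros(n)                           # local fields
J = np.zeros((n, n))                      # pairwise couplings (upper triangle)

def gibbs_probs(h, J):
    """P(omega) proportional to exp(h.omega + sum_{i<j} J_ij omega_i omega_j)."""
    energy = patterns @ h + np.einsum('pi,ij,pj->p', patterns, J, patterns)
    weights = np.exp(energy)
    return weights / weights.sum()

for _ in range(5000):                     # gradient ascent: match model averages to targets
    p = gibbs_probs(h, J)
    model_rates = p @ patterns
    model_pairs = np.einsum('p,pi,pj->ij', p, patterns, patterns)
    h += 0.1 * (target_rates - model_rates)
    J += 0.1 * np.triu(target_pairs - model_pairs, k=1)

print("fitted rates:", np.round(gibbs_probs(h, J) @ patterns, 3))
```

With three neurons the 2^3 patterns can be enumerated exactly; for realistic network sizes the partition sum is intractable and sampling or mean-field approximations are needed.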
We show that the local Spike-Timing-Dependent Plasticity (STDP) rule has the effect of regulating the trans-synaptic weights of loops of any length within a simulated network of neurons. We show that, depending on the polarity of STDP, functional loops are formed or eliminated in networks driven to normal spiking conditions by random, partially correlated inputs, where functional loops comprise weights that exceed a non-zero threshold. We further prove that STDP is a form of loop-regulating plasticity for the case of a linear network comprising random weights drawn from certain distributions. Thus a notable local synaptic learning rule makes a specific prediction about synapses in the brain in which standard STDP is present: that under normal spiking conditions they should participate in predominantly feed-forward connections at all scales. Our model implies that any deviation from this prediction would require a substantial modification to the hypothesized role of standard STDP. Given its widespread occurrence in the brain, we predict that STDP could also regulate long-range synaptic loops among individual neurons across all brain scales, up to and including the scale of global brain network topology.
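The local rule underlying this loop-regulation argument can be sketched as a standard pair-based STDP update: a synapse potentiates when the presynaptic spike precedes the postsynaptic spike and depresses when the order is reversed. The amplitudes and time constant below are illustrative choices, not values from the paper, and the snippet covers only the local weight update, not the network-level loop analysis.

```python
# Pair-based STDP weight change for a single synapse (pre -> post).
# A_plus, A_minus and tau are illustrative values, not taken from the paper.
import numpy as np

A_plus, A_minus = 0.010, 0.012    # potentiation / depression amplitudes
tau = 20.0                        # STDP time constant (ms)

def stdp_delta_w(pre_spikes, post_spikes):
    """Sum the exponential STDP contributions over all pre/post spike pairs."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:                        # pre before post: potentiation
                dw += A_plus * np.exp(-dt / tau)
            elif dt < 0:                      # post before pre: depression
                dw -= A_minus * np.exp(dt / tau)
    return dw

pre = np.array([10.0, 50.0, 90.0])            # presynaptic spike times (ms)
post = np.array([15.0, 45.0, 95.0])           # postsynaptic spike times (ms)
print("weight change:", round(stdp_delta_w(pre, post), 4))
```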
A novel possibility of self-organized behaviour of stochastically driven oscillators is presented. It is shown that synchronization by Lévy-stable processes is significantly more efficient than synchronization of oscillators with Gaussian statistics. The impact of outlier events from the tail of the distribution function was examined by artificially introducing a few additional oscillators with very strong coupling strengths, and it is found that, remarkably, even one such rare and extreme event may govern the long-term behaviour of the coupled system. In addition to the multiplicative noise component, we have investigated the impact of an external additive Lévy-distributed noise component on the synchronization properties of the oscillators.
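One way to picture this comparison is a small ensemble of mean-field coupled phase oscillators whose coupling strengths are drawn either from a Gaussian or from a Lévy-stable law. The sketch below is only a stand-in for the paper's model: the Kuramoto-style coupling, the stability index alpha = 1.5 and the scales are assumptions. The order parameter r, close to 1 for a synchronized ensemble, quantifies the outcome.

```python
# Phase oscillators with coupling strengths drawn from a Gaussian vs a
# Levy-stable law. Coupling form, alpha and scales are illustrative assumptions.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
N, dt, steps = 100, 0.01, 2000
omega = rng.normal(0.0, 0.5, N)                # natural frequencies

def order_parameter(theta):
    """Kuramoto order parameter r in [0, 1]; r near 1 means synchrony."""
    return np.abs(np.mean(np.exp(1j * theta)))

def simulate(K):
    """Integrate d theta_i/dt = omega_i + K_i * mean_j sin(theta_j - theta_i)."""
    theta = rng.uniform(0.0, 2.0 * np.pi, N)
    for _ in range(steps):
        coupling = np.mean(np.sin(theta[None, :] - theta[:, None]), axis=1)
        theta += dt * (omega + K * coupling)
    return order_parameter(theta)

K_gauss = np.abs(rng.normal(1.0, 1.0, N))                     # light-tailed couplings
K_levy = np.abs(levy_stable.rvs(alpha=1.5, beta=0.0, size=N,  # heavy-tailed couplings
                                random_state=1))

print("r with Gaussian couplings:   ", round(simulate(K_gauss), 3))
print("r with Levy-stable couplings:", round(simulate(K_levy), 3))
```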
Synaptic plasticity is the capacity of a pre-existing connection between two neurons to change in strength as a function of neural activity. Because synaptic plasticity is the major candidate mechanism for learning and memory, the elucidation of its constituent mechanisms is of crucial importance in many aspects of normal and pathological brain function. In particular, a prominent aspect that remains debated is how the plasticity mechanisms, which span a broad spectrum of temporal and spatial scales, come to play together in a concerted fashion. Here we review and discuss evidence that points to a possible non-neuronal, glial candidate for such orchestration: the regulation of synaptic plasticity by astrocytes.
In this work we present a general mechanism by which simple dynamics running on networks become self-organized critical for scale-free topologies. We illustrate this mechanism with a simple arithmetic model of division between integers, the division model. This is the simplest self-organized critical model advanced so far, and in this sense it may help to elucidate the mechanism of self-organization to criticality. Its simplicity allows analytical tractability, characterizing several scaling relations. Furthermore, its mathematical nature brings about interesting connections between statistical physics and number-theoretical concepts. We show how this model can be understood as a self-organized stochastic process embedded on a network, where the onset of criticality is induced by the topology.
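The abstract does not spell out the update rule of the division model, so the snippet below only illustrates the number-theoretic ingredient it rests on: a network over the integers 2..N in which j is linked to k whenever j divides k. The construction and printed thresholds are illustrative assumptions; the point is that the resulting degree distribution is heavy-tailed, giving one concrete example of the scale-free topologies the abstract refers to.

```python
# Degree distribution of a divisibility network on {2, ..., N}:
# node j is linked to node k whenever j divides k. Illustrative construction,
# not the dynamical "division model" of the paper.
import numpy as np

N = 10000
degree = np.zeros(N + 1, dtype=int)
for j in range(2, N + 1):
    for k in range(2 * j, N + 1, j):    # all proper multiples of j up to N
        degree[j] += 1                  # j divides k
        degree[k] += 1                  # k is divisible by j

deg = degree[2:]
# Tail of the degree distribution: fraction of nodes exceeding each threshold.
for d in [1, 2, 4, 8, 16, 32, 64, 128]:
    print(f"P(degree > {d:3d}) = {np.mean(deg > d):.4f}")
```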
