
Mapping Input Noise to Escape Noise in Integrate-and-Fire Neurons: A Level-Crossing Approach

Submitted by: Tilo Schwalger
Publication date: 2021
Research field: Biology
Language: English
Author: Tilo Schwalger





Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate (escape noise). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: For exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of noise parameters, mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with time-dependent stimulus or boundary. The agreement with simulations is markedly improved compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise.
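The core construction can be illustrated with a minimal numerical sketch (Python/NumPy; parameter values and variable names are illustrative assumptions, not taken from the paper). A leaky integrate-and-fire neuron driven by exponentially correlated Ornstein-Uhlenbeck input noise is compared with an escape-noise surrogate whose hazard is the stationary Rice level-crossing rate of the free membrane potential, i.e. the first-order (Poisson) approximation mentioned in the abstract rather than the paper's second-order decoupling approximation:

```python
import numpy as np

# Illustrative parameters (assumed values, not taken from the paper); threshold units
tau_m, tau_s = 0.020, 0.005      # membrane and noise correlation time constants [s]
mu, sigma_eta = 0.8, 0.25        # mean drive and std of the OU input noise
theta, v_reset = 1.0, 0.0        # spike threshold and reset potential
dt, T = 1e-4, 100.0              # time step and simulated duration [s]
rng = np.random.default_rng(1)
n_steps = int(T / dt)

# --- Reference model: LIF neuron with colored (Ornstein-Uhlenbeck) input noise ---
v, eta, n_spikes_input = 0.0, 0.0, 0
for _ in range(n_steps):
    eta += -eta / tau_s * dt + np.sqrt(2.0 * sigma_eta**2 * dt / tau_s) * rng.standard_normal()
    v += (-v + mu + eta) / tau_m * dt
    if v >= theta:               # input-noise spike: threshold crossing followed by reset
        n_spikes_input += 1
        v = v_reset

# --- Escape-noise surrogate: Poisson firing with the stationary Rice up-crossing rate ---
# Stationary Gaussian statistics of the free (reset-free) membrane potential,
# standard results for an OU process low-pass filtered with time constant tau_m
sigma_v = sigma_eta * np.sqrt(tau_s / (tau_s + tau_m))         # std of V
sigma_vd = sigma_eta / np.sqrt(tau_m * (tau_s + tau_m))        # std of dV/dt
hazard = sigma_vd / (2.0 * np.pi * sigma_v) * np.exp(-(theta - mu) ** 2 / (2.0 * sigma_v**2))
p_spike = 1.0 - np.exp(-hazard * dt)                           # spike probability per time step
n_spikes_escape = int((rng.random(n_steps) < p_spike).sum())

print(f"rate, input-noise LIF   : {n_spikes_input / T:6.2f} Hz")
print(f"rate, escape-noise model: {n_spikes_escape / T:6.2f} Hz")
```

In the subthreshold, fluctuation-driven regime the two rates agree only roughly; the residual mismatch reflects reset effects and the correlations between successive level crossings that the second-order decoupling approximation is designed to capture.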




Read also

We derive analytical formulae for the firing rate of integrate-and-fire neurons endowed with realistic synaptic dynamics. In particular we include the possibility of multiple synaptic inputs as well as the effect of an absolute refractory period into the description.
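For orientation, the white-noise (diffusion-approximation) baseline that such derivations generalize is the classic Siegert-type mean-first-passage-time result including an absolute refractory period \(\tau_{\mathrm{ref}}\) (a standard textbook formula, not necessarily the specific expression derived in that paper):

\[
\nu = \left[\, \tau_{\mathrm{ref}} + \tau_m \sqrt{\pi} \int_{(V_r-\mu)/\sigma}^{(\theta-\mu)/\sigma} e^{u^2}\bigl(1+\operatorname{erf}(u)\bigr)\, du \right]^{-1},
\]

where \(\mu\) and \(\sigma\) are the mean and standard deviation of the membrane potential, \(\theta\) the threshold, \(V_r\) the reset potential and \(\tau_m\) the membrane time constant; realistic (colored) synaptic dynamics and multiple synaptic inputs modify this expression.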
Voltage-sensitive dye imaging (VSDi) has revealed fundamental properties of neocortical processing at mesoscopic scales. Since VSDi signals report the average membrane potential, it seems natural to use a mean-field formalism to model such signals. Here, we investigate a mean-field model of networks of Adaptive Exponential (AdEx) integrate-and-fire neurons, with conductance-based synaptic interactions. The AdEx model can capture the spiking response of different cell types, such as regular-spiking (RS) excitatory neurons and fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism, together with a semi-analytic approach to the transfer function of AdEx neurons. We compare the predictions of this mean-field model to simulated networks of RS-FS cells, first at the level of the spontaneous activity of the network, which is well predicted by the mean-field model. Second, we investigate the response of the network to time-varying external input, and show that the mean-field model accurately predicts the response time course of the population. One notable exception was that the tail of the response at long times was not well predicted, because the mean-field does not include adaptation mechanisms. We conclude that the Master Equation formalism can yield mean-field models that predict well the behavior of nonlinear networks with conductance-based interactions and various electrophysiological properties, and should be a good candidate to model VSDi signals where both excitatory and inhibitory neurons contribute.
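The single-neuron building block of that mean-field description is the AdEx model itself. A minimal sketch of one AdEx unit (Python/NumPy; the regular-spiking-like parameter values are illustrative assumptions, and this is not the network or transfer-function code of the cited study) shows the voltage and adaptation dynamics that the Master Equation formalism coarse-grains:

```python
import numpy as np

# Single adaptive exponential (AdEx) integrate-and-fire neuron, Euler integration.
# Parameter values are illustrative regular-spiking-like settings (assumptions).
C, g_L, E_L = 281e-12, 30e-9, -70.6e-3        # capacitance [F], leak [S], rest [V]
V_T, Delta_T = -50.4e-3, 2e-3                 # exponential threshold and slope [V]
tau_w, a, b = 144e-3, 4e-9, 0.08e-9           # adaptation time const [s], coupling [S], jump [A]
V_r, V_cut = -70.6e-3, 0.0                    # reset potential and numerical spike cutoff [V]
I_ext = 1.0e-9                                # constant input current [A]
dt, T = 1e-5, 1.0
n_steps = int(T / dt)

V, w, spike_times = E_L, 0.0, []
for i in range(n_steps):
    dV = (-g_L * (V - E_L) + g_L * Delta_T * np.exp((V - V_T) / Delta_T) - w + I_ext) / C
    dw = (a * (V - E_L) - w) / tau_w
    V += dV * dt
    w += dw * dt
    if V >= V_cut:                            # spike: reset voltage, increment adaptation
        spike_times.append(i * dt)
        V, w = V_r, w + b

print(f"{len(spike_times)} spikes in {T:.1f} s at I_ext = {I_ext * 1e9:.1f} nA")
```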
Collective oscillations and their suppression by external stimulation are analyzed in a large-scale neural network consisting of two interacting populations of excitatory and inhibitory quadratic integrate-and-fire neurons. In the limit of an infinite number of neurons, the microscopic model of this network can be reduced to an exact low-dimensional system of mean-field equations. Bifurcation analysis of these equations reveals three different dynamic modes in a free network: a stable resting state, a stable limit cycle, and bistability with a coexisting resting state and a limit cycle. We show that in the limit cycle mode, high-frequency stimulation of an inhibitory population can stabilize an unstable resting state and effectively suppress collective oscillations. We also show that in the bistable mode, the dynamics of the network can be switched from a stable limit cycle to a stable resting state by applying an inhibitory pulse to the excitatory population. The results obtained from the mean-field equations are confirmed by numerical simulation of the microscopic model.
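A flavor of such exact reductions is given by the single-population form of the quadratic integrate-and-fire mean field (the Montbrió-Pazó-Roxin equations for the firing rate r and mean membrane potential v). The sketch below (Python/NumPy) uses dimensionless, illustrative parameters and one population only, so it is merely a stand-in for the two-population excitatory-inhibitory system analyzed in the cited study:

```python
import numpy as np

# Exact mean-field equations (firing rate r, mean membrane potential v) for ONE population of
# quadratic integrate-and-fire neurons with Lorentzian-distributed excitabilities
# (Montbrio-Pazo-Roxin reduction), in dimensionless time.  Parameters are illustrative.
Delta, eta_bar, J = 1.0, -5.0, 15.0     # heterogeneity half-width, mean excitability, coupling
dt, T = 1e-3, 80.0                      # dimensionless time step and duration
n = int(T / dt)

r, v = 0.0, -2.0
r_trace = np.empty(n)
for i in range(n):
    t = i * dt
    I_ext = 3.0 if 30.0 <= t < 60.0 else 0.0            # transient current step
    dr = Delta / np.pi + 2.0 * r * v
    dv = v**2 + eta_bar + J * r + I_ext - (np.pi * r) ** 2
    r += dr * dt
    v += dv * dt
    r_trace[i] = r

# The population can settle in different states before, during and after the step,
# illustrating the multistability/oscillation scenarios discussed above.
print(f"r at t = 25, 55, 79: {r_trace[int(25/dt)]:.3f}, {r_trace[int(55/dt)]:.3f}, "
      f"{r_trace[int(79/dt)]:.3f}")
```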
The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. [...] Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator, the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. [...] The low-dimensional models also well reproduce stable oscillatory spike rate dynamics that is generated by recurrent synaptic excitation and neuronal adaptation. [...] We have made available implementations that allow one to numerically integrate the low-dimensional spike rate models as well as the Fokker-Planck partial differential equation in efficient ways for arbitrary model parametrizations as open source software. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models.
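The cascade-type reduction mentioned above has the generic structure "linear temporal filtering followed by a static nonlinearity". The sketch below (Python/NumPy) shows only that structure; the exponential filters and the threshold-linear nonlinearity are hypothetical placeholders, whereas in the cited work both are derived semi-analytically from the Fokker-Planck equation of the adaptive integrate-and-fire population:

```python
import numpy as np

# Generic linear-nonlinear cascade: mean input -> two linear filters -> static nonlinearity.
# The filters and the nonlinearity below are HYPOTHETICAL placeholders for illustration only.
dt = 1e-3
t = np.arange(0.0, 2.0, dt)
mu_in = 0.5 + 0.5 * (t > 1.0)                      # step in the mean input (arbitrary units)

def exp_filter(x, tau):
    """Causal first-order (exponential) low-pass filter, Euler-integrated."""
    y, out = 0.0, np.empty_like(x)
    for i, xi in enumerate(x):
        y += (xi - y) * dt / tau
        out[i] = y
    return out

stage1 = exp_filter(mu_in, tau=0.020)              # first linear filter (placeholder)
stage2 = exp_filter(stage1, tau=0.005)             # second linear filter (placeholder)
rate = 40.0 * np.maximum(stage2 - 0.3, 0.0)        # static nonlinearity (placeholder), in Hz

print(f"steady rate before / after the step: {rate[int(0.9/dt)]:.1f} / {rate[-1]:.1f} Hz")
```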
To estimate the time, many organisms, ranging from cyanobacteria to animals, employ a circadian clock which is based on a limit-cycle oscillator that can tick autonomously with a nearly 24h period. Yet, a limit-cycle oscillator is not essential for knowing the time, as exemplified by bacteria that possess an hourglass: a system that when forced by an oscillatory light input exhibits robust oscillations from which the organism can infer the time, but that in the absence of driving relaxes to a stable fixed point. Here, using models of the Kai system of cyanobacteria, we compare a limit-cycle oscillator with two hourglass models, one that without driving relaxes exponentially and one that does so in an oscillatory fashion. In the limit of low input-noise, all three systems are equally informative on time, yet in the regime of high input-noise the limit-cycle oscillator is far superior. The same behavior is found in the Stuart-Landau model, indicating that our result is universal.
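The Stuart-Landau normal form mentioned at the end makes the limit-cycle versus hourglass distinction easy to probe numerically. The sketch below (Python/NumPy) drives dz/dt = (mu + i*omega)z - |z|^2 z + F(t) with a noisy 24 h periodic input; mu > 0 gives a self-sustained oscillator, mu < 0 a damped system that oscillates only while driven. Parameter values, forcing amplitude and noise level are illustrative assumptions, not those of the cited study:

```python
import numpy as np

# Stuart-Landau oscillator driven by a noisy 24 h periodic input (Euler-Maruyama).
omega = 2.0 * np.pi / 24.0                 # intrinsic angular frequency (period 24 h)
dt, T = 0.01, 24.0 * 20                    # time step and duration [h]
n = int(T / dt)
rng = np.random.default_rng(0)

def run(mu, drive_amp=0.2, noise=0.05):
    """Integrate the forced Stuart-Landau equation and return the amplitude |z(t)|."""
    z = 0.1 + 0.0j
    amp = np.empty(n)
    for i in range(n):
        drive = drive_amp * np.cos(omega * i * dt)
        z += (dt * ((mu + 1j * omega) * z - abs(z) ** 2 * z + drive)
              + noise * np.sqrt(dt) * rng.standard_normal())
        amp[i] = abs(z)
    return amp

for mu in (+0.5, -0.5):                    # limit-cycle vs. damped (hourglass-like) regime
    a = run(mu)
    print(f"mu = {mu:+.1f}: mean |z| over the last 5 days = {a[int(15 * 24 / dt):].mean():.2f}")
```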