We derive analytical formulae for the firing rate of integrate-and-fire neurons endowed with realistic synaptic dynamics. In particular, we incorporate the possibility of multiple synaptic inputs as well as the effect of an absolute refractory period into the description.
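As a concrete point of reference for the kind of formula meant here, the classical diffusion-approximation (Siegert) expression gives the stationary firing rate of a leaky integrate-and-fire neuron driven by white noise with mean mu and noise amplitude sigma, including an absolute refractory period. The sketch below evaluates that standard formula numerically; it is not the specific result of the work above, and all parameter values are illustrative assumptions.

```python
# Minimal sketch: stationary firing rate of a white-noise-driven leaky
# integrate-and-fire neuron in the diffusion approximation (Siegert formula),
# including an absolute refractory period. Parameter values are illustrative.
import numpy as np
from scipy.integrate import quad
from scipy.special import erf

def lif_rate(mu, sigma, tau_m=0.02, tau_ref=0.002, V_reset=0.0, V_th=0.02):
    """Stationary rate (spikes/s); mu, sigma, voltages in volts, times in seconds."""
    lower = (V_reset - mu) / sigma
    upper = (V_th - mu) / sigma
    integrand = lambda u: np.exp(u**2) * (1.0 + erf(u))
    integral, _ = quad(integrand, lower, upper)
    return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)

print(lif_rate(mu=0.015, sigma=0.005))  # example output in spikes/s
```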
The spiking activity of single neurons can be well described by a nonlinear integrate-and-fire model that includes somatic adaptation. When exposed to fluctuating inputs, sparsely coupled populations of these model neurons exhibit stochastic collective dynamics that can be effectively characterized using the Fokker-Planck equation. [...] Here we derive from that description four simple models for the spike rate dynamics in terms of low-dimensional ordinary differential equations, using two different reduction techniques: one uses the spectral decomposition of the Fokker-Planck operator; the other is based on a cascade of two linear filters and a nonlinearity, which are determined from the Fokker-Planck equation and semi-analytically approximated. We evaluate the reduced models for a wide range of biologically plausible input statistics and find that both approximation approaches lead to spike rate models that accurately reproduce the spiking behavior of the underlying adaptive integrate-and-fire population. [...] The low-dimensional models also accurately reproduce stable oscillatory spike rate dynamics generated by recurrent synaptic excitation and neuronal adaptation. [...] We have made available, as open source software, implementations that allow efficient numerical integration of the low-dimensional spike rate models as well as of the Fokker-Planck partial differential equation for arbitrary model parametrizations. The derived spike rate descriptions retain a direct link to the properties of single neurons, allow for convenient mathematical analyses of network states, and are well suited for application in neural mass/mean-field based brain network models.
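To make the notion of a low-dimensional spike rate model concrete, the following sketch integrates a generic two-variable rate-plus-adaptation ODE of the kind produced by such reductions. It is not one of the four models derived above: the sigmoidal steady-state rate r_inf and all parameters are placeholder assumptions standing in for the semi-analytically determined quantities.

```python
# Hedged sketch: a generic low-dimensional spike-rate model with adaptation,
# in the spirit of the reductions described above (not the specific models
# derived in that work). r_inf and all parameters are placeholder assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def r_inf(mu):
    """Placeholder steady-state rate (Hz) as a function of the effective mean drive."""
    return 40.0 / (1.0 + np.exp(-(mu - 5.0)))

def rhs(t, y, tau_r=0.01, tau_a=0.2, b=0.1, mu_ext=8.0):
    r, a = y                      # population rate (Hz), adaptation variable
    mu_eff = mu_ext - a           # adaptation subtracts from the mean drive
    drdt = (r_inf(mu_eff) - r) / tau_r
    dadt = (b * r - a) / tau_a    # adaptation driven by the population rate
    return [drdt, dadt]

sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0], max_step=1e-3)
print(sol.y[0, -1])               # spike rate (Hz) at the end of the simulation
```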
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects of stochastic spiking neurons such as the non-stationary population response to rapidly changing stimuli. Here, we derive low-dimensional firing-rate models for homogeneous populations of general renewal-type neurons, including integrate-and-fire models driven by white noise. Renewal models account for neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues, which determine the characteristic time scales of the firing rate dynamics, and the Laplace transform of the interspike interval density or the survival function of the renewal process. Analytical expressions for the Laplace transforms are readily available for many renewal models, including the leaky integrate-and-fire model. Retaining only the first eigenmode already yields an adequate low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness, and for other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for novel firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
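For renewal processes, classical renewal theory locates the characteristic relaxation modes of the population activity at the complex roots of 1 - f_hat(lambda) = 0, where f_hat is the Laplace transform of the interspike-interval density; identifying these roots with the eigenvalues referred to above is an assumption made here for illustration. The sketch below solves this condition numerically for the Poisson neuron with absolute refractoriness (dead time Delta), whose interspike-interval density has the closed-form transform f_hat(s) = r*exp(-s*Delta)/(s + r). The rate, dead time, and initial guess are illustrative.

```python
# Hedged sketch: characteristic roots 1 - f_hat(lambda) = 0 of the renewal
# equation for a Poisson neuron with absolute refractoriness (dead time).
# Interpreting these roots as the eigenvalues discussed above is an assumption.
import numpy as np
from scipy.optimize import fsolve

r, Delta = 100.0, 0.004             # hazard rate (1/s) and dead time (s); illustrative

def f_hat(s):
    """Laplace transform of the ISI density f(t) = r*exp(-r*(t-Delta)), t >= Delta."""
    return r * np.exp(-s * Delta) / (s + r)

def residual(x):
    s = complex(x[0], x[1])
    g = 1.0 - f_hat(s)
    return [g.real, g.imag]

# Start near a guess with a nonzero imaginary part to pick up an oscillatory mode.
root = fsolve(residual, x0=[-600.0, 1100.0])
lam = complex(root[0], root[1])
print("eigenvalue estimate:", lam)   # decay rate = Re(lam), oscillation freq ~ Im(lam)/(2*pi)
```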
Voltage-sensitive dye imaging (VSDi) has revealed fundamental properties of neocortical processing at mesoscopic scales. Since VSDi signals report the average membrane potential, it seems natural to use a mean-field formalism to model such signals. Here, we investigate a mean-field model of networks of Adaptive Exponential (AdEx) integrate-and-fire neurons with conductance-based synaptic interactions. The AdEx model can capture the spiking response of different cell types, such as regular-spiking (RS) excitatory neurons and fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism together with a semi-analytic approach to the transfer function of AdEx neurons. We compare the predictions of this mean-field model to simulated networks of RS-FS cells, first at the level of the spontaneous activity of the network, which is well predicted by the mean-field model. Second, we investigate the response of the network to time-varying external input and show that the mean-field model accurately predicts the response time course of the population. One notable exception was that the tail of the response at long times was not well predicted, because the mean-field model does not include adaptation mechanisms. We conclude that the Master Equation formalism can yield mean-field models that accurately predict the behavior of nonlinear networks with conductance-based interactions and various electrophysiological properties, and should be a good candidate for modeling VSDi signals, to which both excitatory and inhibitory neurons contribute.
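The sketch below illustrates the general shape of such a first-order mean-field description: each population rate relaxes on a time scale T toward the value of a transfer function evaluated at its recurrent and external drive. The semi-analytic AdEx transfer function of the formalism above is replaced by a placeholder sigmoid, and all coupling parameters are illustrative assumptions, so this is only a structural sketch rather than the model used in that work.

```python
# Hedged sketch: first-order mean-field for two coupled populations
# (excitatory RS, inhibitory FS). The true formalism uses a semi-analytic
# AdEx transfer function; the sigmoid F below is only a placeholder and all
# parameters are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

T = 0.005  # relaxation time scale of the rate dynamics (s), assumed

def F(drive):
    """Placeholder transfer function: stationary rate (Hz) vs. effective drive."""
    return 30.0 / (1.0 + np.exp(-drive))

def rhs(t, nu, nu_ext=2.0, w_ee=1.0, w_ei=-2.0, w_ie=1.2, w_ii=-1.5):
    nu_e, nu_i = nu
    drive_e = nu_ext + w_ee * nu_e + w_ei * nu_i
    drive_i = nu_ext + w_ie * nu_e + w_ii * nu_i
    return [(F(drive_e) - nu_e) / T, (F(drive_i) - nu_i) / T]

sol = solve_ivp(rhs, (0.0, 0.2), [1.0, 1.0], max_step=1e-4)
print(sol.y[:, -1])  # (nu_e, nu_i) in Hz at the end, for these placeholder parameters
```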
Noise in spiking neurons is commonly modeled by a noisy input current or by generating output spikes stochastically with a voltage-dependent hazard rate (escape noise). While input noise lends itself to modeling biophysical noise processes, the phenomenological escape noise is mathematically more tractable. Using the level-crossing theory for differentiable Gaussian processes, we derive an approximate mapping between colored input noise and escape noise in leaky integrate-and-fire neurons. This mapping requires the first-passage-time (FPT) density of an overdamped Brownian particle driven by colored noise with respect to an arbitrarily moving boundary. Starting from the Wiener-Rice series for the FPT density, we apply the second-order decoupling approximation of Stratonovich to the case of moving boundaries and derive a simplified hazard-rate representation that is local in time and numerically efficient. This simplification requires the calculation of the non-stationary auto-correlation function of the level-crossing process: for exponentially correlated input noise (Ornstein-Uhlenbeck process), we obtain an exact formula for the zero-lag auto-correlation as a function of the noise parameters, the mean membrane potential and its speed, as well as an exponential approximation of the full auto-correlation function. The theory accurately predicts the FPT and interspike interval densities as well as the population activities obtained from simulations with a time-dependent stimulus or boundary. The agreement with simulations is strongly enhanced compared to a first-order decoupling approximation that neglects correlations between level crossings. The second-order approximation also improves upon a previously proposed theory in the subthreshold regime. Depending on a simplicity-accuracy trade-off, all considered approximations represent useful mappings from colored input noise to escape noise.
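As a baseline for the hazard-rate mapping described above, the first term of the Wiener-Rice series is the instantaneous Rice rate of upward threshold crossings of a differentiable Gaussian process. The sketch below evaluates that first-order hazard for a Gaussian membrane potential with given mean, mean speed, and standard deviations of V and its derivative; it does not implement the second-order decoupling approximation, and the equal-time independence of V and its derivative as well as all numerical values are assumptions.

```python
# Hedged sketch: first-order (Rice) hazard rate, i.e. the instantaneous rate of
# upward crossings of a moving boundary b(t) by a differentiable Gaussian
# membrane potential V(t). V and its derivative are assumed uncorrelated at
# equal times; parameter values are illustrative.
import numpy as np
from scipy.stats import norm

def rice_hazard(mu, mu_dot, b, b_dot, sigma_v, sigma_vdot):
    """Expected upcrossing rate (1/s) of boundary b by a Gaussian process V(t)."""
    z = (b - mu) / sigma_v                   # distance to boundary in units of std
    m = mu_dot - b_dot                       # mean speed relative to the boundary
    s = sigma_vdot
    # E[(V' - b')^+ | V = b] for a Gaussian derivative, times the density of V at b
    expected_pos_speed = m * norm.cdf(m / s) + s * norm.pdf(m / s)
    return norm.pdf(z) / sigma_v * expected_pos_speed

# Example (mV, mV/s): static threshold 15 mV, slowly depolarizing mean potential
print(rice_hazard(mu=10.0, mu_dot=5.0, b=15.0, b_dot=0.0,
                  sigma_v=4.0, sigma_vdot=400.0))
```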
Collective oscillations and their suppression by external stimulation are analyzed in a large-scale neural network consisting of two interacting populations of excitatory and inhibitory quadratic integrate-and-fire neurons. In the limit of an infinite number of neurons, the microscopic model of this network can be reduced to an exact low-dimensional system of mean-field equations. Bifurcation analysis of these equations reveals three different dynamic modes in a free network: a stable resting state, a stable limit cycle, and bistability with a coexisting resting state and a limit cycle. We show that in the limit cycle mode, high-frequency stimulation of an inhibitory population can stabilize an unstable resting state and effectively suppress collective oscillations. We also show that in the bistable mode, the dynamics of the network can be switched from a stable limit cycle to a stable resting state by applying an inhibitory pulse to the excitatory population. The results obtained from the mean-field equations are confirmed by numerical simulation of the microscopic model.
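For reference, the exact mean-field reduction of a single population of quadratic integrate-and-fire neurons with Lorentzian-distributed excitabilities takes a well-known two-variable form in the firing rate r and mean membrane potential v, sketched below. The network analyzed above uses a two-population excitatory-inhibitory extension of such equations with external stimulation terms; the single-population form shown here and all parameter values are illustrative.

```python
# Hedged sketch: exact mean-field (firing rate r, mean membrane potential v) for
# a single population of quadratic integrate-and-fire neurons with Lorentzian
# heterogeneity (Montbrio-Pazo-Roxin form). The study above uses a two-population
# excitatory-inhibitory extension; parameters here are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

tau, Delta, eta_bar, J = 0.01, 1.0, -5.0, 15.0  # time constant (s), width, mean drive, coupling

def rhs(t, y):
    r, v = y
    drdt = (Delta / (np.pi * tau) + 2.0 * r * v) / tau
    dvdt = (v**2 + eta_bar + J * tau * r - (np.pi * tau * r)**2) / tau
    return [drdt, dvdt]

sol = solve_ivp(rhs, (0.0, 0.5), [0.1, -2.0], max_step=1e-4)
print(sol.y[:, -1])  # (r, v): population rate and mean potential at the end
```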