Fast-spiking (FS) interneurons in the brain are self-innervated by powerful inhibitory GABAergic autaptic connections. By computational modelling, we investigate how autaptic inhibition regulates the firing response of such interneurons. Our results indicate that autaptic inhibition both raises the current threshold for action potential generation and modulates the input-output gain of FS interneurons. The autaptic transmission delay is identified as a key parameter that controls the firing patterns and determines the multistability regions of FS interneurons. Furthermore, we observe that neuronal noise influences the firing regulation of FS interneurons by autaptic inhibition and extends their dynamic range for encoding inputs. Importantly, autaptic inhibition modulates noise-induced irregular firing of FS interneurons, such that coherent firing appears at an optimal level of autaptic inhibition. Our results reveal the functional roles of autaptic inhibition in taming the firing dynamics of FS interneurons.
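The effect of delayed inhibitory self-feedback described in this abstract can be illustrated with a minimal leaky integrate-and-fire sketch (not the biophysical model used in the study; all parameter values below are illustrative choices): each spike of the neuron triggers, after the autaptic delay, an inhibitory conductance onto the neuron itself, which lowers its firing rate for a given input.

```python
import numpy as np

def lif_autapse(I_ext, g_aut=0.0, delay=0.005, T=2.0, dt=1e-4):
    """Leaky integrate-and-fire neuron with a delayed inhibitory autapse.

    Illustrative parameters only (not fitted to FS interneurons):
    membrane time constant 10 ms, threshold -50 mV, reset -65 mV,
    inhibitory reversal potential -75 mV, exponential synapse (tau 5 ms).
    I_ext is the external drive expressed in mV (i.e. R*I).
    """
    tau_m, E_L, V_th, V_reset, E_I, tau_s = 0.010, -65.0, -50.0, -65.0, -75.0, 0.005
    n, d_steps = int(T / dt), int(delay / dt)
    V, s = E_L, 0.0
    spikes = []
    spike_buf = np.zeros(n + d_steps + 1, dtype=bool)  # delayed self-feedback
    for i in range(n):
        s += dt * (-s / tau_s)        # synaptic conductance decay
        if spike_buf[i]:              # the neuron's own spike arrives after the delay
            s += 1.0
        dV = (-(V - E_L) + I_ext - g_aut * s * (V - E_I)) / tau_m
        V += dt * dV
        if V >= V_th:
            V = V_reset
            spikes.append(i * dt)
            spike_buf[i + d_steps] = True
    return len(spikes) / T            # mean firing rate (Hz)

rate_no_aut = lif_autapse(I_ext=16.0, g_aut=0.0)   # no self-inhibition
rate_aut = lif_autapse(I_ext=16.0, g_aut=0.5)      # with inhibitory autapse
```

With the same suprathreshold drive, the self-inhibited neuron fires at a lower rate, consistent with autaptic inhibition raising the effective current threshold and reducing the input-output gain.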
The importance of self-feedback autaptic transmission in modulating spike-time irregularity is still poorly understood. By using a biophysical model that incorporates autaptic coupling, we here show that self-innervation of neurons participates in the modulation of irregular neuronal firing, primarily by regulating the occurrence frequency of burst firing. In particular, we find that both excitatory and electrical autapses increase the occurrence of burst firing, thus reducing neuronal firing regularity. In contrast, inhibitory autapses suppress burst firing and therefore tend to improve the regularity of neuronal firing. Importantly, we show that these findings are independent of the firing properties of individual neurons, and as such can be observed for neurons operating in different modes. Our results provide an insightful mechanistic understanding of how different types of autapses shape irregular firing at the single-neuron level, and they highlight the functional importance of autaptic self-innervation in taming and modulating neurodynamics.
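The link between burst occurrence and firing irregularity invoked above can be made concrete with two standard spike-train statistics: the coefficient of variation (CV) of interspike intervals and the fraction of very short "burst" ISIs. The following sketch applies them to synthetic tonic and bursty trains; the 8 ms burst cutoff and the train parameters are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

def isi_stats(spike_times, burst_isi=0.008):
    """CV of interspike intervals and fraction of 'burst' ISIs,
    defined here as ISIs shorter than burst_isi (8 ms is an
    illustrative cutoff, not taken from the paper)."""
    isi = np.diff(np.sort(spike_times))
    cv = isi.std() / isi.mean()
    return cv, float(np.mean(isi < burst_isi))

regular = np.arange(0.0, 1.0, 0.02)                  # tonic 50 Hz train
bursty = np.concatenate([t + np.array([0.0, 0.003, 0.006])
                         for t in np.arange(0.0, 1.0, 0.06)])  # 3-spike bursts

cv_reg, bf_reg = isi_stats(regular)   # tonic: CV near 0, no burst ISIs
cv_bst, bf_bst = isi_stats(bursty)    # bursty: CV above 1, many burst ISIs
```

The bursty train has both a higher burst fraction and a higher CV, which is the direction of the effect the abstract attributes to excitatory and electrical autapses (more bursts, less regularity), with inhibitory autapses acting in reverse.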
The macroscopic dynamics of large populations of neurons can be mathematically analyzed using low-dimensional firing-rate or neural-mass models. However, these models fail to capture spike synchronization effects of stochastic spiking neurons such as the non-stationary population response to rapidly changing stimuli. Here, we derive low-dimensional firing-rate models for homogeneous populations of general renewal-type neurons, including integrate-and-fire models driven by white noise. Renewal models account for neuronal refractoriness and spike synchronization dynamics. The derivation is based on an eigenmode expansion of the associated refractory density equation, which generalizes previous spectral methods for Fokker-Planck equations to arbitrary renewal models. We find a simple relation between the eigenvalues, which determine the characteristic time scales of the firing rate dynamics, and the Laplace transform of the interspike interval density or the survival function of the renewal process. Analytical expressions for the Laplace transforms are readily available for many renewal models including the leaky integrate-and-fire model. Retaining only the first eigenmode already yields an adequate low-dimensional approximation of the firing-rate dynamics that captures spike synchronization effects and fast transient dynamics at stimulus onset. We explicitly demonstrate the validity of our model for a large homogeneous population of Poisson neurons with absolute refractoriness, and for other renewal models that admit an explicit analytical calculation of the eigenvalues. The eigenmode expansion presented here provides a systematic framework for novel firing-rate models in computational neuroscience based on spiking neuron dynamics with refractoriness.
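For the Poisson neuron with absolute refractoriness mentioned above, the interspike interval density has an elementary Laplace transform, so a relation of the kind stated in the abstract can be worked out explicitly. The sketch below assumes (as one plausible reading of that relation) that the nontrivial eigenvalues solve p̂(λ) = 1; under this assumption they reduce to Lambert-W branches, computed here by a self-contained Newton iteration. Parameter values are illustrative.

```python
import cmath

# Poisson neuron with absolute refractory period Delta and hazard rate r.
# ISI density:        p(t) = r * exp(-r (t - Delta))  for t > Delta, else 0.
# Laplace transform:  p_hat(s) = r * exp(-s Delta) / (s + r).
# Assuming the nontrivial eigenvalues lambda_n solve p_hat(lambda) = 1,
# they satisfy (lambda + r) exp(lambda Delta) = r, i.e.
#   lambda_n = W_n(r Delta e^{r Delta}) / Delta - r   (Lambert-W branches),
# with the principal branch giving the trivial stationary mode lambda = 0.

def lambert_w_branch(z, k, iters=80):
    """Newton iteration for the k-th branch of Lambert W (w e^w = z),
    started from the standard asymptotic guess log z + 2*pi*i*k - log(...)."""
    L = cmath.log(z) + 2j * cmath.pi * k
    w = L - cmath.log(L)
    for _ in range(iters):
        ew = cmath.exp(w)
        w -= (w * ew - z) / (ew * (1.0 + w))
    return w

r, Delta = 100.0, 0.004                     # rate 100 /s, refractoriness 4 ms
z = r * Delta * cmath.exp(r * Delta)
lam1 = lambert_w_branch(z, 1) / Delta - r   # slowest oscillatory eigenmode
```

The resulting eigenvalue has a negative real part (a decaying mode) and a nonzero imaginary part, i.e. a damped oscillation of the population rate, which is the kind of transient spike-synchronization dynamics the abstract describes.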
Heuristic firing-rate models are the dominant modeling framework for understanding cortical computations. Despite their success, these models fall short of capturing spike synchronization effects, linking to biophysical parameters, and describing finite-size fluctuations. In this opinion article, we propose that the refractory density method (RDM), also known as age-structured population dynamics or quasi-renewal theory, yields a powerful theoretical framework for building rate-based models of mesoscopic neural populations from realistic neuron dynamics at the microscopic level. We review recent advances achieved with the RDM in obtaining efficient population density equations for networks of generalized integrate-and-fire (GIF) neurons -- a class of neuron models that has been successfully fitted to various cell types. The theory not only predicts the nonstationary dynamics of large populations of neurons but also permits an extension to finite-size populations and a systematic reduction to low-dimensional rate dynamics. These new types of rate models will allow a re-examination of models of cortical computations under biological constraints.
We consider a classical space-clamped Hodgkin-Huxley model neuron stimulated by synaptic excitation and inhibition with conductances represented by Ornstein-Uhlenbeck processes. Using numerical solutions of the stochastic model system obtained by an Euler method, it is found that, with excitation only, there is a critical value of the steady state excitatory conductance for repetitive spiking without noise, and for values of the conductance near the critical value small noise has a powerfully inhibitory effect. For a given level of inhibition there is also a critical value of the steady state excitatory conductance for repetitive firing, and it is demonstrated that noise in the excitatory or inhibitory processes, or both, can powerfully inhibit spiking. Furthermore, near the critical value, inverse stochastic resonance was observed when noise was present only in the inhibitory input process. The system of 27 coupled deterministic differential equations for the approximate first and second order moments of the 6-dimensional model is derived. The moment differential equations are solved using Runge-Kutta methods and the solutions are compared with the results obtained by simulation for various sets of parameters, including some with conductances obtained by experiment on pyramidal cells of rat prefrontal cortex. The mean and variance obtained from simulation are in good agreement with those from the moment equations when there is spiking induced by strong stimulation and relatively small noise, or when the voltage is fluctuating at subthreshold levels. In the occasional spike mode sometimes exhibited by spinal motoneurons and cortical pyramidal cells, the assumptions underlying the moment equation approach are not satisfied.
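The Ornstein-Uhlenbeck conductance input and the moment comparison described above can be sketched in a few lines: an Euler-Maruyama simulation of an OU conductance, checked against the exact stationary moments (which, for this linear SDE, the moment equations reproduce exactly). Parameter values are illustrative, not those of the cited experiments.

```python
import numpy as np

# Euler-Maruyama simulation of an Ornstein-Uhlenbeck synaptic conductance,
# as used for the excitatory input g_e(t) in the model described above:
#   dg = (g0 - g)/tau dt + sigma * sqrt(2/tau) dW
# Stationary mean is g0 and stationary variance is sigma**2.

rng = np.random.default_rng(0)
g0, tau, sigma = 0.5, 0.005, 0.1      # mean conductance, 5 ms, noise amplitude
dt, n_steps, n_trials = 1e-5, 4000, 2000

g = np.full(n_trials, g0)             # start the ensemble at the stationary mean
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_trials)
    g += (g0 - g) / tau * dt + sigma * np.sqrt(2.0 / tau) * dW

emp_mean, emp_var = g.mean(), g.var() # compare with g0 and sigma**2
```

With 2000 trials the empirical ensemble mean and variance match the analytical stationary moments to within sampling error, the same kind of simulation-versus-moment-equation comparison the abstract reports for the full six-dimensional model.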
In the sensation of tones, colors, and other stimuli, the surround inhibition (or lateral inhibition) mechanism is crucial. The mechanism enhances the signal of the strongest tone, color, or other stimulus by reducing and inhibiting the surrounding signals, since the latter are less important. This surround inhibition mechanism is well studied in the physiology of sensory systems. We construct a neural network with two hidden layers in addition to the input and output layers, with 60 neurons (units) in each of the four layers. The label (correct answer) is prepared from an input signal by applying the operation of the Hartline mechanism seven times, that is, by sending inhibitory signals from the neighboring neurons and then amplifying all the signals. The implications obtained by deep learning with this neural network are compared with the standard physiological understanding of the surround inhibition mechanism.
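The label-generating procedure described above (seven rounds of neighbor inhibition followed by amplification) can be sketched as follows for a 60-unit layer; the inhibition strength k and the rescaling rule are illustrative assumptions, not the exact operations of the paper.

```python
import numpy as np

def hartline_step(x, k=0.2):
    """One round of lateral (Hartline-type) inhibition: each unit is
    reduced by a fraction k of its two neighbours' activity, clipped at
    zero, then all signals are amplified back to the original total.
    (k and the rescaling rule are illustrative choices.)"""
    left, right = np.roll(x, 1), np.roll(x, -1)
    left[0] = 0.0
    right[-1] = 0.0                         # no wrap-around at the edges
    y = np.clip(x - k * (left + right), 0.0, None)
    return y * (x.sum() / y.sum())          # amplification step

signal = np.exp(-0.5 * ((np.arange(60) - 30) / 6.0) ** 2)  # broad input bump
out = signal.copy()
for _ in range(7):                          # seven Hartline iterations
    out = hartline_step(out)
```

After seven iterations the broad bump is sharpened: the peak grows at the expense of its surround while the total signal is preserved, which is the contrast-enhancing effect of surround inhibition the network is trained to reproduce.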