Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory pre-synaptic neurons, which determines the firing activity of the stimulated neuron. To investigate the influence of inhibitory stimulation on the firing time statistics, we consider Leaky Integrate-and-Fire neurons subject to inhibitory instantaneous post-synaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude; they apply equally to finite-amplitude post-synaptic potentials and can therefore capture the effect of rare and large spikes. The methods developed here also reproduce the average firing properties of heterogeneous neuronal populations.
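As a numerical counterpart to such calculations, the sketch below simulates a single LIF neuron receiving inhibitory instantaneous post-synaptic potentials (Poisson-distributed delta kicks with exponentially distributed weights) and estimates the firing rate and coefficient of variation from the inter-spike intervals. All parameter values and the choice of weight distribution are illustrative assumptions, not those of the exact treatment.

```python
# Monte Carlo sketch: LIF neuron driven by a constant drive plus inhibitory
# instantaneous post-synaptic potentials (Poisson shot noise with random weights).
import numpy as np

rng = np.random.default_rng(0)

tau_m, v_th, v_reset = 20.0, 1.0, 0.0     # membrane time constant (ms), threshold, reset
mu = 1.2                                  # constant suprathreshold drive (assumed value)
nu_i, a_i = 2.0e-3, 0.1                   # inhibitory input rate (1/ms), mean kick size (assumed)

dt, t_max = 0.01, 5.0e4                   # time step and simulation length (ms)
v, spike_times = 0.0, []

for step in range(int(t_max / dt)):
    v += dt * (mu - v) / tau_m            # leaky integration of the mean drive
    n_kicks = rng.poisson(nu_i * dt)      # inhibitory Poisson arrivals in this step
    if n_kicks:
        v -= rng.exponential(a_i, size=n_kicks).sum()   # exponentially distributed weights
    if v >= v_th:                         # threshold crossing: register spike and reset
        spike_times.append(step * dt)
        v = v_reset

isi = np.diff(spike_times)
print(f"rate ~ {1.0e3 / isi.mean():.2f} Hz, CV ~ {isi.std() / isi.mean():.3f}")
```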
The brain is characterized by a strong heterogeneity of inhibitory neurons. We report that spiking neural networks display a resonance to the heterogeneity of inhibitory neurons, with optimal input/output responsiveness occurring for levels of heterogeneity similar to those found experimentally in the cerebral cortex. A heterogeneous mean-field model predicts this optimal responsiveness. Moreover, we show that heterogeneity gives rise to new dynamical regimes that are not present in the equivalent homogeneous system, such as sparsely synchronous collective oscillations.
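How responsiveness can be quantified as a function of heterogeneity is illustrated by the toy calculation below (it is not the heterogeneous mean-field model of the study): the slope of the population-averaged input/output curve of sigmoidal units, whose excitabilities are drawn from a Gaussian of width sigma, is measured at a fixed working point. With a working point well below the mean threshold, the measured gain in this toy setting is largest at an intermediate heterogeneity.

```python
# Toy population gain vs. heterogeneity (illustrative; model and parameters assumed).
import numpy as np

rng = np.random.default_rng(1)
n_units = 10_000
x0 = -3.0                        # working-point input, well below the mean threshold (assumed)

def population_gain(sigma, dx=1e-3):
    thresholds = rng.normal(0.0, sigma, n_units)      # heterogeneous excitabilities
    f = lambda x: np.mean(1.0 / (1.0 + np.exp(-(x - thresholds))))  # population output
    return (f(x0 + dx) - f(x0 - dx)) / (2 * dx)       # numerical slope = responsiveness

for sigma in [0.1, 0.5, 1.0, 2.0, 4.0, 8.0]:
    print(f"heterogeneity sigma={sigma:3.1f}  population gain={population_gain(sigma):.4f}")
```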
Experimental and numerical results suggest that the brain can be viewed as a system acting close to a critical point, as confirmed by scale-free distributions of relevant quantities in a variety of different systems and models. Less attention has been devoted to the temporal correlation functions of brain activity in different, healthy and pathological, conditions. Here we perform this analysis by means of a model with short- and long-term plasticity which implements the novel feature, found experimentally, of different recovery rates for excitatory and inhibitory neurons. We highlight the important role played by inhibitory neurons in the supercritical state: we detect an unexpected oscillatory behaviour of the correlation decay, whose frequency depends on the fraction of inhibitory neurons and on their connectivity degree. This behaviour can be rationalized by the observation that activity bursts become more frequent and smaller in amplitude as inhibition becomes more relevant.
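The correlation analysis itself can be made concrete with a short sketch: given an activity time series (e.g. the number of firing neurons per time bin), the normalized autocorrelation function is computed, and an oscillatory decay shows up as damped oscillations around zero. The plasticity model is not reproduced here; the synthetic signal below is only a stand-in.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Normalized autocorrelation function of a stationary activity time series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

# Synthetic stationary stand-in for the network activity: an AR(2) process whose
# autocorrelation decays as a damped oscillation (period and damping are arbitrary).
rng = np.random.default_rng(2)
r, omega = 0.98, 2 * np.pi / 50
a1, a2 = 2 * r * np.cos(omega), -r * r
x = np.zeros(20_000)
for k in range(2, x.size):
    x[k] = a1 * x[k - 1] + a2 * x[k - 2] + rng.normal()

acf = autocorrelation(x[1000:], max_lag=300)            # discard the initial transient
print("first zero crossing of the ACF at lag", int(np.argmax(acf < 0)))
```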
Oscillations are a hallmark of neural population activity in various brain regions, with a spectrum covering a wide range of frequencies. Within this spectrum, gamma oscillations have received particular attention due to their ubiquitous nature and their correlation with higher brain functions. Recently, it has been reported that gamma oscillations in the hippocampus of behaving rodents are segregated into two distinct frequency bands: slow and fast. These two gamma rhythms correspond to different states of the network, but their origin has not yet been clarified. Here, we show theoretically and numerically that a single inhibitory population can give rise to coexisting slow and fast gamma rhythms corresponding to collective oscillations of a balanced spiking network. The slow and fast gamma rhythms are generated via two different mechanisms: the fast one is driven by coordinated tonic neural firing, the slow one by endogenous fluctuations due to irregular neural activity. We show that almost instantaneous stimulations can switch the collective gamma oscillations from slow to fast and vice versa. Furthermore, to make closer contact with the experimental observations, we consider the modulation of the gamma rhythms induced by a slower (theta) rhythm driving the network dynamics. In this context, depending on the strength of the forcing, we observe phase-amplitude and phase-phase coupling between the fast and slow gamma oscillations and the theta forcing. Phase-phase coupling reveals different theta-phase preferences for the two coexisting gamma rhythms.
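The phase-amplitude coupling mentioned above can be quantified, for instance, with a Tort-style modulation index; the sketch below applies this standard measure to a synthetic theta-modulated gamma signal. The band edges, frequencies and the signal itself are illustrative choices, not outputs of the network model.

```python
# Phase-amplitude coupling via a Tort-style modulation index on a synthetic signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                   # sampling rate (Hz)
t = np.arange(0, 20.0, 1.0 / fs)
rng = np.random.default_rng(3)
theta = np.sin(2 * np.pi * 8 * t)             # 8 Hz theta forcing
gamma = (1 + 0.6 * theta) * np.sin(2 * np.pi * 60 * t)   # 60 Hz gamma, theta-modulated
lfp = theta + 0.3 * gamma + 0.2 * rng.standard_normal(t.size)

def bandpass(x, lo, hi):
    b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

phase = np.angle(hilbert(bandpass(lfp, 4, 12)))        # theta phase
amp = np.abs(hilbert(bandpass(lfp, 40, 80)))           # gamma amplitude envelope

# mean gamma amplitude per theta-phase bin, then normalized distance from uniformity
n_bins = 18
bins = np.digitize(phase, np.linspace(-np.pi, np.pi, n_bins + 1)) - 1
mean_amp = np.array([amp[bins == k].mean() for k in range(n_bins)])
p = mean_amp / mean_amp.sum()
modulation_index = (np.log(n_bins) + (p * np.log(p)).sum()) / np.log(n_bins)
print(f"modulation index ~ {modulation_index:.3f}")
```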
In this paper, we clarify the mechanisms underlying a general phenomenon present in pulse-coupled heterogeneous inhibitory networks: inhibition can induce not only suppression of the neural activity, as expected, but it can also promote neural reactivation. In particular, for globally coupled systems, the number of firing neurons decreases monotonically upon increasing the strength of inhibition (neurons' death). However, the random pruning of the connections is able to reverse the action of inhibition, i.e. in a sparse network a sufficiently strong synaptic strength can, surprisingly, promote rather than depress the activity of the neurons (neurons' rebirth). Thus the number of firing neurons exhibits a minimum at some intermediate synaptic strength. We show that this minimum signals a transition from a regime dominated by the neurons with higher firing activity to a phase where all neurons are effectively sub-threshold and their irregular firing is driven by current fluctuations. We explain the origin of the transition by deriving an analytic mean-field formulation of the problem that provides the fraction of active neurons as well as the first two moments of their firing statistics. The introduction of a synaptic time scale does not modify the main aspects of the reported phenomenon. However, for sufficiently slow synapses the transition becomes dramatic: the system passes from a perfectly regular evolution to irregular bursting dynamics. In this latter regime the model provides predictions consistent with experimental findings for a specific class of neurons, namely the medium spiny neurons in the striatum.
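A minimal sketch of the quantity discussed here is given below: a sparse network of pulse-coupled inhibitory LIF neurons with heterogeneous excitabilities is simulated and, for each inhibition strength g, the number of neurons that fire at least once is counted. Network size, coupling normalization and all parameter values are illustrative assumptions, so the precise location (or presence) of a minimum will depend on these choices.

```python
# Sparse pulse-coupled inhibitory LIF network: count active (firing) neurons vs. g.
import numpy as np

rng = np.random.default_rng(4)
n, k_in = 400, 20                          # network size and in-degree (assumed)
tau, v_th, v_re = 20.0, 1.0, 0.0           # membrane time constant (ms), threshold, reset
a = rng.uniform(1.05, 1.5, n)              # heterogeneous suprathreshold excitabilities

# sparse directed connectivity: each neuron receives exactly k_in inhibitory inputs
conn = np.zeros((n, n), dtype=bool)
for i in range(n):
    pre = rng.choice(np.delete(np.arange(n), i), k_in, replace=False)
    conn[i, pre] = True                    # conn[i, j]: neuron j inhibits neuron i

def active_neurons(g, t_max=2000.0, dt=0.1):
    """Number of neurons that fire at least once during the simulation window."""
    v = rng.uniform(0.0, 1.0, n)
    fired_ever = np.zeros(n, dtype=bool)
    for _ in range(int(t_max / dt)):
        v += dt * (a - v) / tau                            # leaky integration
        spikes = v >= v_th
        if spikes.any():
            fired_ever |= spikes
            v[spikes] = v_re                               # reset the spiking neurons
            v -= (g / k_in) * conn[:, spikes].sum(axis=1)  # inhibitory delta pulses
    return int(fired_ever.sum())

for g in [0.0, 0.5, 2.0, 8.0]:
    print(f"g = {g:4.1f}   active neurons: {active_neurons(g)} / {n}")
```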
A major goal in neuroscience is to understand how populations of neurons code for stimuli or actions. While the number of neurons that can be recorded simultaneously is increasing at a fast pace, in most cases these recordings cannot access a complete population. In particular, it is hard to simultaneously record all the neurons of the same type in a given area. Recent progress has made it possible to profile each recorded neuron in a given area thanks to genetic and physiological tools, and to pool together recordings from neurons of the same type across different experimental sessions. However, it is unclear how to infer the activity of a full population of neurons of the same type from these sequential recordings. Neural networks exhibit collective behaviour, such as noise correlations and synchronous activity, that is not directly captured by a conditionally independent model that would simply put together the spike trains from sequential recordings. Here we show that we can infer the activity of a full population of retinal ganglion cells from sequential recordings, using a novel method based on copula distributions and maximum entropy modeling. From just the spiking response of each ganglion cell to a repeated stimulus, and a few pairwise recordings, we could predict the noise correlations using copulas, and then the full activity of a large population of ganglion cells of the same type using maximum entropy modeling. Remarkably, we could generalize to predict the population responses to different stimuli and even to different experiments. We could therefore use our method to construct a very large population by merging cell responses from different experiments. We predicted synchronous activity accurately and showed that it grows substantially with the number of neurons. This approach is a promising way to infer population activity from sequential recordings in sensory areas.
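A small building block of this kind of approach can be sketched as follows (this is not the authors' full copula + maximum-entropy pipeline): the binary responses of two sequentially recorded cells are coupled through a Gaussian copula, preserving each cell's firing probability per time bin, while a latent correlation, which in practice would be fitted to the pairwise recordings, induces noise correlations between the sampled spike trains.

```python
# Gaussian-copula coupling of two binary spike trains (illustrative parameters).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_bins = 100_000
p1, p2 = 0.05, 0.12            # firing probabilities per bin (assumed, would come from data)
rho_latent = 0.3               # latent Gaussian correlation (assumed, would be fitted)

# correlated standard Gaussians
cov = np.array([[1.0, rho_latent], [rho_latent, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n_bins)

# dichotomize: a spike occurs when the latent variable exceeds the quantile
# matching each cell's marginal firing probability
spikes = (z > norm.ppf([1 - p1, 1 - p2])).astype(int)

print("empirical rates:", spikes.mean(axis=0))                 # ~ [p1, p2]
print("induced noise correlation:", np.corrcoef(spikes.T)[0, 1])
```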