
A tractable method for describing complex couplings between neurons and population rate

Added by Thierry Mora
Publication date: 2016
Fields: Biology, Physics
Language: English





Neurons within a population are strongly correlated, but how to simply capture these correlations is still a matter of debate. Recent studies have shown that the activity of each cell is influenced by the population rate, defined as the summed activity of all neurons in the population. However, an explicit, tractable model for these interactions is still lacking. Here we build a probabilistic model of population activity that reproduces the firing rate of each cell, the distribution of the population rate, and the linear coupling between them. This model is tractable, meaning that its parameters can be learned in a few seconds on a standard computer even for large population recordings. We inferred our model for a population of 160 neurons in the salamander retina. In this population, single-cell firing rates depended in unexpected ways on the population rate. In particular, some cells had a preferred population rate at which they were most likely to fire. These complex dependencies could not be explained by a linear coupling between the cell and the population rate. We designed a more general, still tractable model that could fully account for these non-linear dependencies. We thus provide a simple and computationally tractable way to learn models that reproduce the dependence of each neuron on the population rate.
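To make the construction concrete, the following is a minimal sketch (not the authors' code) of one natural parametrization consistent with the abstract: binary neurons sigma_i with single-cell fields h_i, a linear coupling alpha_i between each cell and the population rate K = sum_j sigma_j, and a potential V(K) shaping the distribution of K. All parameter values are illustrative placeholders, and sampling is done with a plain Gibbs sweep rather than the tractable learning scheme described in the paper.

import numpy as np

rng = np.random.default_rng(0)
N = 160                            # population size, as in the retinal recording
h = rng.normal(-2.0, 0.5, N)       # hypothetical single-cell fields
alpha = rng.normal(0.0, 0.05, N)   # hypothetical linear couplings to the population rate
V = -0.01 * np.arange(N + 1) ** 2  # hypothetical potential shaping P(K)

def log_p_unnormalized(sigma):
    # Unnormalized log-probability of a binary pattern sigma (length-N 0/1 array).
    K = sigma.sum()
    return h @ sigma + K * (alpha @ sigma) + V[K]

def gibbs_sweep(sigma):
    # Resample each neuron in turn given the state of all the others.
    for i in range(N):
        sigma[i] = 0
        lp0 = log_p_unnormalized(sigma)
        sigma[i] = 1
        lp1 = log_p_unnormalized(sigma)
        p_fire = 1.0 / (1.0 + np.exp(lp0 - lp1))
        sigma[i] = int(rng.random() < p_fire)
    return sigma

sigma = (rng.random(N) < 0.05).astype(int)
for _ in range(200):
    sigma = gibbs_sweep(sigma)
print("sampled population rate K =", sigma.sum())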

Related research


Oscillations are a hallmark of neural population activity in various brain regions, with a spectrum covering a wide range of frequencies. Within this spectrum, gamma oscillations have received particular attention due to their ubiquitous nature and their correlation with higher brain functions. Recently, it has been reported that gamma oscillations in the hippocampus of behaving rodents are segregated into two distinct frequency bands: slow and fast. These two gamma rhythms correspond to different states of the network, but their origin has not yet been clarified. Here, we show theoretically and numerically that a single inhibitory population can give rise to coexisting slow and fast gamma rhythms corresponding to collective oscillations of a balanced spiking network. The slow and fast gamma rhythms are generated via two different mechanisms: the fast one is driven by coordinated tonic neural firing and the slow one by endogenous fluctuations due to irregular neural activity. We show that almost instantaneous stimulations can switch the collective gamma oscillations from slow to fast and vice versa. Furthermore, to make closer contact with the experimental observations, we consider the modulation of the gamma rhythms induced by a slower (theta) rhythm driving the network dynamics. In this context, depending on the strength of the forcing, we observe phase-amplitude and phase-phase coupling between the fast and slow gamma oscillations and the theta forcing. Phase-phase coupling reveals different theta-phase preferences for the two coexisting gamma rhythms.
Correlations in sensory neural networks have both extrinsic and intrinsic origins. Extrinsic or stimulus correlations arise from shared inputs to the network, and thus depend strongly on the stimulus ensemble. Intrinsic or noise correlations reflect biophysical mechanisms of interactions between neurons, which are expected to be robust to changes of the stimulus ensemble. Despite the importance of this distinction for understanding how sensory networks encode information collectively, no method exists to reliably separate intrinsic interactions from extrinsic correlations in neural activity data, limiting our ability to build predictive models of the network response. In this paper we introduce a general strategy to infer population models of interacting neurons that collectively encode stimulus information. The key to disentangling intrinsic from extrinsic correlations is to infer the couplings between neurons separately from the encoding model, and to combine the two using corrections calculated in a mean-field approximation. We demonstrate the effectiveness of this approach on retinal recordings. The same coupling network is inferred from responses to radically different stimulus ensembles, showing that these couplings indeed reflect stimulus-independent interactions between neurons. The inferred model accurately predicts the collective response of retinal ganglion cell populations as a function of the stimulus.
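As a point of reference for the coupling-inference step, here is a minimal sketch (an assumption, not the authors' pipeline) of one standard way to estimate pairwise couplings directly from spike trains: pseudo-likelihood maximization, i.e. predicting each neuron's activity in a time bin from the simultaneous activity of all other neurons with regularized logistic regression. The mean-field correction that combines these couplings with a stimulus-encoding model, as described in the abstract, is not reproduced here.

import numpy as np
from sklearn.linear_model import LogisticRegression

def infer_couplings(spikes, C=1.0):
    # spikes: (n_bins, n_neurons) binary array of binned spike trains.
    # Returns an (n_neurons, n_neurons) matrix of pseudo-likelihood couplings.
    n_bins, n_neurons = spikes.shape
    J = np.zeros((n_neurons, n_neurons))
    for i in range(n_neurons):
        others = np.delete(np.arange(n_neurons), i)
        clf = LogisticRegression(C=C, max_iter=1000)
        clf.fit(spikes[:, others], spikes[:, i])
        J[i, others] = clf.coef_[0]
    return 0.5 * (J + J.T)  # symmetrize, since couplings are expected to be symmetric

# Toy usage with random data, just to show the expected shapes.
rng = np.random.default_rng(1)
toy = (rng.random((5000, 20)) < 0.1).astype(int)
print(infer_couplings(toy).shape)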
The brain is characterized by a strong heterogeneity of inhibitory neurons. We report that spiking neural networks display a resonance to the heterogeneity of inhibitory neurons, with optimal input/output responsiveness occurring for levels of heterogeneity similar to those found experimentally in cerebral cortex. A heterogeneous mean-field model predicts such optimal responsiveness. Moreover, we show that heterogeneity gives rise to new dynamical regimes that were not present in the equivalent homogeneous system, such as sparsely synchronous collective oscillations.
Experimental and numerical results suggest that the brain can be viewed as a system acting close to a critical point, as confirmed by scale-free distributions of relevant quantities in a variety of different systems and models. Less attention has been paid to the temporal correlation functions of brain activity in different, healthy and pathological, conditions. Here we perform this analysis by means of a model with short- and long-term plasticity which implements the experimentally observed feature of different recovery rates for excitatory and inhibitory neurons. We highlight the important role played by inhibitory neurons in the supercritical state: we detect an unexpected oscillatory behaviour of the correlation decay, whose frequency depends on the fraction of inhibitory neurons and their connectivity degree. This behaviour can be rationalized by the observation that bursts in activity become more frequent and smaller in amplitude as inhibition becomes more relevant.
Cable theory has been developed over the last decades, usually assuming that the extracellular space around membranes is a perfect resistor. However, extracellular media may display more complex electrical properties due to various phenomena, such as polarization, ionic diffusion or capacitive effects, but their impact on cable properties is not known. In this paper, we generalize cable theory to membranes embedded in arbitrarily complex extracellular media. We outline the generalized cable equations, then consider specific cases. The simplest case is a resistive medium, for which the equations recover the traditional cable equations. We show that for more complex media, for example in the presence of ionic diffusion, the impact on cable properties such as voltage attenuation can be significant. We illustrate this numerically by comparing the generalized cable to the traditional cable. We conclude that the nature of intracellular and extracellular media may have a strong influence on cable filtering as well as on the passive integrative properties of neurons.
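For reference, the resistive-medium limit mentioned above recovers the classical passive cable equation; in standard notation (the paper's generalized equations are not reproduced here):

\lambda^2 \frac{\partial^2 V}{\partial x^2} = \tau_m \frac{\partial V}{\partial t} + V,
\qquad \lambda = \sqrt{r_m / r_i}, \qquad \tau_m = r_m c_m,

where V is the membrane potential relative to rest, \lambda the electrotonic length constant, and \tau_m the membrane time constant, with r_m, c_m and r_i the membrane resistance, membrane capacitance and axial resistance in the usual per-unit-length notation.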