The $1/f$-like decay observed in the power spectrum of electrophysiological signals, together with the scale-free statistics of so-called neuronal avalanches, constitutes evidence of criticality in neuronal systems. Recent in vitro studies have shown that avalanche dynamics at criticality corresponds to a specific balance of excitation and inhibition, suggesting that this balance is a basic feature of the critical state of neuronal networks. In particular, a lack of inhibition significantly alters the temporal structure of spontaneous avalanche activity and leads to an anomalous abundance of large avalanches. Here we study the relationship between network inhibition and the scaling exponent $\beta$ of the power spectral density (PSD) of avalanche activity in a neuronal network model inspired by Self-Organized Criticality (SOC). We find that this scaling exponent depends on the percentage of inhibitory synapses and tends to the value $\beta = 1$ when about 30% of the synapses are inhibitory. More specifically, $\beta$ is close to $2$, i.e. Brownian noise, for purely excitatory networks and decreases towards values in the interval $[1, 1.4]$ as the percentage of inhibitory synapses grows from 20% to 30%, in agreement with experimental findings. These results indicate that the level of inhibition affects the frequency spectrum of resting brain activity and suggest the analysis of the PSD scaling behavior as a possible tool to study pathological conditions.
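As a rough illustration of how such a PSD scaling exponent can be extracted from a recorded activity signal (a minimal sketch, not the authors' pipeline; the stand-in signal, sampling rate and fitting band are placeholder assumptions):

```python
# Minimal sketch (not the authors' code): estimate the PSD scaling exponent beta,
# assuming S(f) ~ 1/f^beta over an intermediate frequency band.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)

# Stand-in signal: integrated white noise (Brownian noise), which should yield
# beta close to 2, the value quoted above for purely excitatory networks.
activity = np.cumsum(rng.standard_normal(2**16))

f, S = welch(activity, fs=1.0, nperseg=4096)   # sampling rate in arbitrary units

# Fit log S = -beta * log f + c away from the DC bin and the high-frequency cutoff
# (the band limits are illustrative choices, not values from the paper).
band = (f > 1e-3) & (f < 1e-1)
slope, _ = np.polyfit(np.log10(f[band]), np.log10(S[band]), 1)
print(f"estimated beta = {-slope:.2f}")
```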
We consider two neuronal networks coupled by long-range excitatory interactions. Oscillations in the gamma frequency band are generated within each network by local inhibition. When long-range excitation is weak, these oscillations phase-lock with a phase-shift dependent on the strength of local inhibition. Increasing the strength of long-range excitation induces a transition to chaos via period-doubling or quasi-periodic scenarios. In the chaotic regime oscillatory activity undergoes fast temporal decorrelation. The generality of these dynamical properties is assessed in firing-rate models as well as in large networks of conductance-based neurons.
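A bare-bones sketch of how such a two-network firing-rate caricature could be integrated numerically (illustrative only; the sigmoid gains, time constants and long-range coupling strength g are placeholder values, not the parameters of the model described above):

```python
# Illustrative skeleton (not the paper's model): two excitatory/inhibitory rate units,
# each with local inhibition, coupled by long-range excitation of strength g.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(g, T=2000.0, dt=0.1, tau_e=1.0, tau_i=2.0):
    e = np.array([0.10, 0.20])      # excitatory rates of networks 1 and 2
    i = np.array([0.05, 0.10])      # inhibitory rates of networks 1 and 2
    traj = []
    for _ in range(int(T / dt)):
        cross = g * e[::-1]         # long-range excitation from the other network
        de = (-e + sigmoid(12.0 * e - 10.0 * i + cross + 1.0)) / tau_e
        di = (-i + sigmoid(10.0 * e - 2.0 * i)) / tau_i
        e, i = e + dt * de, i + dt * di
        traj.append(e.copy())
    return np.array(traj)

rates = simulate(g=0.5)             # sweep g to probe locking vs. irregular dynamics
print(rates[-5:])
```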
The output signal is examined for the Jacobi neuronal model, which is characterized by input-dependent multiplicative noise. The dependence of the noise on the rate of inhibition turns out to be of primary importance for observing maxima both in the output firing rate and in the diffusion coefficient of the spike count and, simultaneously, a minimum in the coefficient of variation (Fano factor). Moreover, we observe that an increase in the rate of inhibition can raise the degree of coherence computed from the power spectrum. This means that inhibition can enhance the coherence, and thus the information transmission between input and output, in this neuronal model. Finally, we stress that the firing rate, the coefficient of variation and the diffusion coefficient of the spike count cannot be used as the only indicators of coherence resonance without considering the power spectrum.
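For readers who want to reproduce the kind of spike-train statistics compared above, a minimal sketch (generic point-process estimators applied to a stand-in Poisson train, not a simulation of the Jacobi model itself):

```python
# Minimal sketch (generic estimators, not a simulation of the Jacobi model): firing
# rate, coefficient of variation of interspike intervals, Fano factor and diffusion
# coefficient of the spike count, computed from any list of spike times.
import numpy as np

def spike_train_stats(spike_times, T_window, t_total):
    isi = np.diff(spike_times)
    rate = len(spike_times) / t_total
    cv = isi.std() / isi.mean()                    # CV of interspike intervals
    edges = np.arange(0.0, t_total + T_window, T_window)
    counts, _ = np.histogram(spike_times, bins=edges)
    fano = counts.var() / counts.mean()            # Fano factor of the spike count
    diff_coeff = counts.var() / (2.0 * T_window)   # rough diffusion-coefficient estimate
    return rate, cv, fano, diff_coeff

# Usage with a stand-in Poisson spike train (5 spikes/s over 200 s):
rng = np.random.default_rng(1)
t_total = 200.0
spikes = np.sort(rng.uniform(0.0, t_total, size=rng.poisson(5.0 * t_total)))
print(spike_train_stats(spikes, T_window=1.0, t_total=t_total))
```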
Neuronal networks are controlled by a combination of the dynamics of individual neurons and the connectivity of the network that links them together. We study a minimal model of the preBötzinger complex, a small neuronal network that controls the breathing rhythm of mammals through periodic firing bursts. We show that the properties of such a randomly connected network of identical excitatory neurons are fundamentally different from those of uniformly connected neuronal networks as described by mean-field theory. We show that (i) the connectivity properties of the network determine the location of emergent pacemakers that trigger the firing bursts, and (ii) the collective desensitization that terminates the firing bursts is likewise determined by the network connectivity, through k-core clusters of neurons.
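As an illustration of the graph-theoretic quantity invoked here, a minimal sketch that extracts the k-core of an assumed sparse random network (the network and the value of k are placeholders, not the paper's model):

```python
# Minimal sketch (assumed random connectivity, not the paper's network): extract the
# k-core, the largest subgraph in which every neuron keeps at least k connections.
import networkx as nx

n, p, k = 300, 0.02, 3                       # illustrative size, wiring probability, core order
G = nx.gnp_random_graph(n, p, seed=0)        # sparse Erdos-Renyi connectivity
core = nx.k_core(G, k=k)
print(f"{core.number_of_nodes()} of {n} neurons belong to the {k}-core")
```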
The collective dynamics of a network of excitable nodes changes dramatically when inhibitory nodes are introduced. We consider inhibitory nodes that may be activated just like excitatory nodes but, upon activation, decrease the probability of activation of their network neighbors. We show that, although the direct effect of inhibitory nodes is to decrease activity, the collective dynamics becomes self-sustaining. We explain this counterintuitive result by defining and analyzing a branching function, which may be thought of as an activity-dependent branching ratio. The shape of the branching function implies that, for a range of global coupling parameters, the dynamics are self-sustaining. Within the self-sustaining region of parameter space lies a critical line along which the dynamics take the form of avalanches with universal scaling of size and duration, embedded in ceaseless time series of activity. Our analyses, confirmed by numerical simulation, suggest that inhibition may play a counterintuitive role in excitable networks.
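A minimal sketch of one way an activity-dependent branching ratio could be estimated from an activity time series (an assumed plug-in estimator applied to a stand-in branching process, not the paper's definition of the branching function):

```python
# Minimal sketch (an assumed plug-in estimator, not the paper's analysis): estimate an
# activity-dependent branching ratio b(A) = E[A_{t+1} | A_t = A] / A by binning a
# recorded activity time series; b(A) crossing 1 marks where activity self-sustains.
import numpy as np

def branching_function(activity, n_bins=20, min_samples=10):
    a_now, a_next = activity[:-1], activity[1:]
    bins = np.linspace(activity.min(), activity.max() + 1e-9, n_bins + 1)
    idx = np.digitize(a_now, bins) - 1
    centers, ratios = [], []
    for b in range(n_bins):
        mask = (idx == b) & (a_now > 0)
        if mask.sum() >= min_samples:
            centers.append(a_now[mask].mean())
            ratios.append((a_next[mask] / a_now[mask]).mean())
    return np.array(centers), np.array(ratios)

# Stand-in activity: a subcritical branching-like process with a small external drive.
rng = np.random.default_rng(2)
activity = np.empty(20000)
activity[0] = 10.0
for t in range(len(activity) - 1):
    activity[t + 1] = rng.poisson(0.95 * activity[t] + 1.0)

A, b = branching_function(activity)
print(np.column_stack([A, b]))
```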
Most nervous systems encode information about stimuli in the responding activity of large neuronal networks. This activity often manifests itself as dynamically coordinated sequences of action potentials. Since multiple-electrode recordings are now a standard tool in neuroscience research, it is important to have a measure of such network-wide behavioral coordination and information sharing that is applicable to multiple neural spike train data. We propose a new statistic, informational coherence, which measures how much better one unit can be predicted by knowing the dynamical state of another. We argue that informational coherence is a measure of association and shared information which is superior to traditional pairwise measures of synchronization and correlation. To find the dynamical states, we use a recently introduced algorithm which reconstructs effective state spaces from stochastic time series. We then extend the pairwise measure to a multivariate analysis of the network by estimating the network multi-information. We illustrate our method by testing it on a detailed model of the transition from gamma to beta rhythms.
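As a crude, purely pairwise stand-in for this kind of information-sharing measure, a minimal sketch of a plug-in mutual-information estimate between two discretized signals (simple histogram binning, not the reconstructed dynamical states the abstract relies on):

```python
# Minimal sketch (a crude stand-in, not the state-space reconstruction described
# above): plug-in estimate of the mutual information between two discretized unit
# activities, the kind of pairwise quantity that informational coherence generalizes.
import numpy as np

def plugin_mutual_information(x, y, n_bins=8):
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    p_xy = joint / joint.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    nz = p_xy > 0
    return float((p_xy[nz] * np.log2(p_xy[nz] / np.outer(p_x, p_y)[nz])).sum())

# Usage with correlated stand-in signals:
rng = np.random.default_rng(3)
x = rng.standard_normal(5000)
y = 0.7 * x + 0.3 * rng.standard_normal(5000)
print(f"MI = {plugin_mutual_information(x, y):.2f} bits (plug-in estimate)")
```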