Coupling among neural rhythms is one of the most important mechanisms underlying cognitive processes in the brain. In this study we consider a neural mass model, rigorously derived from the microscopic dynamics of an inhibitory spiking network with exponential synapses, that is able to autonomously generate collective oscillations (COs). These oscillations emerge via a super-critical Hopf bifurcation, and their frequencies are controlled by the synaptic time scale, the synaptic coupling and the excitability of the neural population. Furthermore, we show that two inhibitory populations in a master-slave configuration with different synaptic time scales can display various collective dynamical regimes: damped oscillations towards a stable focus, periodic and quasi-periodic oscillations, and chaos. Finally, when bidirectionally coupled, the two inhibitory populations can exhibit different types of theta-gamma cross-frequency coupling (CFC), namely phase-phase and phase-amplitude CFC. The coupling between theta and gamma COs is enhanced in the presence of an external theta forcing, reminiscent of the type of modulation induced in hippocampal and cortical circuits via optogenetic drive.
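A minimal sketch of this type of exact firing-rate (neural-mass) reduction for a single inhibitory population of quadratic integrate-and-fire neurons with exponential synapses is given below; it is meant only to illustrate the class of equations, and all parameter names and values (tau_m, tau_d, eta, J, delta) are assumptions for illustration, not those of the paper.

import numpy as np

# Mean firing rate r, mean membrane potential v, synaptic variable s.
tau_m, tau_d = 15.0, 8.0          # membrane and synaptic time scales (ms), assumed
eta, J, delta = 5.0, 15.0, 0.3    # median excitability, inhibitory coupling, heterogeneity (assumed)

def rhs(r, v, s):
    drdt = (delta / (np.pi * tau_m) + 2.0 * r * v) / tau_m
    dvdt = (v**2 + eta - (np.pi * tau_m * r)**2 - J * tau_m * s) / tau_m
    dsdt = (-s + r) / tau_d       # exponential synapse filtering the population rate
    return drdt, dvdt, dsdt

dt, T = 0.01, 500.0
r, v, s = 0.1, -1.0, 0.0
rate = []
for _ in range(int(T / dt)):      # forward-Euler integration
    dr, dv, ds = rhs(r, v, s)
    r, v, s = r + dt * dr, v + dt * dv, s + dt * ds
    rate.append(r)
# For suitable (eta, J, tau_d) the rate r(t) settles onto a limit cycle, i.e.
# collective oscillations born at a super-critical Hopf bifurcation; their
# period shifts with the synaptic time scale tau_d.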
We study the stability and information-encoding capacity of synchronized states in a neuronal network model that represents part of the thalamic circuitry. Our model neurons have a Hodgkin-Huxley-type low-threshold calcium channel, display post-inhibitory rebound, and are connected via GABAergic inhibitory synapses. We find that there is a threshold in synaptic strength, $\tau_c$, below which there are no stable spiking network states. Above threshold the stable spiking state is a cluster state, in which different groups of neurons fire consecutively and each neuron fires with the same cluster each time. Weak noise destabilizes this state, but stronger noise drives the system into a different, self-organized, stochastically synchronized state. Neuronal firing is still organized in clusters, but individual neurons can hop from cluster to cluster. Noise can actually induce and sustain such a state below the threshold of synaptic strength. We do find a qualitative difference in the firing patterns between small ($\sim 10$ neurons) and large ($\sim 1000$ neurons) networks. We determine the information content of the spike trains in terms of two separate contributions: the spike-time jitter around cluster firing times, and the hopping from cluster to cluster. We quantify the information loss due to temporally correlated interspike intervals. Recent experiments on the locust olfactory system and striatal neurons suggest that the nervous system may actually use these two channels to encode separate and unique information.
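As a toy illustration of the two information channels identified above, the sketch below estimates (i) the discrete entropy of a neuron's cluster-label sequence and (ii) the differential entropy of Gaussian spike-time jitter, using synthetic data; the estimators, variable names and parameter values are assumptions for illustration and not the analysis actually performed in the paper.

import numpy as np

rng = np.random.default_rng(0)
n_cycles, n_clusters, sigma_jitter = 1000, 4, 0.5     # assumed values (jitter in ms)

cluster_seq = rng.integers(0, n_clusters, n_cycles)   # which cluster the neuron fires with
jitter = rng.normal(0.0, sigma_jitter, n_cycles)      # spike-time jitter around cluster times

# Channel 1: entropy of the cluster-hopping sequence (bits per firing cycle).
p = np.bincount(cluster_seq, minlength=n_clusters) / n_cycles
h_cluster = -np.sum(p[p > 0] * np.log2(p[p > 0]))

# Channel 2: differential entropy of the Gaussian jitter, 0.5*log2(2*pi*e*var).
h_jitter = 0.5 * np.log2(2.0 * np.pi * np.e * np.var(jitter))

print(f"cluster-hopping entropy ~ {h_cluster:.2f} bits/cycle, jitter entropy ~ {h_jitter:.2f} bits")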
In this paper we clarify the mechanisms underlying a general phenomenon present in pulse-coupled heterogeneous inhibitory networks: inhibition can induce not only suppression of the neural activity, as expected, but it can also promote neural reactivation. In particular, for globally coupled systems the number of firing neurons decreases monotonically upon increasing the strength of inhibition (neurons' death). However, random pruning of the connections is able to reverse the action of inhibition: in a sparse network a sufficiently strong synaptic coupling can surprisingly promote, rather than depress, the activity of the neurons (neurons' rebirth). The number of firing neurons therefore exhibits a minimum at some intermediate synaptic strength. We show that this minimum signals a transition from a regime dominated by the neurons with higher firing activity to a phase where all neurons are effectively sub-threshold and their irregular firing is driven by current fluctuations. We explain the origin of the transition by deriving an analytic mean-field formulation of the problem that provides the fraction of active neurons as well as the first two moments of their firing statistics. The introduction of a synaptic time scale does not modify the main aspects of the reported phenomenon. However, for sufficiently slow synapses the transition becomes dramatic: the system passes from a perfectly regular evolution to irregular bursting dynamics. In this latter regime the model provides predictions consistent with experimental findings for a specific class of neurons, namely the medium spiny neurons in the striatum.
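A hedged sketch of the kind of numerical experiment behind the death/rebirth scenario is given below: a pulse-coupled, heterogeneous, purely inhibitory leaky integrate-and-fire network on a sparse random graph, used to measure the fraction of neurons that fire at least once within a fixed time window as the inhibitory strength g is varied. The dynamics, connectivity rule and all parameter values are illustrative assumptions, not the paper's exact model.

import numpy as np

def active_fraction(g, N=200, K=20, T=1000.0, dt=0.1, seed=1):
    rng = np.random.default_rng(seed)
    a = rng.uniform(1.05, 1.5, N)                    # heterogeneous supra-threshold excitabilities
    C = (rng.random((N, N)) < K / N).astype(float)   # sparse random connectivity, in-degree ~ K
    np.fill_diagonal(C, 0.0)
    v = rng.random(N)                                # membrane potentials; threshold 1, reset 0
    fired_ever = np.zeros(N, dtype=bool)
    for _ in range(int(T / dt)):
        spikes = v >= 1.0
        fired_ever |= spikes
        v[spikes] = 0.0
        inh = C @ spikes.astype(float)               # delta-pulse inhibition from presynaptic spikes
        v += dt * (a - v) - (g / K) * inh
    return fired_ever.mean()

for g in (0.5, 2.0, 8.0):                            # weak, intermediate, strong inhibition
    print(f"g = {g}: fraction of active neurons = {active_fraction(g):.2f}")
# Scanning g more finely would reveal whether the active fraction passes through
# a minimum, the signature of the death-to-rebirth transition discussed above.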
Rhythmogenesis, which is critical for many biological functions, involves a transition to coherent activity through cell-cell communication. In the absence of centralized coordination by specialized cells (pacemakers), competing oscillating clusters impede the emergence of such global coherence. We show that spatial symmetry-breaking through a frequency gradient results in localized wave sources driving system-wide activity. Such gradients, arising through heterogeneous inter-cellular coupling, may explain directed rhythmic activity during labor in the uterus despite the absence of pacemakers.
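A toy version of this mechanism can be illustrated with nearest-neighbour phase oscillators on a chain whose natural frequencies decrease linearly along the chain: for sufficiently strong coupling the fast end entrains the whole system and acts as a localized wave source. The model, the coupling scheme and the parameter values below are illustrative assumptions.

import numpy as np

def run(theta, omega, K, dt, steps):
    for _ in range(steps):
        drive = np.zeros_like(theta)
        drive[:-1] += np.sin(theta[1:] - theta[:-1])   # pull towards right neighbour
        drive[1:] += np.sin(theta[:-1] - theta[1:])    # pull towards left neighbour
        theta = theta + dt * (omega + K * drive)
    return theta

N, K, dt = 100, 2.0, 0.01
omega = np.linspace(1.2, 1.0, N)                       # frequency gradient along the chain
theta0 = 2.0 * np.pi * np.random.default_rng(0).random(N)

theta1 = run(theta0, omega, K, dt, steps=50_000)       # discard transient
theta2 = run(theta1, omega, K, dt, steps=10_000)
freq_eff = (theta2 - theta1) / (10_000 * dt)           # effective frequencies
# When K is large enough, freq_eff is flat (global locking) and the phase profile
# forms a wave travelling away from the high-frequency end of the chain.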
Maximum Entropy models can be inferred from large data sets to uncover how collective dynamics emerge from local interactions. Here, such models are employed to investigate neurons recorded by multielectrode arrays in the human and monkey cortex. Taking advantage of the separation of excitatory and inhibitory neuron types, we construct a model that includes this distinction. This approach allows us to shed light on differences between excitatory and inhibitory activity across brain states such as wakefulness and deep sleep, in agreement with previous findings. Additionally, Maximum Entropy models can unveil novel features of neuronal interactions, which are found to be dominated by pairwise interactions during wakefulness but to be population-wide during deep sleep. In particular, inhibitory neurons are observed to be strongly tuned to the inhibitory population. Overall, we demonstrate that Maximum Entropy models can be useful for analyzing data sets with classified neuron types, and for revealing the respective roles of excitatory and inhibitory neurons in organizing coherent dynamics in the cerebral cortex.
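The sketch below shows how a pairwise Maximum Entropy (Ising-like) model can be fitted to binarized activity by exact gradient ascent on the log-likelihood, which is tractable for a handful of units; the synthetic data, the number of units and the learning settings are assumptions for illustration, not the inference pipeline used in the study.

import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N, T = 5, 5000
data = (rng.random((T, N)) < 0.2).astype(float)       # stand-in for binarized spike trains (0/1)
emp_m = data.mean(0)                                  # empirical firing probabilities
emp_C = data.T @ data / T                             # empirical pairwise moments

states = np.array(list(product([0.0, 1.0], repeat=N)))  # all 2^N activity patterns

h = np.zeros(N)                                       # fields
J = np.zeros((N, N))                                  # symmetric couplings, zero diagonal
lr = 0.1
for _ in range(2000):                                 # gradient ascent: match means and correlations
    E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    model_m = p @ states
    model_C = states.T @ (states * p[:, None])
    h += lr * (emp_m - model_m)
    dJ = emp_C - model_C
    np.fill_diagonal(dJ, 0.0)
    J += lr * (dJ + dJ.T) / 2.0
# At convergence the model reproduces the measured rates and pairwise correlations;
# with E/I labels available, fields and couplings can then be compared across types.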
This paper develops results for the next-nearest-neighbour Ising model on random graphs. Besides being an essential ingredient in classic models of frustrated systems, second-neighbour interactions arise naturally in several applications, such as the colour diversity problem and graphical games. We demonstrate ensembles of random graphs, including regular-connectivity graphs, that exhibit a periodic variation of the free energy with either the ratio of nearest to next-nearest couplings or the mean number of nearest neighbours. When the coupling ratio is an integer, paramagnetic phases can be found at zero temperature; this is shown to be related to the locked or unlocked nature of the interactions. For anti-ferromagnetic couplings, spin-glass phases are demonstrated at low temperature. The interaction structure is formulated as a factor graph, and the solution on a tree is developed. The replica-symmetric and the energetic one-step replica-symmetry-breaking solutions are developed using the cavity method. Within these frameworks we calculate the phase diagram and demonstrate the existence of dynamical transitions at zero temperature for cases of anti-ferromagnetic coupling on regular and inhomogeneous random graphs.
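As an illustration of the replica-symmetric cavity approach on this kind of interaction structure, the sketch below runs (damped) loopy belief propagation for an Ising model whose couplings combine the nearest-neighbour (J1) and next-nearest-neighbour (J2) pairs of a random regular graph; the graph size, the coupling values and the temperature are illustrative assumptions, not the ensembles analysed in the paper.

import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.random_regular_graph(3, 200, seed=0)           # random regular graph of degree 3
J1, J2, beta = 1.0, 0.3, 0.5                          # NN / NNN couplings, inverse temperature

# Combined coupling dictionary: distance-1 edges get J1, distance-2 pairs get J2
# (pairs that happen to be direct edges keep J1).
J = {tuple(sorted(e)): J1 for e in G.edges()}
for i in G:
    for a in G[i]:
        for b in G[a]:
            if b != i:
                J.setdefault(tuple(sorted((i, b))), J2)

neigh = {i: set() for i in G}
for (i, j) in J:
    neigh[i].add(j)
    neigh[j].add(i)

def u(Jij, hin):                                      # cavity bias transmitted along one link
    return np.arctanh(np.tanh(beta * Jij) * np.tanh(beta * hin)) / beta

h = {(i, j): rng.normal(0.0, 0.1) for i in neigh for j in neigh[i]}   # cavity fields i -> j
for _ in range(200):                                  # damped message-passing sweeps
    new = {}
    for (i, j) in h:
        total = sum(u(J[tuple(sorted((k, i)))], h[(k, i)]) for k in neigh[i] if k != j)
        new[(i, j)] = 0.5 * h[(i, j)] + 0.5 * total
    h = new

m = [np.tanh(beta * sum(u(J[tuple(sorted((k, i)))], h[(k, i)]) for k in neigh[i])) for i in neigh]
print("mean magnetisation:", float(np.mean(m)))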