
Synchronization in random balanced networks

Posted by: Luis Carlos García del Molino
Publication date: 2013
Research field: Biological physics
Paper language: English





Characterizing the influence of network properties on the global emergent behavior of interacting elements constitutes a central question in many areas, from the physical to the social sciences. In this article we study a primary model of disordered neuronal networks with excitatory-inhibitory structure and balance constraints. We show how the interplay between structure and disorder in the connectivity leads to a universal transition from trivial to synchronized stationary or periodic states. This transition cannot be explained through the analysis of the spectral density of the connectivity matrix alone. We provide a low-dimensional approximation that exposes the roles of both structure and disorder in the dynamics.
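As a rough illustration of the kind of model discussed above, the following Python sketch builds a random connectivity matrix with an excitatory-inhibitory block structure, i.i.d. Gaussian disorder, and a zero-row-sum balance constraint, then integrates simple rate dynamics for two disorder strengths. The tanh nonlinearity, the parameter names, and the synchrony read-out are all assumptions made for illustration, not the authors' exact model.

```python
import numpy as np

def balanced_connectivity(n, frac_exc=0.8, mu=1.0, sigma=1.0, rng=None):
    """Assumed E-I block structure: excitatory columns have positive mean,
    inhibitory columns a compensating negative mean, plus Gaussian disorder.
    Each row is shifted so its sum is exactly zero (a simple balance constraint)."""
    rng = np.random.default_rng(rng)
    n_exc = int(frac_exc * n)
    means = np.empty(n)
    means[:n_exc] = mu / n                                   # excitatory columns
    means[n_exc:] = -mu * frac_exc / ((1 - frac_exc) * n)    # inhibitory columns
    J = means[None, :] + sigma * rng.standard_normal((n, n)) / np.sqrt(n)
    J -= J.mean(axis=1, keepdims=True)                       # zero row sums
    return J

def simulate(J, t_max=200.0, dt=0.1, rng=None):
    """Euler integration of rate dynamics dx/dt = -x + J tanh(x) (assumed form)."""
    rng = np.random.default_rng(rng)
    x = 0.1 * rng.standard_normal(J.shape[0])
    traj = []
    for _ in range(int(t_max / dt)):
        x += dt * (-x + J @ np.tanh(x))
        traj.append(x.copy())
    return np.array(traj)

if __name__ == "__main__":
    for sigma in (0.5, 1.5):          # weak vs. strong disorder
        J = balanced_connectivity(500, sigma=sigma, rng=0)
        traj = simulate(J, rng=1)
        # temporal fluctuation of the population-averaged rate, a crude synchrony read-out
        pop_rate = np.tanh(traj[-500:]).mean(axis=1)
        print(f"sigma={sigma}: std of mean rate = {pop_rate.std():.4f}")
```

For weak disorder the activity decays to the trivial state, while stronger disorder pushes the network into fluctuating collective activity; this only mimics, in spirit, the transition discussed in the abstract.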




Read also

We investigated the influence of the efficacy of synaptic interaction on firing synchronization in excitatory neuronal networks. We found a spike death phenomenon, in which the state of neurons transits from a limit cycle to a fixed point or a transient state. The phenomenon occurs under the perturbation of an excitatory synaptic interaction with high efficacy. We showed that the decrease of synaptic current results in spike death by depressing the feedback of the sodium ionic current. In networks with the spike death property, the degree of synchronization is lower and insensitive to the heterogeneity of neurons. The mechanism of this influence is that the transition of the neuron state disrupts the adjustment of the rhythm of neuron oscillation and prevents a further increase of firing synchronization.
We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in the synchronization and input processing of leaky-integrate-and-fire neural networks with short-term synaptic plasticity. We take advantage of a heterogeneous mean-field approximation to encode the role of the network structure, and we tune the fraction of inhibitory neurons $f_I$ and their connectivity level to investigate the cooperation between hub features and inhibition. We show that, depending on $f_I$, highly connected inhibitory nodes strongly drive the synchronization properties of the overall network through dynamical transitions from synchronous to asynchronous regimes. Furthermore, a metastable regime with long memory of external inputs emerges for a specific fraction of hub inhibitory neurons, underlining the role of inhibition and connectivity also for input processing in neural networks.
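To make the role of the inhibitory fraction concrete, here is a hedged Python sketch of a current-based leaky-integrate-and-fire network in which a fraction f_I of the neurons is inhibitory; it reports a crude synchrony index for two values of f_I. Short-term plasticity, the hub structure, and the mean-field treatment studied in the paper are deliberately omitted, and all parameter values are assumptions.

```python
import numpy as np

def simulate_lif(n=400, f_i=0.2, g=4.0, j=0.2, p=0.1, t_max=1.0,
                 dt=1e-4, tau=0.02, v_th=1.0, v_reset=0.0, i_ext=1.3, rng=0):
    """Current-based LIF network with random connectivity of density p;
    a fraction f_i of presynaptic neurons is inhibitory with weight -g*j."""
    rng = np.random.default_rng(rng)
    n_i = int(f_i * n)
    sign = np.ones(n)
    sign[:n_i] = -g
    W = (rng.random((n, n)) < p) * j * sign[None, :]
    v = rng.random(n) * v_th
    spikes = np.zeros((int(t_max / dt), n), dtype=bool)
    for t in range(spikes.shape[0]):
        fired = v >= v_th
        spikes[t] = fired
        v[fired] = v_reset
        v += dt / tau * (-v + i_ext) + W @ fired      # leak, drive, delta-pulse input
    return spikes

def synchrony_index(spikes, bin_steps=20):
    """Variance of the binned population rate, normalized by the mean
    single-neuron variance (larger means more synchronous)."""
    binned = spikes.reshape(-1, bin_steps, spikes.shape[1]).sum(axis=1)
    return binned.mean(axis=1).var() / binned.var(axis=0).mean()

if __name__ == "__main__":
    for f_i in (0.1, 0.3):
        s = simulate_lif(f_i=f_i)
        print(f"f_I={f_i}: synchrony index = {synchrony_index(s):.3f}")
```

With few inhibitory neurons the excitation dominates and the network fires in a more synchronous fashion; raising f_I pushes it toward a more asynchronous regime, loosely echoing the transitions described above.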
Cortical neural circuits display highly irregular spiking in individual neurons but variably sized collective firing, oscillations and critical avalanches at the population level, all of which have functional importance for information processing. Theoretically, the balance of excitatory and inhibitory inputs is thought to account for spiking irregularity, and critical avalanches may originate from an underlying phase transition. However, the theoretical reconciliation of these multilevel dynamic aspects in neural circuits remains an open question. Herein, we study an excitation-inhibition (E-I) balanced neuronal network with biologically realistic synaptic kinetics. It can maintain irregular spiking dynamics with different levels of synchrony, and critical avalanches emerge near the synchronous transition point. We propose a novel semi-analytical mean-field theory to derive the field equations governing the network's macroscopic dynamics. It reveals that the E-I balanced state of the network manifesting irregular individual spiking is characterized by a macroscopic stable state, which can be either a fixed point or a periodic motion, and the transition is predicted by a Hopf bifurcation in the macroscopic field. Furthermore, by analyzing public data, we find the coexistence of irregular spiking and critical avalanches in the spontaneous spiking activities of mouse cortical slices in vitro, indicating the universality of the observed phenomena. Our theory unveils the mechanism that permits complex neural activities on different spatiotemporal scales to coexist and elucidates a possible origin of the criticality of neural systems. It also provides a novel tool for analyzing the macroscopic dynamics of E-I balanced networks and their relationship to the microscopic counterparts, which can be useful for large-scale modeling and computation of cortical dynamics.
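The avalanche analysis mentioned above typically uses the standard operational definition of an avalanche as a run of non-empty time bins bounded by empty bins. The sketch below illustrates only that bookkeeping on synthetic Poisson counts; it is not the paper's analysis pipeline, and the synthetic input is a placeholder for real binned spike data.

```python
import numpy as np

def avalanche_sizes(counts):
    """Sum spike counts over contiguous runs of non-empty bins;
    each run delimited by empty bins is one avalanche."""
    sizes, current = [], 0
    for c in counts:
        if c > 0:
            current += c
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return np.array(sizes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    counts = rng.poisson(0.9, size=100_000)     # synthetic stand-in for binned population counts
    sizes = avalanche_sizes(counts)
    print(f"{len(sizes)} avalanches, mean size {sizes.mean():.2f}, max {sizes.max()}")
```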
While most models of randomly connected networks assume nodes with simple dynamics, nodes in realistic highly connected networks, such as neurons in the brain, exhibit intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of nodes (such as single neurons) and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate units shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single units. For the case of two-dimensional rate units with strong adaptation, we find that the network exhibits a state of resonant chaos, characterized by robust, narrow-band stochastic oscillations. The coherence of the stochastic oscillations is maximal at the onset of chaos, and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated units, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single nodes, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic network models in the brain, or, more generally, networks of other physical or man-made complex dynamical units.
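A minimal sketch, under assumed equations and parameters rather than the paper's exact model, of two-variable rate units (a fast activity variable plus a slow adaptation variable) coupled through a random Gaussian matrix; the power spectrum of a single unit gives a rough read-out of the narrow-band fluctuations described above.

```python
import numpy as np

def simulate_adaptive_network(n=300, g=2.0, beta=1.0, tau_a=5.0,
                              t_max=500.0, dt=0.05, rng=0):
    """Euler integration of dx/dt = -x - beta*a + J tanh(x),
    tau_a da/dt = -a + x, with J a g/sqrt(n)-scaled Gaussian matrix (assumed form)."""
    rng = np.random.default_rng(rng)
    J = g * rng.standard_normal((n, n)) / np.sqrt(n)
    x = 0.1 * rng.standard_normal(n)     # fast activity variable
    a = np.zeros(n)                      # slow adaptation variable
    xs = []
    for _ in range(int(t_max / dt)):
        x += dt * (-x - beta * a + J @ np.tanh(x))
        a += dt / tau_a * (-a + x)
        xs.append(x[0])                  # record one unit
    return np.array(xs), dt

if __name__ == "__main__":
    x0, dt = simulate_adaptive_network()
    x0 = x0[len(x0) // 2:]                           # discard the transient
    spec = np.abs(np.fft.rfft(x0 - x0.mean())) ** 2  # single-unit power spectrum
    freqs = np.fft.rfftfreq(len(x0), d=dt)
    print(f"peak frequency ~ {freqs[spec.argmax()]:.3f} (1/time units)")
```

The adaptation feedback turns the zero-frequency instability of the standard random rate network into an oscillatory one, so the single-unit spectrum develops a peak at a nonzero frequency set largely by the single-unit parameters, loosely illustrating the resonant-chaos picture.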
We investigate the dynamics of two models of biological networks with purely suppressive interactions between the units: species interacting via niche competition and neurons via inhibitory synaptic coupling. In both of these cases, power-law scaling of the density of states with probability arises without any fine-tuning of the model parameters. These results argue against the increasingly popular notion that non-equilibrium living systems operate at special critical points, driven there by evolution so as to enable adaptive processing of input data.