
How single neuron properties shape chaotic dynamics and signal transmission in random neural networks

Published by Samuel Muscinelli
Publication date: 2018
Research field: Biological Physics
Paper language: English





While most models of randomly connected networks assume nodes with simple dynamics, nodes in realistic highly connected networks, such as neurons in the brain, exhibit intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of nodes (such as single neurons) and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate units shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single units. For the case of two-dimensional rate units with strong adaptation, we find that the network exhibits a state of resonant chaos, characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated units, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single nodes, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic network models in the brain, or, more generally, networks of other physical or man-made complex dynamical units.
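As a concrete, deliberately simplified illustration of the setting described above, the following Python sketch simulates a random network of two-dimensional rate units, each consisting of an activation variable coupled to a slower adaptation variable, and estimates the power spectrum of a single unit. The dynamical equations, the transfer function, and all parameter values (g, beta, tau_x, tau_a, etc.) are illustrative assumptions and not the exact model or parameters analyzed in the paper.

# Minimal sketch of a random network of two-dimensional rate units with adaptation.
# The equations and parameters below are illustrative assumptions, not the paper's
# exact model; the goal is only to show how the single-unit power spectrum can be
# probed numerically in the strongly coupled regime.
import numpy as np

rng = np.random.default_rng(0)

N = 500         # number of units
g = 2.0         # recurrent coupling strength (assumed above the chaotic onset)
beta = 1.0      # adaptation strength (assumed)
tau_x = 1.0     # activation timescale
tau_a = 5.0     # adaptation timescale, slower than tau_x (assumed)
dt = 0.05       # Euler integration step
steps = 4000    # total number of integration steps

# Random Gaussian connectivity with zero mean and variance g^2 / N
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

x = rng.normal(0.0, 0.5, N)   # activation variable of each unit
a = np.zeros(N)               # adaptation variable of each unit
phi = np.tanh                 # saturating transfer function

trace = np.empty(steps)
for t in range(steps):
    recurrent = J @ phi(x)                      # recurrent input to every unit
    x += dt * (-x - beta * a + recurrent) / tau_x
    a += dt * (-a + x) / tau_a                  # adaptation tracks the activation
    trace[t] = x[0]

# Power spectrum of one unit after discarding a transient; in a resonant-chaos
# regime it would show a pronounced narrow-band peak.
sig = trace[steps // 2:] - trace[steps // 2:].mean()
freqs = np.fft.rfftfreq(sig.size, d=dt)
power = np.abs(np.fft.rfft(sig)) ** 2
print("dominant frequency (excluding DC):", freqs[np.argmax(power[1:]) + 1])

According to the abstract, in the chaotic phase such a spectrum is a nonlinearly sharpened version of the single-unit frequency response, so the location of the peak can be anticipated from the properties of an isolated adaptive unit; whether the peak is actually narrow depends on the parameter regime, which is not guaranteed by the toy values chosen here.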




Read also

We investigate the dynamics of continuous attractor neural networks (CANNs). Due to the translational invariance of their neuronal interactions, CANNs can hold a continuous family of stationary states. We systematically explore how their neutral stability facilitates the tracking performance of a CANN, which is believed to have wide applications in brain functions. We develop a perturbative approach that utilizes the dominant movement of the network stationary states in the state space. We quantify the distortions of the bump shape during tracking, and study their effects on the tracking performance. Results are obtained on the maximum speed for a moving stimulus to be trackable, and the reaction time to catch up with an abrupt change in stimulus.
Characterizing the influence of network properties on the global emerging behavior of interacting elements constitutes a central question in many areas, from physical to social sciences. In this article we study a primary model of disordered neuronal networks with excitatory-inhibitory structure and balance constraints. We show how the interplay between structure and disorder in the connectivity leads to a universal transition from trivial to synchronized stationary or periodic states. This transition cannot be explained only through the analysis of the spectral density of the connectivity matrix. We provide a low-dimensional approximation that shows the role of both the structure and disorder in the dynamics.
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially on synapses from excitatory pyramidal cells, in hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex). Learning is assumed to occur in a phase of neural plasticity, in which the network is clamped to external teaching signals. By suitable manipulation of the nonlinearity of the neurons or of the oscillation frequencies during learning, the model can be made, in a retrieval phase, either to categorize new inputs or to map them, in a continuous fashion, onto the space spanned by the imprinted patterns. We identify the first of these possibilities with the function of olfactory cortex and the second with the observed response characteristics of place cells in hippocampus. We investigate both kinds of networks analytically and by computer simulations, and we link the models with experimental findings, exploring, in particular, how the spike timing dependence of the synaptic plasticity constrains the computational function of the network and vice versa.
We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in the synchronization and input processing of leaky integrate-and-fire neural networks with short-term synaptic plasticity. We take advantage of a heterogeneous mean-field approximation to encode the role of the network structure, and we tune the fraction of inhibitory neurons $f_I$ and their connectivity level to investigate the cooperation between hub features and inhibition. We show that, depending on $f_I$, highly connected inhibitory nodes strongly drive the synchronization properties of the overall network through dynamical transitions from synchronous to asynchronous regimes. Furthermore, a metastable regime with long memory of external inputs emerges for a specific fraction of hub inhibitory neurons, underlining the role of inhibition and connectivity also for input processing in neural networks.
We study a network of spiking neurons with heterogeneous excitabilities connected via inhibitory delayed pulses. For globally coupled systems, increasing the inhibitory coupling reduces the number of firing neurons through a Winner-Takes-All mechanism. For sufficiently large transmission delays we observe the emergence of collective oscillations in the system beyond a critical coupling value. Heterogeneity promotes neural inactivation and asynchronous dynamics, and its effect can be counteracted by considering longer time delays. In sparse networks, inhibition has the counterintuitive effect of promoting the reactivation of silent neurons for sufficiently large coupling. In this regime, current fluctuations are responsible on the one hand for the firing of sub-threshold neurons and on the other hand for their desynchronization. Therefore, collective oscillations are present only in a limited range of coupling values, which remains finite in the thermodynamic limit. Outside this range the dynamics is asynchronous, and for very large inhibition neurons display a bursting behaviour, alternating periods of silence with periods in which they fire freely in the absence of any inhibition.
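
To make the last model above more tangible, here is a minimal Python sketch of globally coupled leaky integrate-and-fire neurons with heterogeneous excitabilities and delayed, global inhibitory pulse coupling. The specific equations, the delay implementation, and every parameter value are simplifying assumptions introduced for illustration only, not the exact model studied in that work.

# Minimal sketch: globally coupled LIF neurons with heterogeneous excitabilities
# and delayed inhibitory pulse coupling. Equations and parameters are illustrative
# assumptions, not the exact model of the paper summarized above.
import numpy as np

rng = np.random.default_rng(1)

N = 200              # number of neurons
g = 0.4              # inhibitory coupling strength (assumed)
delay = 0.1          # pulse transmission delay (assumed)
dt = 0.001
steps = int(20.0 / dt)
delay_steps = int(delay / dt)

a = rng.uniform(1.05, 1.3, N)   # heterogeneous suprathreshold excitabilities
v = rng.uniform(0.0, 1.0, N)    # membrane potentials, threshold 1, reset 0

# Ring buffer storing the number of spikes emitted at each past step, so that
# every spike inhibits the whole population after the transmission delay.
spike_buffer = np.zeros(delay_steps + 1)
spike_counts = []

for t in range(steps):
    delayed_spikes = spike_buffer[t % (delay_steps + 1)]
    # Leaky integration with constant drive and delayed global inhibitory kick
    v += dt * (a - v) - g * delayed_spikes / N
    fired = v >= 1.0
    n_fired = int(fired.sum())
    v[fired] = 0.0
    spike_buffer[(t + delay_steps) % (delay_steps + 1)] = n_fired
    spike_counts.append(n_fired)

# Binned population rate: strong peaks indicate collective oscillations,
# a flat noisy rate indicates asynchronous dynamics.
rate = np.array(spike_counts).reshape(-1, 100).sum(axis=1)  # spikes per 0.1 time-unit bin
print("population rate per bin: mean %.1f, max %.1f" % (rate.mean(), rate.max()))

Pronounced peaks in the binned population rate correspond to the collective oscillations described in the abstract, whereas a flat, noisy rate corresponds to the asynchronous regime; sparse connectivity and the resulting current fluctuations are not modeled in this global-coupling sketch.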