
Complex synchronous behavior in interneuronal networks with delayed inhibitory and fast electrical synapses

Added by Matjaz Perc
Publication date: 2012
Fields: Biology, Physics
Language: English





Networks of fast-spiking interneurons are crucial for the generation of neural oscillations in the brain. Here we study the synchronous behavior of interneuronal networks that are coupled by delayed inhibitory and fast electrical synapses. We find that both coupling modes play a crucial role in the synchronization of the network. In addition, delayed inhibitory synapses affect the emerging oscillatory patterns: by increasing the inhibitory synaptic delay, we observe a transition from regular to mixed oscillatory patterns at a critical value. We also examine how the unreliability of inhibitory synapses influences the emergence of synchronization and the oscillatory patterns. We find that low levels of reliability tend to destroy synchronization, and moreover, that interneuronal networks with long inhibitory synaptic delays require a minimal level of reliability for the mixed oscillatory pattern to be maintained.
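The setup described above can be illustrated with a minimal simulation: leaky integrate-and-fire interneurons (a simpler stand-in for the paper's fast-spiking neuron model) receiving globally pooled inhibitory pulses after a fixed delay, plus instantaneous diffusive electrical coupling. All parameter values below are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20              # interneurons
dt, T = 0.1, 500.0  # ms
tau = 10.0          # ms, membrane time constant
I_ext = 1.5         # suprathreshold drive (dimensionless voltage units)
g_inh = 0.01        # inhibitory pulse amplitude per presynaptic spike
g_el = 0.02         # electrical (diffusive) coupling strength
delay = int(2.0 / dt)   # 2 ms inhibitory synaptic delay, in steps

steps = int(T / dt)
v = rng.uniform(0.0, 1.0, N)          # random initial conditions
spikes = np.zeros((steps, N), bool)
v_trace = np.zeros((steps, N))

for t in range(steps):
    # inhibition arrives `delay` steps after each presynaptic spike
    inh = g_inh * spikes[t - delay].sum() if t >= delay else 0.0
    # electrical synapses pull each neuron toward the population mean
    v += dt / tau * (I_ext - v) + g_el * (v.mean() - v) - inh
    fired = v >= 1.0
    spikes[t] = fired
    v[fired] = 0.0
    v_trace[t] = v

spike_count = int(spikes.sum())
# Golomb-style synchrony measure: variance of the mean voltage relative
# to the mean single-neuron variance (1 = full synchrony, ~0 = asynchrony)
chi2 = v_trace.mean(axis=1).var() / v_trace.var(axis=0).mean()
print(spike_count, round(float(chi2), 3))
```

Sweeping `delay` and `g_inh` in such a sketch is the kind of experiment the abstract summarizes, though the paper's conclusions rest on its own neuron and synapse models.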



Related research

It is widely appreciated that well-balanced excitation and inhibition are necessary for proper function in neural networks. However, in principle, such balance could be achieved by many possible configurations of excitatory and inhibitory strengths, and relative numbers of excitatory and inhibitory neurons. For instance, a given level of excitation could be balanced by either numerous inhibitory neurons with weak synapses, or few inhibitory neurons with strong synapses. Among the continuum of different but balanced configurations, why should any particular configuration be favored? Here we address this question in the context of the entropy of network dynamics by studying an analytically tractable network of binary neurons. We find that entropy is highest at the boundary between excitation-dominant and inhibition-dominant regimes. Entropy also varies along this boundary with a trade-off between high and robust entropy: weak synapse strengths yield high network entropy which is fragile to parameter variations, while strong synapse strengths yield a lower, but more robust, network entropy. In the case where inhibitory and excitatory synapses are constrained to have similar strength, we find that a small, but non-zero fraction of inhibitory neurons, like that seen in mammalian cortex, results in robust and relatively high entropy.
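As an illustration of the kind of quantity this abstract studies, the sketch below estimates the entropy of the state distribution of a small binary-neuron network under Glauber dynamics, for an assumed excitatory/inhibitory split and assumed synaptic strengths. It is not the paper's analytically tractable model; every parameter here is an illustrative assumption.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)

N = 8                      # binary neurons, states in {0, 1}
frac_inh = 0.25            # assumed fraction of inhibitory neurons
w_exc, w_inh = 0.5, -1.5   # assumed synaptic strengths
n_inh = int(frac_inh * N)

# all-to-all random weights; the sign of a column is fixed by whether the
# presynaptic neuron is inhibitory or excitatory
signs = np.array([w_inh] * n_inh + [w_exc] * (N - n_inh))
W = rng.uniform(0.5, 1.5, (N, N)) * signs[None, :]
np.fill_diagonal(W, 0.0)

def glauber_sample(W, beta=1.0, steps=20000, burn=2000):
    """Asynchronous Glauber dynamics: flip one neuron at a time with a
    logistic probability set by its local field."""
    n = W.shape[0]
    s = rng.integers(0, 2, n)
    states = []
    for t in range(steps):
        i = rng.integers(n)
        h = W[i] @ s                  # local field on neuron i
        s[i] = rng.random() < 1.0 / (1.0 + np.exp(-beta * h))
        if t >= burn:
            states.append(s.tobytes())
    return states

counts = Counter(glauber_sample(W))
p = np.array(list(counts.values()), float)
p /= p.sum()                           # empirical state distribution
entropy_bits = float(-(p * np.log2(p)).sum())
print(round(entropy_bits, 2))
```

Repeating this estimate while trading off `frac_inh` against `w_inh` at fixed balance is a numerical analogue of the configuration sweep the abstract describes.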
Conventionally, information is represented by spike rates in the neural system. Here, we consider the ability of temporally modulated activities in neuronal networks to carry information beyond spike rates. These temporal modulations, commonly known as population spikes, arise from synaptic depression in a neuronal network model. We discuss their relevance to an experiment on transparent motion in macaque monkeys by Treue et al. in 2000, who found that if the moving directions of objects are too close, the firing rate profile is very similar to that for a single direction. When the difference in the moving directions is large enough, the network responds in a way that enhances the resolution of the objects' moving directions. In this paper, we propose that this behavior can be reproduced by neural networks with dynamical synapses when there are multiple external inputs. We demonstrate how resolution enhancement can be achieved, and discuss the conditions under which temporally modulated activities can enhance information processing performance in general.
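The "dynamical synapses" this abstract invokes can be sketched with a Tsodyks-Markram-style depressing synapse: each presynaptic spike consumes a fraction U of the available resource x (the transmitted efficacy), and x recovers toward 1 with time constant tau_rec. The parameters and spike train below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def depressing_synapse(spike_times, tau_rec=0.5, U=0.4, dt=1e-3, T=2.0):
    """Track the fraction x of available synaptic resources over time;
    return the efficacy U*x transmitted at each presynaptic spike."""
    steps = int(T / dt)
    spikes = np.zeros(steps, bool)
    spikes[(np.asarray(spike_times) / dt).round().astype(int)] = True
    x = 1.0
    efficacy = []
    for t in range(steps):
        x += dt * (1.0 - x) / tau_rec  # recovery toward full resources
        if spikes[t]:
            efficacy.append(U * x)
            x -= U * x                 # resources consumed by the spike
    return efficacy

# a regular 20 Hz train depresses: successive efficacies shrink
eff = depressing_synapse(np.linspace(0.05, 0.95, 19))
print(eff[0] > eff[-1])
```

In a recurrent network, this activity-dependent weakening is what lets sustained population activity collapse and restart, producing the population spikes the abstract discusses.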
Collective oscillations and their suppression by external stimulation are analyzed in a large-scale neural network consisting of two interacting populations of excitatory and inhibitory quadratic integrate-and-fire neurons. In the limit of an infinite number of neurons, the microscopic model of this network can be reduced to an exact low-dimensional system of mean-field equations. Bifurcation analysis of these equations reveals three different dynamic modes in a free network: a stable resting state, a stable limit cycle, and bistability with a coexisting resting state and a limit cycle. We show that in the limit cycle mode, high-frequency stimulation of an inhibitory population can stabilize an unstable resting state and effectively suppress collective oscillations. We also show that in the bistable mode, the dynamics of the network can be switched from a stable limit cycle to a stable resting state by applying an inhibitory pulse to the excitatory population. The results obtained from the mean-field equations are confirmed by numerical simulation of the microscopic model.
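An exact low-dimensional reduction of the kind mentioned above exists for quadratic integrate-and-fire populations with Lorentzian-distributed excitabilities (Montbrio, Pazo and Roxin, 2015). The sketch below integrates the single-population version of those mean-field equations with assumed parameters, rather than the paper's two-population excitatory-inhibitory network.

```python
import numpy as np

def qif_mean_field(eta_bar=1.0, delta=1.0, J=2.0, tau=1.0,
                   r0=0.5, v0=-0.5, dt=1e-3, T=40.0):
    """Euler-integrate the exact mean-field equations for the firing
    rate r(t) and mean membrane potential v(t) of a QIF population
    with mean excitability eta_bar, heterogeneity delta, coupling J."""
    r, v = r0, v0
    rates = []
    for _ in range(int(T / dt)):
        dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
        dv = (v * v + eta_bar + J * tau * r - (np.pi * tau * r) ** 2) / tau
        r += dt * dr
        v += dt * dv
        rates.append(r)
    return np.array(rates)

rates = qif_mean_field()
print(round(float(rates[-1]), 3))  # settles to a stable resting rate
```

With two such populations coupled to each other, bifurcation analysis of the resulting four-dimensional system can reveal the resting, limit-cycle, and bistable regimes the abstract reports.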
A great deal of research has been devoted to the investigation of neural dynamics in various network topologies. However, only a few studies have focused on the influence of autapses, synapses from a neuron onto itself via closed loops, on neural synchronisation. Here, we build a random network of adaptive exponential integrate-and-fire neurons coupled with chemical synapses and equipped with autapses, to study the effect of the latter on synchronous behaviour. We consider time delay in the conductance of the pre-synaptic neuron for excitatory and inhibitory connections. Interestingly, in neural networks consisting of both excitatory and inhibitory neurons, we uncover that synchronous behaviour depends on the type of synapse involved. Our results provide evidence on the synchronous and desynchronous activities that emerge in random neural networks with chemical, inhibitory and excitatory synapses where neurons are equipped with autapses.
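One building block of such a model can be sketched directly: a single adaptive exponential integrate-and-fire (AdEx) neuron with a delayed inhibitory autapse, i.e. a conductance-based synapse driven by the neuron's own past spikes. The parameter values are generic AdEx-style assumptions, not the paper's.

```python
import numpy as np

C, gL, EL = 200.0, 10.0, -70.0            # pF, nS, mV
VT, DT = -50.0, 2.0                        # mV, exponential threshold/slope
Vpeak, Vreset = 0.0, -58.0                 # mV, spike cut-off and reset
a, b, tau_w = 2.0, 60.0, 120.0             # nS, pA, ms (adaptation)
g_aut, E_syn, tau_s = 5.0, -80.0, 5.0      # nS, mV, ms (inhibitory autapse)
I_ext = 500.0                              # pA, constant drive
dt, T, delay_ms = 0.05, 500.0, 3.0         # ms

steps, delay = int(T / dt), int(delay_ms / dt)
v, w, s = EL, 0.0, 0.0
fired = np.zeros(steps, bool)
n_spikes = 0

for t in range(steps):
    s += dt * (-s / tau_s)                 # autaptic gate decays...
    if t >= delay and fired[t - delay]:
        s += 1.0                           # ...and is kicked by the neuron's
                                           # own spike, `delay` steps back
    I_syn = g_aut * s * (E_syn - v)        # inhibitory autaptic current
    dv = (gL * (EL - v) + gL * DT * np.exp((v - VT) / DT)
          - w + I_ext + I_syn) / C
    dw = (a * (v - EL) - w) / tau_w
    v += dt * dv
    w += dt * dw
    if v >= Vpeak:                         # spike: reset and adapt
        v, w = Vreset, w + b
        fired[t] = True
        n_spikes += 1

print(n_spikes)
```

Embedding many such units in a random network with delayed chemical synapses between them, as the abstract describes, is then a matter of pooling the synaptic gates across connections.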
In this paper, we clarify the mechanisms underlying a general phenomenon present in pulse-coupled heterogeneous inhibitory networks: inhibition can induce not only suppression of the neural activity, as expected, but also promote neural reactivation. In particular, for globally coupled systems, the number of firing neurons monotonically decreases upon increasing the strength of inhibition (neurons' death). However, random pruning of the connections is able to reverse the action of inhibition, i.e. in a sparse network a sufficiently strong synaptic strength can surprisingly promote, rather than depress, the activity of the neurons (neurons' rebirth). Thus the number of firing neurons exhibits a minimum at some intermediate synaptic strength. We show that this minimum signals a transition from a regime dominated by the neurons with higher firing activity to a phase where all neurons are effectively sub-threshold and their irregular firing is driven by current fluctuations. We explain the origin of the transition by deriving an analytic mean-field formulation of the problem that provides the fraction of active neurons as well as the first two moments of their firing statistics. The introduction of a synaptic time scale does not modify the main aspects of the reported phenomenon. However, for sufficiently slow synapses the transition becomes dramatic: the system passes from a perfectly regular evolution to irregular bursting dynamics. In this latter regime the model provides predictions consistent with experimental findings for a specific class of neurons, namely the medium spiny neurons in the striatum.
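The central observable here, the fraction of neurons that fire at all in a sparse pulse-coupled inhibitory network with heterogeneous drives, can be sketched with a toy leaky integrate-and-fire model; the network and every parameter below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def active_fraction(g_inh, N=100, K=10, T=200.0, dt=0.1, tau=10.0):
    """Fraction of neurons that emit at least one spike within time T,
    given inhibitory pulse strength g_inh and in-degree K."""
    I = rng.uniform(0.9, 1.3, N)     # heterogeneous drives, some sub-threshold
    # sparse random connectivity: each neuron receives K inhibitory inputs
    pre = np.array([rng.choice(np.delete(np.arange(N), i), K, replace=False)
                    for i in range(N)])
    v = rng.uniform(0.0, 1.0, N)
    ever_fired = np.zeros(N, bool)
    fired = np.zeros(N, bool)
    for _ in range(int(T / dt)):
        inh = g_inh * fired[pre].sum(axis=1)  # pulses from presynaptic spikes
        v += dt / tau * (I - v) - inh
        fired = v >= 1.0
        ever_fired |= fired
        v[fired] = 0.0
    return float(ever_fired.mean())

f = active_fraction(g_inh=0.02)
print(f)
```

Sweeping `g_inh` in such a sketch and looking for a minimum in the active fraction is the numerical counterpart of the death-and-rebirth scenario the abstract analyzes with its mean-field theory.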
