
Pattern formation in oscillatory complex networks consisting of excitable nodes

Added by Xuhong Liao
Publication date: 2010
Fields: Physics, Biology
Language: English





Oscillatory dynamics of complex networks has recently attracted great attention. In this paper we study pattern formation in oscillatory complex networks consisting of excitable nodes. We find that most oscillations are organized around a few center nodes and small skeletons: complicated and seemingly random oscillatory patterns can be viewed as well-organized target waves propagating from the center nodes along the shortest paths, and the shortest loops passing through both the center nodes and their driver nodes play the role of oscillation sources. By analyzing these simple skeletons, we are able to understand and predict various essential properties of the oscillations and to modulate the oscillations effectively. These methods and results give insight into pattern formation in complex networks and provide suggestive ideas for studying and controlling oscillations in neural networks.
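As an illustration of the kind of dynamics described in the abstract, the sketch below runs a minimal three-state excitable automaton (rest, excited, refractory) on a sparse random directed network. It is not the continuous model used in the paper; the network size, connection probability, refractory length and update rule are all illustrative assumptions, intended only to show how excitation can propagate and recirculate along loops of excitable nodes.

    # Toy sketch, NOT the paper's model: a discrete excitable automaton
    # (rest -> excited -> refractory -> rest) on a random directed network.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    N, p_link, refractory_steps, T = 100, 0.03, 3, 200

    g = nx.gnp_random_graph(N, p_link, seed=0, directed=True)
    state = np.zeros(N, dtype=int)                     # 0 = rest, 1 = excited, >1 = refractory
    state[rng.choice(N, size=5, replace=False)] = 1    # seed a few excited nodes

    activity = []
    for t in range(T):
        new_state = state.copy()
        for node in range(N):
            if state[node] == 0:
                # a resting node fires if any in-neighbour is currently excited
                if any(state[src] == 1 for src in g.predecessors(node)):
                    new_state[node] = 1
            elif state[node] == 1:
                new_state[node] = 2                    # enter the refractory period
            else:
                # advance through refractory states 2..(refractory_steps+1), then back to rest
                new_state[node] = (state[node] + 1) % (refractory_steps + 2)
        state = new_state
        activity.append(int(np.sum(state == 1)))

    print("excited nodes per step:", activity[:20])

Whether the seeded activity dies out or keeps circulating depends on whether it reaches a directed loop whose length exceeds the refractory period, which is the intuition behind treating short loops as oscillation sources.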




Read More

Dynamical patterns in complex networks of coupled oscillators are of both theoretical and practical interest, yet fully revealing and understanding the interplay between pattern emergence and network structure remains an outstanding problem. A fundamental issue is the effect of network structure on the stability of the patterns. We address this issue in a setting where random links are systematically added to a regular lattice, focusing on the dynamical evolution of spiral wave patterns. As the network structure deviates further from the regular topology (so that it becomes increasingly more complex), the original stable spiral wave pattern can disappear and a different type of pattern can emerge. Our main findings are the following. (1) Short-distance links added to a small region containing the spiral tip can have a more significant effect on the wave pattern than long-distance connections. (2) As more random links are introduced into the network, distinct pattern transitions can occur, including the transition of the spiral wave to global synchronization, to a chimera-like state, and then to a pinned spiral wave. (3) Around the transitions the network dynamics is highly sensitive to small variations in the network structure, in the sense that the addition of even a single link can change the pattern from one type to another. These findings provide insights into pattern dynamics in complex networks, a problem relevant to many physical, chemical, and biological systems.
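A rough sketch of the network construction described above, assuming a square lattice and purely random shortcut endpoints (the actual procedure and parameters in the paper may differ): random links are added to a regular two-dimensional lattice, and each shortcut's Euclidean length and whether it touches a small hypothetical "tip region" around the lattice centre are recorded, since these are the two properties the findings above single out.

    # Sketch (assumed construction): regular lattice plus random shortcut links.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    L, n_shortcuts, tip_radius = 50, 40, 5

    g = nx.grid_2d_graph(L, L)                 # regular lattice substrate
    centre = np.array([L / 2, L / 2])

    shortcuts = []
    while len(shortcuts) < n_shortcuts:
        a = tuple(int(v) for v in rng.integers(0, L, size=2))
        b = tuple(int(v) for v in rng.integers(0, L, size=2))
        if a != b and not g.has_edge(a, b):
            g.add_edge(a, b)
            length = np.linalg.norm(np.array(a) - np.array(b))
            near_tip = (np.linalg.norm(np.array(a) - centre) < tip_radius
                        or np.linalg.norm(np.array(b) - centre) < tip_radius)
            shortcuts.append((length, near_tip))

    short_links = sum(1 for d, _ in shortcuts if d < 10)
    tip_links = sum(1 for _, near in shortcuts if near)
    print(f"{short_links} short-range shortcuts, {tip_links} touching the tip region")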
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially on synapses from excitatory pyramidal cells, in hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex). Learning is assumed to occur in a phase of neural plasticity, in which the network is clamped to external teaching signals. By suitable manipulation of the nonlinearity of the neurons or of the oscillation frequencies during learning, the model can be made, in a retrieval phase, either to categorize new inputs or to map them, in a continuous fashion, onto the space spanned by the imprinted patterns. We identify the first of these possibilities with the function of olfactory cortex and the second with the observed response characteristics of place cells in hippocampus. We investigate both kinds of networks analytically and by computer simulations, and we link the models with experimental findings, exploring, in particular, how the spike timing dependence of the synaptic plasticity constrains the computational function of the network and vice versa.
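The abstract refers to spike-timing-dependent plasticity; the snippet below sketches a generic pair-based STDP window (exponential potentiation and depression) rather than the specific learning rule of this paper. The amplitudes and time constants are illustrative assumptions.

    # Generic pair-based STDP window, for illustration only.
    import numpy as np

    def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
        """Weight change for a pre/post spike pair with delta_t = t_post - t_pre (ms)."""
        if delta_t >= 0:
            return a_plus * np.exp(-delta_t / tau_plus)    # pre before post: potentiate
        return -a_minus * np.exp(delta_t / tau_minus)      # post before pre: depress

    # Apply the rule to all spike pairs between one pre- and one post-synaptic cell.
    pre_spikes = np.array([10.0, 35.0, 60.0])
    post_spikes = np.array([12.0, 30.0, 70.0])
    w = 0.5
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            w += stdp_dw(t_post - t_pre)
    print("updated weight:", round(w, 4))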
The collective dynamics of a network of excitable nodes changes dramatically when inhibitory nodes are introduced. We consider inhibitory nodes which may be activated just like excitatory nodes but, upon activating, decrease the probability of activation of network neighbors. We show that, although the direct effect of inhibitory nodes is to decrease activity, the collective dynamics becomes self-sustaining. We explain this counterintuitive result by defining and analyzing a branching function which may be thought of as an activity-dependent branching ratio. The shape of the branching function implies that for a range of global coupling parameters dynamics are self-sustaining. Within the self-sustaining region of parameter space lies a critical line along which dynamics take the form of avalanches with universal scaling of size and duration, embedded in ceaseless time series of activity. Our analyses, confirmed by numerical simulation, suggest that inhibition may play a counterintuitive role in excitable networks.
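As a toy illustration of the setup described above, the sketch below simulates a stochastic excitable network in which a fraction of nodes is inhibitory (their input lowers, rather than raises, a neighbour's activation probability) and estimates an empirical branching ratio: the number of active nodes at one step divided by the number at the previous step. The coupling scheme and all parameter values are assumptions made for illustration, not the paper's model.

    # Toy stochastic excitable network with excitatory and inhibitory nodes.
    import numpy as np

    rng = np.random.default_rng(2)
    N, frac_inhib, p, T = 500, 0.2, 0.12, 500
    k = 10                                                 # targets contacted per active node
    sign = np.where(rng.random(N) < frac_inhib, -1.0, 1.0) # -1 marks inhibitory nodes

    active = rng.random(N) < 0.05                          # initial activity
    ratios = []
    for t in range(T):
        drive = np.zeros(N)
        idx = np.flatnonzero(active)
        for i in idx:
            targets = rng.choice(N, size=k, replace=False)
            drive[targets] += sign[i] * p                  # inhibitory input is negative
        new_active = rng.random(N) < np.clip(drive, 0.0, 1.0)
        if idx.size > 0:
            ratios.append(new_active.sum() / idx.size)     # empirical branching ratio
        active = new_active
        if not active.any():
            break

    print("mean branching ratio:", np.mean(ratios) if ratios else "activity died out")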
We study the effect of varying wiring in excitable random networks in which connection weights change with activity, molding local resistance or facilitation due to fatigue. Dynamic attractors, corresponding to patterns of activity, are then easily destabilized according to three main modes, including one in which the activity shows chaotic hopping among the patterns. We describe phase transitions to this regime and show a monotonic dependence of the critical parameters on the heterogeneity of the wiring distribution. Such a correlation between topology and functionality implies, in particular, that tasks which require unstable behavior, such as pattern recognition, family discrimination and categorization, can be performed most efficiently on highly heterogeneous networks. A possible explanation also follows for the abundance in nature of scale-free network topologies.
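A minimal sketch, under assumed parameters, of the activity-dependent fatigue mechanism invoked above: each unit's outgoing efficacy is scaled by a resource variable that is depleted when the unit is active and slowly recovers otherwise. The update rule and constants are illustrative, not taken from the paper.

    # Illustrative short-term synaptic fatigue (depression) on top of given activity.
    import numpy as np

    rng = np.random.default_rng(3)
    N, T = 50, 100
    use_frac, recovery = 0.3, 0.05

    activity = rng.random((T, N)) < 0.1      # stand-in for network activity
    resource = np.ones(N)                    # 1 = fully recovered synapses
    effective_output = np.zeros((T, N))

    for t in range(T):
        effective_output[t] = activity[t] * resource      # fatigued units drive less
        resource -= use_frac * resource * activity[t]     # resource consumed on activity
        resource += recovery * (1.0 - resource)           # slow recovery toward 1

    print("mean resource after run:", resource.mean().round(3))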
In this work we study the combined action of chemical and electrical synapses in small networks of Hindmarsh-Rose (HR) neurons, focusing on the synchronous behaviour and on the rate of information produced (per time unit) by the networks. We show that if the chemical synapse is excitatory, the larger the chemical synapse strength used, the smaller the electrical synapse strength needed to achieve complete synchronisation, and for moderate synaptic strengths one should expect to find desynchronous behaviour. If instead the chemical synapse is inhibitory, the larger the chemical synapse strength used, the larger the electrical synapse strength needed to achieve complete synchronisation, and for moderate synaptic strengths one should expect to find synchronous behaviour. Finally, we show how to calculate semi-analytically an upper bound for the rate of information produced per time unit (the Kolmogorov-Sinai entropy) in larger networks. As an application, we show that this upper bound is linearly proportional to the number of neurons in a network whose neurons are highly connected.
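For concreteness, the sketch below integrates two Hindmarsh-Rose neurons coupled through both an electrical (diffusive) synapse and a sigmoidal chemical synapse, in the spirit of the setup described above. The HR parameters and the synapse form are common textbook choices, not necessarily those of the paper, and a plain Euler scheme is used for brevity.

    # Two Hindmarsh-Rose neurons with electrical and chemical coupling (sketch).
    import numpy as np

    a, b, c, d, s, x_rest, r, I_ext = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006, 3.25
    g_el, g_ch, V_syn, lam, theta = 0.05, 0.1, 2.0, 10.0, -0.25   # coupling (assumed values)
    dt, steps = 0.01, 50_000

    def hr_rhs(x, y, z, I_coupling):
        dx = y + b * x**2 - a * x**3 - z + I_ext + I_coupling
        dy = c - d * x**2 - y
        dz = r * (s * (x - x_rest) - z)
        return dx, dy, dz

    state = np.array([[-1.0, 0.0, 2.0], [-1.2, 0.1, 2.1]])   # two neurons: (x, y, z)
    for _ in range(steps):
        x = state[:, 0]
        gamma = 1.0 / (1.0 + np.exp(-lam * (x - theta)))     # presynaptic activation
        coupling = np.empty(2)
        for i in range(2):
            j = 1 - i
            electrical = g_el * (x[j] - x[i])
            chemical = -g_ch * (x[i] - V_syn) * gamma[j]      # excitatory chemical synapse
            coupling[i] = electrical + chemical
        for i in range(2):
            dx, dy, dz = hr_rhs(*state[i], coupling[i])
            state[i] += dt * np.array([dx, dy, dz])

    print("final membrane potentials:", state[:, 0].round(3))

Varying g_el and g_ch in a sketch like this is the natural way to probe the excitatory and inhibitory regimes contrasted in the abstract.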