
Optimal hierarchical modular topologies for producing limited sustained activation of neural networks

Added by Marcus Kaiser
Publication date: 2010
Fields: Biology, Physics
Language: English





An essential requirement for the representation of functional patterns in complex neural networks, such as the mammalian cerebral cortex, is the existence of stable regimes of network activation, typically arising from a limited parameter range. In this range of limited sustained activity (LSA), the activity of neural populations in the network persists between the extremes of either quickly dying out or activating the whole network. Hierarchical modular networks were previously found to show a wider parameter range for LSA than random or small-world networks not possessing hierarchical organization or multiple modules. Here we explored how variation in the number of hierarchical levels and modules per level influenced network dynamics and the occurrence of LSA. We tested hierarchical configurations of different network sizes, approximating the large-scale networks linking cortical columns in one hemisphere of the rat, cat, or macaque monkey brain. Scaling of the network size affected the number of hierarchical levels and modules in the optimal networks, depending also on whether the global edge density or the number of connections per node was kept constant. For constant edge density, only a few network configurations, possessing an intermediate number of levels and a large number of modules, led to a large range of LSA independent of brain size. For a constant number of node connections, there was a trend for optimal configurations in larger-size networks to possess a larger number of hierarchical levels or more modules. These results may help to explain the trend to greater network complexity apparent in larger brains and may indicate that this complexity is required for maintaining stable levels of neural activation.
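The construction and dynamics described above can be illustrated with a minimal sketch. The generator, the parameter names (`branching`, `decay`, `p_spread`), and the one-step probabilistic spreading rule below are illustrative assumptions, not the paper's exact model: nodes share denser connectivity the deeper the module they have in common, and activity spreads along edges with a fixed transmission probability.

```python
import itertools
import random

def hierarchical_modular_graph(levels, branching, leaf_size, p_base, decay, rng):
    """Build an undirected hierarchical modular graph.

    Every node gets a module path of length `levels`; two nodes connect
    with probability p_base * decay**d, where d counts the levels at which
    their paths diverge (d = 0 for nodes in the same bottom-level module),
    so connectivity is densest inside the deepest modules.
    """
    paths = list(itertools.product(range(branching), repeat=levels))
    nodes = [(path, i) for path in paths for i in range(leaf_size)]
    edges = set()
    for a in range(len(nodes)):
        for b in range(a + 1, len(nodes)):
            pa, pb = nodes[a][0], nodes[b][0]
            shared = 0
            for x, y in zip(pa, pb):
                if x != y:
                    break
                shared += 1
            d = levels - shared  # 0 = same leaf module
            if rng.random() < p_base * decay ** d:
                edges.add((a, b))
    return len(nodes), edges

def spreading_activity(n, edges, seed_frac, p_spread, steps, rng):
    """Probabilistic spreading: a node is active at t+1 if at least one
    neighbour active at t transmits to it (each attempt succeeds with
    probability p_spread). Returns the number of active nodes per step."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    active = {i for i in range(n) if rng.random() < seed_frac}
    history = [len(active)]
    for _ in range(steps):
        active = {j for i in active for j in adj[i] if rng.random() < p_spread}
        history.append(len(active))
    return history
```

In this toy setting, LSA corresponds to parameter choices for which the activity trace stays strictly between 0 and the network size over long runs, rather than dying out or saturating.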





It has recently been discovered that single-neuron stimulation can impact network dynamics in immature and adult neuronal circuits. Here we report a novel mechanism that can explain, in neuronal circuits at an early stage of development, the peculiar role played by a few specific neurons in promoting or arresting the population activity. For this purpose, we consider a standard neuronal network model, with short-term synaptic plasticity, whose population activity is characterized by bursting behavior. The addition of developmentally inspired constraints and correlations in the distribution of the neuronal connectivities and excitabilities leads to the emergence of functional hub neurons, whose stimulation or deletion is critical for the network activity. Functional hubs form a clique, where a precise sequential activation of the neurons is essential to ignite collective events without any need for a specific topological architecture. Unsupervised time-lagged firings of supra-threshold cells, in connection with coordinated entrainments of near-threshold neurons, are the key ingredients to orchestrate the population bursts.
Hyewon Kim, Meesoon Ha, 2017
We propose dynamic scaling in temporal networks with heterogeneous activities and memory, and provide a comprehensive picture for the dynamic topologies of such networks, in terms of the modified activity-driven network model [H. Kim et al., Eur. Phys. J. B 88, 315 (2015)]. In particular, we focus on the interplay of the time resolution and memory in dynamic topologies. Through the random walk (RW) process, we investigate diffusion properties and topological changes as the time resolution increases. Our results with memory are compared to those of the memoryless case. Based on the temporal percolation concept, we derive scaling exponents in the dynamics of the largest cluster and the coverage of the RW process in time-varying networks. We find that the time resolution in the time-accumulated network determines the effective size of the network, while memory affects relevant scaling properties at the crossover from the dynamic regime to the static one. The origin of memory-dependent scaling behaviors is the dynamics of the largest cluster, which depends on temporal degree distributions. Finally, we conjecture an extended finite-size scaling ansatz for dynamic topologies and a fundamental property of temporal networks, which are numerically confirmed.
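The ingredients of this setup — activity-driven snapshots, a tunable memory bias toward previously contacted partners, and a random walk over the snapshot sequence — can be sketched as follows. The function names, the single `memory` parameter, and the uniform fallback rule are simplifying assumptions, not the cited model's exact definition:

```python
import random

def activity_driven_snapshots(n, activities, m, steps, memory, rng):
    """Snapshot sequence of an activity-driven temporal network.

    At each step, node i activates with probability activities[i] and draws
    m partners: with probability `memory` from previously contacted nodes
    (if any), otherwise uniformly at random (memory = 0 is memoryless)."""
    contacted = {i: set() for i in range(n)}
    snapshots = []
    for _ in range(steps):
        edges = set()
        for i in range(n):
            if rng.random() >= activities[i]:
                continue
            for _ in range(m):
                if contacted[i] and rng.random() < memory:
                    j = rng.choice(sorted(contacted[i]))
                else:
                    j = rng.randrange(n)
                if j != i:
                    edges.add((min(i, j), max(i, j)))
                    contacted[i].add(j)
                    contacted[j].add(i)
        snapshots.append(edges)
    return snapshots

def rw_coverage(snapshots, n, start, rng):
    """Random walk hopping along each snapshot in time order; returns the
    fraction of distinct nodes visited (the coverage of the walk)."""
    pos, visited = start, {start}
    for edges in snapshots:
        nbrs = [b if a == pos else a for a, b in edges if pos in (a, b)]
        if nbrs:
            pos = rng.choice(nbrs)
            visited.add(pos)
    return len(visited) / n
```

Aggregating several consecutive snapshots into one graph before running the walk mimics coarsening the time resolution, the knob whose interplay with memory the abstract analyzes.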
We consider a sparse random network of excitatory leaky integrate-and-fire neurons with short-term synaptic depression. Furthermore, to mimic the dynamics of a brain circuit in its first stages of development, we introduce for each neuron correlations between in-degree and out-degree, as well as between excitability and the corresponding total degree. We analyze the influence of single-neuron stimulation and deletion on the collective dynamics of the network. We show the existence of a small group of neurons capable of controlling and even silencing the bursting activity of the network. These neurons form a functional clique, since only their activation in a precise order and within specific time windows is capable of igniting population bursts.
We study the storage of multiple phase-coded patterns as stable dynamical attractors in recurrent neural networks with sparse connectivity. To determine the synaptic strength of existent connections and store the phase-coded patterns, we introduce a learning rule inspired by spike-timing-dependent plasticity (STDP). We find that, after learning, the spontaneous dynamics of the network replays one of the stored dynamical patterns, depending on the network initialization. We study the network capacity as a function of topology, and find that a small-world-like topology may be optimal, as a compromise between the high wiring cost of long-range connections and the capacity increase.
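A minimal sketch of storing phase-coded patterns on a sparse connectivity: below, the STDP-inspired kernel is replaced by a simplified Hebbian phase rule (an assumption, not the paper's learning rule), where each existing connection accumulates the phase lag between post- and presynaptic units across the stored patterns.

```python
import cmath
import math
import random

def phase_pattern_couplings(patterns, connections):
    """Hebbian-style rule for phase-coded patterns.

    Each existing connection j -> i gets the complex coupling
    J_ij = (1/P) * sum_mu exp(1j * (phi_mu_i - phi_mu_j)),
    which is largest in magnitude when the pre/post phase lag is
    consistent across all P stored patterns."""
    P = len(patterns)
    return {(i, j): sum(cmath.exp(1j * (phi[i] - phi[j])) for phi in patterns) / P
            for (i, j) in connections}
```

With a single stored pattern every coupling has unit magnitude; as more (random) patterns are stored, interference shrinks the couplings, which is the intuition behind a finite storage capacity.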
In this paper, we clarify the mechanisms underlying a general phenomenon present in pulse-coupled heterogeneous inhibitory networks: inhibition can induce not only suppression of the neural activity, as expected, but it can also promote neural reactivation. In particular, for globally coupled systems, the number of firing neurons monotonically decreases upon increasing the strength of inhibition (neuron death). However, the random pruning of the connections is able to reverse the action of inhibition, i.e., in a sparse network a sufficiently strong synaptic strength can surprisingly promote, rather than depress, the activity of the neurons (neuron rebirth). Thus the number of firing neurons exhibits a minimum at some intermediate synaptic strength. We show that this minimum signals a transition from a regime dominated by the neurons with higher firing activity to a phase where all neurons are effectively sub-threshold and their irregular firing is driven by current fluctuations. We explain the origin of the transition by deriving an analytic mean-field formulation of the problem that provides the fraction of active neurons as well as the first two moments of their firing statistics. The introduction of a synaptic time scale does not modify the main aspects of the reported phenomenon. However, for sufficiently slow synapses the transition becomes dramatic: the system passes from a perfectly regular evolution to an irregular bursting dynamics. In this latter regime the model provides predictions consistent with experimental findings for a specific class of neurons, namely the medium spiny neurons in the striatum.
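The basic experiment — counting how many neurons in a sparse pulse-coupled inhibitory network fire at all, as the inhibition strength g varies — can be sketched with a simple Euler-integrated LIF model. The parameter values, the uniform spread of supra-threshold drives, and the instantaneous one-step synaptic kick are illustrative assumptions, not the paper's exact setup:

```python
import random

def count_active_neurons(n, k, g, steps, dt, rng):
    """Sparse pulse-coupled inhibitory LIF network, Euler-integrated.

    Each neuron obeys dv/dt = a_i - v with supra-threshold drive a_i > 1;
    at v >= 1 it resets to 0 and delivers an inhibitory kick of size -g/k
    to its k random targets on the next step. Returns the number of
    neurons that fired at least once (the 'active' neurons)."""
    a = [1.05 + 0.10 * rng.random() for _ in range(n)]  # heterogeneous drives
    targets = [rng.sample([j for j in range(n) if j != i], k) for i in range(n)]
    v = [rng.random() for _ in range(n)]
    kick = [0.0] * n
    fired = [False] * n
    for _ in range(steps):
        new_kick = [0.0] * n
        for i in range(n):
            v[i] += dt * (a[i] - v[i]) + kick[i]
            if v[i] >= 1.0:
                v[i] = 0.0
                fired[i] = True
                for j in targets[i]:
                    new_kick[j] -= g / k
        kick = new_kick
    return sum(fired)
```

Sweeping g and plotting the returned count against it is the kind of measurement in which the abstract's minimum (neuron death followed by rebirth) would show up; with g = 0 every supra-threshold neuron fires.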
