
Average synaptic activity and neural networks topology: a global inverse problem

Added by Matteo di Volo
Publication date: 2013
Field: Physics
Language: English





The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are crucial for brain functioning. They strongly depend on the topology of the network and on the fluctuations of the connectivity. We propose a heterogeneous mean-field approach to neural dynamics on random networks that explicitly preserves the disorder in the topology at growing network sizes and leads to a set of self-consistent equations. Within this approach, we provide an effective description of microscopic and large-scale temporal signals in a leaky integrate-and-fire model with short-term plasticity, where quasi-synchronous events arise. Our equations provide a clear analytical picture of the dynamics, evidencing the contributions of both periodic (locked) and aperiodic (unlocked) neurons to the measurable average signal. In particular, we formulate and solve a global inverse problem of reconstructing the in-degree distribution from the knowledge of the average activity field. Our method is very general and applies to a large class of dynamical models on dense random networks.
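As a rough illustration of the setup the abstract describes, the sketch below simulates a leaky integrate-and-fire network with a simplified Tsodyks-Markram-style short-term synaptic plasticity on a dense directed random graph whose in-degrees are drawn from a prescribed distribution, and records the global average synaptic activity field that the inverse problem takes as input. All parameter values, the chosen in-degree distribution, and the simplified synaptic rule are illustrative assumptions, not the paper's actual equations.

import numpy as np

# Minimal sketch (not the authors' exact model): leaky integrate-and-fire
# neurons with a simplified short-term-plasticity rule on a dense directed
# random graph with a prescribed in-degree distribution.
rng = np.random.default_rng(0)

N = 500
k_in = rng.integers(int(0.6 * N), int(0.9 * N), size=N)   # assumed broad in-degree distribution

# Directed adjacency: neuron i receives from k_in[i] randomly chosen presynaptic neurons.
A = np.zeros((N, N))
for i in range(N):
    pre = rng.choice(np.delete(np.arange(N), i), size=k_in[i], replace=False)
    A[i, pre] = 1.0

dt, a, g = 0.01, 1.3, 8.0      # time step, external drive, coupling strength (assumed values)
tau_d, tau_f, U = 0.8, 1.5, 0.2

v = rng.uniform(0.0, 1.0, N)   # membrane potentials (threshold at 1, reset to 0)
x = np.ones(N)                 # available synaptic resources (depression variable)
u = np.full(N, U)              # utilization / facilitation variable

T = 2000
Y = np.zeros(T)                # global average synaptic activity field

for t in range(T):
    s = (v >= 1.0).astype(float)     # spike indicator for this step
    y = u * x * s                    # per-neuron synaptic output (simplified rule)
    field = (A @ y) / N              # input field with dense-network 1/N normalization
    Y[t] = y.mean()                  # the measurable average activity signal

    # Short-term plasticity: spike-triggered depression/facilitation plus slow recovery.
    x += dt * (1.0 - x) / tau_d - u * x * s
    u += -dt * u / tau_f + U * (1.0 - u) * s

    # Leaky integrate-and-fire update; neurons that just spiked are reset to 0.
    v = np.where(s > 0, 0.0, v + dt * (a - v + g * field))

print("time-averaged activity field:", Y.mean())

With the dense-network 1/N normalization used above, the input to a neuron is roughly its in-degree over N times the population-averaged synaptic activity. This is the kind of heterogeneous mean-field picture the abstract refers to: neurons are effectively parametrized by their normalized in-degree, and the disorder of the topology survives as the distribution of that parameter when the network grows.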




Related research

121 - D. Bolle, R. Heylen 2007
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved numerically. In both cases it is shown that, if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of retrieval, in particular the storage capacity, the basins of attraction and the mutual information content.
63 - D. Bolle, R. Heylen 2006
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of retrieval, in particular the storage capacity, the basins of attraction and the mutual information content.
Inverse phase transitions are striking phenomena in which an apparently more ordered state disorders under cooling. This behavior can naturally emerge in tricritical systems on heterogeneous networks and it is strongly enhanced by the presence of disassortative degree correlations. We show it both analytically and numerically, providing also a microscopic interpretation of inverse transitions in terms of freezing of sparse subgraphs and coupling renormalization.
Networks with fat-tailed degree distributions are omnipresent across many scientific disciplines. Such systems are characterized by so-called hubs, specific nodes with high numbers of connections to other nodes. By this property, they are expected to be key to the collective network behavior, e.g., in Ising models on such complex topologies. This applies in particular to the transition into a globally ordered network state, which thereby proceeds in a hierarchical fashion, and with a non-trivial local structure. Standard mean-field theory of Ising models on scale-free networks underrates the presence of the hubs, while nevertheless providing remarkably reliable estimates for the onset of global order. Here, we expose that a spurious self-feedback effect, inherent to mean-field theory, underlies this apparent paradox. More specifically, we demonstrate that higher order interaction effects precisely cancel the self-feedback on the hubs, and we expose the importance of hubs for the distinct onset of local versus global order in the network. Due to the generic nature of our arguments, we expect the mechanism that we uncover for the archetypal case of Ising networks of the Barabasi-Albert type to be also relevant for other systems with a strongly hierarchical underlying network structure.
65 - D. Bolle, R. Heylen 2004
For the retrieval dynamics of sparsely coded attractor associative memory models with synaptic noise the inclusion of a macroscopic time-dependent threshold is studied. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the memorized patterns, adapting itself automatically in the course of the time evolution, an autonomous functioning of the model is guaranteed. This self-control mechanism considerably improves the quality of the fixed-point retrieval dynamics, in particular the storage capacity, the basins of attraction and the mutual information content.
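Several of the entries above (Bolle and Heylen 2004, 2006, 2007) describe the same self-control mechanism: a macroscopic threshold that adapts during recall as a function of the cross-talk noise and of the pattern activity. The sketch below illustrates the idea on a sparsely coded Hopfield-type network; the covariance learning rule, the threshold form sqrt(-2 ln a) times an estimated cross-talk standard deviation, and the way that noise is estimated from the fields of currently silent neurons are all assumptions made for illustration, not the papers' exact prescription.

import numpy as np

rng = np.random.default_rng(1)
N, P, a = 2000, 40, 0.05               # neurons, stored patterns, pattern activity

xi = (rng.random((P, N)) < a).astype(float)        # sparse binary patterns
J = (xi - a).T @ (xi - a) / (a * (1 - a) * N)      # covariance (Hebb-like) couplings (assumed rule)
np.fill_diagonal(J, 0.0)

# Start from a corrupted version of pattern 0.
s = xi[0].copy()
flip = rng.random(N) < 0.1
s[flip] = 1.0 - s[flip]

for t in range(15):
    h = J @ s                                      # local fields
    # Crude, autonomous estimate of the cross-talk noise: spread of the fields
    # of the currently silent neurons (an assumption, not the papers' recipe).
    sigma = h[s == 0].std()
    theta = np.sqrt(-2.0 * np.log(a)) * sigma      # self-control threshold (assumed form)
    s = (h > theta).astype(float)
    overlap = ((xi[0] - a) * (s - a)).sum() / (a * (1 - a) * N)
    print(f"step {t:2d}  threshold {theta:.3f}  overlap {overlap:.3f}")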