
Synchronization in networks with heterogeneous adaptation rules and applications to distance-dependent synaptic plasticity

Added by Rico Berner
Publication date: 2021
Field: Physics
Language: English





This work introduces a methodology for studying synchronization in adaptive networks with heterogeneous plasticity (adaptation) rules. As a paradigmatic model, we consider a network of adaptively coupled phase oscillators with distance-dependent adaptation. For this system, we extend the master stability function approach to adaptive networks with heterogeneous adaptation. Our method allows for separating the contributions of network structure, local node dynamics, and heterogeneous adaptation in determining synchronization. Utilizing the proposed methodology, we explain the mechanisms by which enhanced long-range connections lead to synchronization or desynchronization in nonlocally coupled ring networks and in networks with Gaussian distance-dependent coupling weights equipped with a biologically motivated plasticity rule.
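
As a rough illustration of the kind of system studied here, the following is a minimal numerical sketch of a ring of adaptively coupled phase oscillators with a distance-dependent adaptation rule. The specific coupling and adaptation functions, the Gaussian distance profile for the adaptation offset, and all parameter values are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Ring of N adaptively coupled phase oscillators (illustrative parameters).
N = 50
eps = 0.01                   # slow adaptation rate (time-scale separation)
alpha = 0.2 * np.pi          # phase lag of the coupling function
omega = np.zeros(N)          # identical natural frequencies

# Distance-dependent adaptation offset beta_ij: pairs at different ring
# distances follow different plasticity rules (assumed Gaussian profile).
idx = np.arange(N)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, N - dist)                    # distance on the ring
beta = -0.5 * np.pi + np.pi * np.exp(-(dist / (0.2 * N)) ** 2)

rng = np.random.default_rng(0)
phi = rng.uniform(0, 2 * np.pi, N)                   # initial phases
kappa = rng.uniform(-1, 1, (N, N))                   # initial coupling weights
np.fill_diagonal(kappa, 0.0)

dt, steps = 0.05, 20000
for _ in range(steps):
    dphi = phi[:, None] - phi[None, :]               # phase differences phi_i - phi_j
    phi = phi + dt * (omega - (kappa * np.sin(dphi + alpha)).mean(axis=1))
    kappa = kappa + dt * (-eps) * (kappa + np.sin(dphi + beta))   # slow adaptation
    np.fill_diagonal(kappa, 0.0)

# Kuramoto order parameter as a crude synchronization measure.
R = np.abs(np.exp(1j * phi).mean())
print(f"order parameter R = {R:.3f}")
```

The printed Kuramoto order parameter only gives a crude indication of whether the adapted network ends up synchronized; the paper's master stability function analysis addresses this question analytically.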




Read More

We investigate the dynamical role of inhibitory and highly connected nodes (hubs) in the synchronization and input processing of leaky integrate-and-fire neural networks with short-term synaptic plasticity. We take advantage of a heterogeneous mean-field approximation to encode the role of network structure, and we tune the fraction of inhibitory neurons $f_I$ and their connectivity level to investigate the cooperation between hub features and inhibition. We show that, depending on $f_I$, highly connected inhibitory nodes strongly drive the synchronization properties of the overall network through dynamical transitions from synchronous to asynchronous regimes. Furthermore, a metastable regime with long memory of external inputs emerges for a specific fraction of hub inhibitory neurons, underlining the role of inhibition and connectivity also for input processing in neural networks.
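
The following is a minimal sketch, in the spirit of the setup described above, of a leaky integrate-and-fire network in which a fraction $f_I$ of the neurons is inhibitory and those neurons are wired as hubs via a higher connection probability. The short-term synaptic plasticity and the heterogeneous mean-field reduction used in the study are not reproduced; the parameters and the synchrony measure are illustrative assumptions.

```python
import numpy as np

# LIF network with a fraction f_I of inhibitory neurons wired as hubs
# (higher out-connection probability). Illustrative parameters only.
rng = np.random.default_rng(1)
N, f_I = 400, 0.2
n_inh = int(f_I * N)
is_inh = np.zeros(N, dtype=bool)
is_inh[:n_inh] = True

p_exc, p_hub = 0.05, 0.25                  # connection probabilities
p_out = np.where(is_inh, p_hub, p_exc)     # inhibitory neurons project more widely
W = (rng.random((N, N)) < p_out[None, :]).astype(float)   # column j = outputs of neuron j
W[:, is_inh] *= -4.0                       # inhibitory synapses are negative and stronger
W *= 0.1                                   # overall synaptic scale
np.fill_diagonal(W, 0.0)

# LIF parameters (dimensionless units, supra-threshold drive).
dt, T, tau, v_th, v_reset, I_ext = 0.1, 2000.0, 20.0, 1.0, 0.0, 1.05
v = rng.random(N)
n_steps = int(T / dt)
spikes = np.zeros((n_steps, N), dtype=bool)

for t in range(n_steps):
    fired = v >= v_th
    spikes[t] = fired
    v[fired] = v_reset
    v += dt / tau * (-v + I_ext) + W @ fired.astype(float)   # leak, drive, recurrent input

# Crude synchrony index: variance of the population rate relative to its mean.
rate = spikes.mean(axis=1)
print(f"synchrony index = {rate.var() / (rate.mean() + 1e-12):.4f}")
```

Sweeping f_I and p_hub in such a sketch is one simple way to explore how inhibitory hubs shift the network between more and less synchronous regimes.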
Rhythmic activity has been associated with a wide range of cognitive processes. Previous studies have shown that spike-timing-dependent plasticity (STDP) can facilitate the transfer of rhythmic activity downstream along the information-processing pathway. However, STDP is also known to generate strong winner-take-all-like competition between subgroups of correlated synaptic inputs. Consequently, one might expect STDP to induce strong competition between different rhythmicity channels, thus preventing the multiplexing of information across different frequency channels. This study explored whether STDP facilitates the multiplexing of information across multiple frequency channels, and if so, under what conditions. We investigated the STDP dynamics in the framework of a model consisting of two competing subpopulations of neurons that synapse in a feedforward manner onto a single postsynaptic neuron. Each subpopulation was assumed to oscillate independently and in a different frequency band. To investigate the STDP dynamics, a mean-field Fokker-Planck theory was developed in the limit of a slow learning rate. Surprisingly, our theory predicted limited interactions between the different subgroups. Our analysis further revealed that the interaction between these channels was mainly mediated by the shared component of the mean activity. Next, we generalized these results beyond the simplistic model using numerical simulations. We found that, for a wide range of parameters, the system converged to a solution in which the postsynaptic neuron responded to both rhythms. Nevertheless, all the synaptic weights remained dynamic and did not converge to a fixed point. These findings imply that STDP can support the multiplexing of rhythmic information and demonstrate how functionality can be retained in the face of continuous remodeling of all the synaptic weights.
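
A minimal sketch of the kind of setup described above: two groups of inputs with sinusoidally modulated Poisson rates at different frequencies converge on a single leaky integrate-and-fire neuron equipped with trace-based pairwise STDP. The neuron model, the STDP implementation, and all parameter values are illustrative assumptions rather than the model analysed in the study.

```python
import numpy as np

# Two rhythmic input channels (different modulation frequencies) converge on a
# single LIF neuron with trace-based pairwise STDP. Illustrative parameters.
rng = np.random.default_rng(2)
dt, n_steps = 1.0, 200_000                 # ms per step, total steps
n_per_group = 50
freqs = (5.0, 10.0)                        # Hz, one rhythm per channel
rate0 = 10.0                               # Hz, mean input rate

n_in = 2 * n_per_group
group = np.repeat([0, 1], n_per_group)
w = np.full(n_in, 0.5)                     # synaptic weights, kept in [0, 1]

# Additive STDP with exponential windows and a slight depression bias.
A_plus, A_minus, tau_plus, tau_minus = 0.005, 0.00525, 20.0, 20.0
x_pre, x_post = np.zeros(n_in), 0.0        # pre- and postsynaptic traces

tau_m, v_th, v = 20.0, 1.0, 0.0            # LIF postsynaptic neuron

for t in range(n_steps):
    time_ms = t * dt
    mod = np.where(group == 0,
                   np.sin(2 * np.pi * freqs[0] * time_ms / 1000.0),
                   np.sin(2 * np.pi * freqs[1] * time_ms / 1000.0))
    rate = rate0 * (1.0 + 0.8 * mod)                       # Hz, per input
    pre_spk = rng.random(n_in) < rate * dt / 1000.0        # Bernoulli spike draw

    v += dt / tau_m * (-v) + 0.1 * np.sum(w * pre_spk)     # membrane update
    post_spk = v >= v_th
    if post_spk:
        v = 0.0

    # Update traces and apply pair-based STDP.
    x_pre = x_pre * np.exp(-dt / tau_plus) + pre_spk
    x_post = x_post * np.exp(-dt / tau_minus) + post_spk
    w += A_plus * x_pre * post_spk - A_minus * x_post * pre_spk
    np.clip(w, 0.0, 1.0, out=w)

print("mean weight, channel 0:", w[group == 0].mean())
print("mean weight, channel 1:", w[group == 1].mean())
```

Tracking the mean weight of each channel over time is a simple way to see whether one rhythm outcompetes the other or whether both are retained, which is the multiplexing question posed above.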
We show that the local spike-timing-dependent plasticity (STDP) rule has the effect of regulating the trans-synaptic weights of loops of any length within a simulated network of neurons. We show that, depending on STDP's polarity, functional loops are formed or eliminated in networks driven to normal spiking conditions by random, partially correlated inputs, where functional loops comprise weights that exceed a non-zero threshold. We further prove that STDP is a form of loop-regulating plasticity for the case of a linear network comprising random weights drawn from certain distributions. Thus a notable local synaptic learning rule makes a specific prediction about synapses in the brain in which standard STDP is present: that under normal spiking conditions, they should participate in predominantly feed-forward connections at all scales. Our model implies that any deviations from this prediction would require a substantial modification to the hypothesized role for standard STDP. Given its widespread occurrence in the brain, we predict that STDP could also regulate long-range synaptic loops among individual neurons across all brain scales, up to, and including, the scale of global brain network topology.
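
As a small illustration of the bookkeeping implied by "functional loops", the sketch below thresholds a weight matrix and counts closed walks of each length through traces of powers of the resulting adjacency matrix. This is only one possible diagnostic, applied to assumed random weights; the loop measure used in the study may be defined differently.

```python
import numpy as np

def loop_counts(W, threshold, max_len=5):
    """Count closed walks of length 1..max_len in the thresholded weight graph."""
    A = (W > threshold).astype(float)      # keep only supra-threshold ("functional") edges
    np.fill_diagonal(A, 0.0)               # ignore self-connections
    counts, Ak = {}, np.eye(len(A))
    for k in range(1, max_len + 1):
        Ak = Ak @ A
        counts[k] = np.trace(Ak)           # trace of A^k = number of closed walks of length k
    return counts

rng = np.random.default_rng(3)
W = rng.random((100, 100)) * 0.2           # hypothetical synaptic weight matrix
print(loop_counts(W, threshold=0.15))
```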
Latency reduction of postsynaptic spikes is a well-known effect of spike-timing-dependent plasticity (STDP). We extend this notion to long postsynaptic spike trains, showing that, for a fixed input spike train, STDP reduces the number of postsynaptic spikes and concentrates the remaining ones. We then study the consequences of this phenomenon in terms of coding, finding that the mechanism improves the neural code by increasing the signal-to-noise ratio and lowering the metabolic cost of frequent stimuli. Finally, we illustrate how the reduction of postsynaptic latencies can lead to the emergence of predictions.
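
A minimal sketch of the latency-reduction effect itself: a fixed input spike pattern is presented repeatedly to a leaky integrate-and-fire neuron, and an additive pair-based STDP rule shifts weight onto the earliest inputs, so the first postsynaptic spike moves earlier. The neuron, the STDP window, and all parameters are illustrative assumptions, not the model used in the study.

```python
import numpy as np

# A fixed 100 ms input pattern is presented repeatedly to a LIF neuron; additive
# STDP potentiates inputs firing before the postsynaptic spike and depresses the
# rest, so the first-spike latency shrinks. Illustrative parameters.
rng = np.random.default_rng(4)
n_in, T, dt = 100, 100.0, 1.0
spike_times = np.sort(rng.uniform(0, T, n_in))      # one fixed spike per input
w = np.full(n_in, 0.5)
tau_m, v_th, syn_scale = 10.0, 1.0, 0.35
A_plus, A_minus, tau_stdp = 0.02, 0.021, 20.0

def first_latency(w):
    """Time of the first postsynaptic spike for the fixed input pattern."""
    v = 0.0
    for t in np.arange(0.0, T, dt):
        active = np.abs(spike_times - t) < dt / 2    # inputs spiking in this time bin
        v += dt / tau_m * (-v) + syn_scale * np.sum(w[active])
        if v >= v_th:
            return t
    return np.inf

print("first-spike latency before learning [ms]:", first_latency(w))

for _ in range(200):                                 # repeated pattern presentations
    t_post = first_latency(w)
    if np.isfinite(t_post):
        lag = t_post - spike_times                   # > 0: input fired before the post spike
        w += np.where(lag > 0,
                      A_plus * np.exp(-lag / tau_stdp),     # potentiate earlier inputs
                      -A_minus * np.exp(lag / tau_stdp))    # depress later inputs
        np.clip(w, 0.0, 1.0, out=w)

print("first-spike latency after learning  [ms]:", first_latency(w))
```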
We report a detailed analysis of the emergence of bursting in a recently developed neural mass model that takes short-term synaptic plasticity into account. The model used here is particularly important, as it represents an exact mean-field limit of synaptically coupled quadratic integrate-and-fire neurons, a canonical model for type I excitability. In the absence of synaptic dynamics, a periodic external current with a slow frequency $\epsilon$ can lead to burst-like dynamics. The firing patterns can be understood using techniques of singular perturbation theory, specifically slow-fast dissection. In the model with synaptic dynamics, the separation of timescales leads to a variety of slow-fast phenomena, and their role in bursting becomes considerably more intricate. Canards are one of the main slow-fast elements on the route to bursting. They describe trajectories evolving near otherwise repelling, locally invariant sets of the system and are found in the transition region from subthreshold dynamics to bursting. For values of the timescale separation near the singular limit $\epsilon = 0$, we report peculiar jump-on canards, which block a continuous transition to bursting. In the biologically more plausible regime of $\epsilon$, this transition becomes continuous, and bursts emerge via consecutive spike-adding transitions. The onset of bursting is of a complex nature and involves mixed-type-like torus canards, which form the very first spikes of the burst and revolve near repelling limit cycles of the fast subsystem. We provide numerical evidence that the same mechanisms are responsible for the emergence of bursting in the quadratic integrate-and-fire network with plastic synapses. The main conclusions apply to the network, owing to the exactness of the mean-field limit.
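
For orientation, the sketch below integrates a Montbrió-Pazó-Roxin-type exact mean-field model of quadratic integrate-and-fire neurons under a slow periodic drive, which is the kind of neural mass model referred to above. The short-term-plasticity terms analysed in the study are not included, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Montbrio-Pazo-Roxin-type exact mean-field of quadratic integrate-and-fire
# neurons (population rate r, mean membrane potential v) under a slow periodic
# drive. Short-term plasticity is omitted; parameters are illustrative.
Delta, eta_bar, J = 1.0, -5.0, 15.0      # heterogeneity, mean excitability, coupling
A_drive, eps = 3.0, 0.01                 # amplitude and slow frequency of the forcing

def rhs(t, y):
    r, v = y
    I_t = A_drive * np.sin(eps * t)                       # slow external current
    dr = Delta / np.pi + 2.0 * r * v                      # rate equation
    dv = v**2 + eta_bar + J * r + I_t - (np.pi * r)**2    # mean-potential equation
    return np.array([dr, dv])

# Fixed-step RK4 integration.
dt, T = 0.01, 3000.0
y = np.array([0.1, -2.0])
r_trace = []
for step in range(int(T / dt)):
    t = step * dt
    k1 = rhs(t, y)
    k2 = rhs(t + dt / 2, y + dt / 2 * k1)
    k3 = rhs(t + dt / 2, y + dt / 2 * k2)
    k4 = rhs(t + dt, y + dt * k3)
    y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    r_trace.append(y[0])

print("max population rate:", max(r_trace))
```

Plotting r_trace against time shows how the slow drive sweeps the population between low- and high-activity episodes, the burst-like behaviour whose fine structure the study dissects with slow-fast and canard analysis.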