
Development of Topographic Maps in Neural Field Theory with Short Time Scale Dependent Plasticity

Added by Nicholas Gale
Publication date: 2021
Field: Biology
Language: English





Topographic maps are a brain structure connecting pre-synaptic and post-synaptic brain regions. Topographic development depends on Hebbian plasticity mechanisms acting in conjunction with spontaneous patterns of neural activity generated in the pre-synaptic regions. Studies in mice have shown that these spontaneous patterns can exhibit complex spatiotemporal structures which existing models cannot incorporate. Neural field theories are an appropriate modelling paradigm for topographic systems because of the dense connectivity between regions, and they can be augmented with a plasticity rule general enough to capture complex time-varying structures. We propose a theoretical framework for studying the development of topography in the context of complex spatiotemporal activity fed forward from the pre-synaptic to the post-synaptic region. Analysis of the model yields an analytic solution corroborating the conclusion that activity can drive the refinement of topographic projections. The analysis also suggests that biological noise is used in the development of topography to stabilise the dynamics. MCMC simulations are used to analyse and understand the differences in topographic refinement between wild-type mice and the $\beta_2$ knock-out mutant. The time scale of the synaptic plasticity window is estimated as $0.56$ seconds in this context, with a model fit of $R^2 = 0.81$.
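The abstract's central ingredients, Hebbian refinement of feed-forward weights driven by pre-synaptic activity correlated through a short plasticity window, can be illustrated with a minimal sketch. This is not the paper's model: the network sizes, learning rate, activity statistics, and normalisation are all illustrative assumptions; only the window time scale ($\tau \approx 0.56$ s) comes from the abstract.

```python
import numpy as np

# Minimal sketch (not the paper's model): Hebbian update of feed-forward
# weights, with pre-synaptic activity low-pass filtered through an
# exponential plasticity window of time scale tau (~0.56 s per the abstract).
tau = 0.56            # plasticity window time scale (s), from the abstract
dt = 0.01             # integration step (s)
steps = int(2.0 / dt)
rng = np.random.default_rng(0)

n_pre, n_post = 20, 20
W = np.full((n_post, n_pre), 1.0 / n_pre)  # feed-forward weight matrix
trace = np.zeros(n_pre)                    # filtered pre-synaptic activity
eta = 0.05                                 # learning rate (assumed)

for _ in range(steps):
    pre = rng.random(n_pre) < 0.05           # toy spontaneous spiking
    trace += dt * (-trace / tau) + pre       # exponential plasticity window
    post = W @ trace                         # linear post-synaptic response
    W += eta * dt * np.outer(post, trace)    # Hebbian correlation term
    W /= W.sum(axis=1, keepdims=True)        # normalisation for stability

print(W.shape)
```

The normalisation step stands in for the stabilising role that, per the abstract, biological noise and competition play in the full model.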



Related research

We show that the local Spike Timing-Dependent Plasticity (STDP) rule has the effect of regulating the trans-synaptic weights of loops of any length within a simulated network of neurons. We show that depending on STDP's polarity, functional loops are formed or eliminated in networks driven to normal spiking conditions by random, partially correlated inputs, where functional loops comprise weights that exceed a non-zero threshold. We further prove that STDP is a form of loop-regulating plasticity for the case of a linear network comprising random weights drawn from certain distributions. Thus a notable local synaptic learning rule makes a specific prediction about synapses in the brain in which standard STDP is present: that under normal spiking conditions, they should participate in predominantly feed-forward connections at all scales. Our model implies that any deviations from this prediction would require a substantial modification to the hypothesized role for standard STDP. Given its widespread occurrence in the brain, we predict that STDP could also regulate long-range synaptic loops among individual neurons across all brain scales, up to, and including, the scale of global brain network topology.
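The standard pairwise STDP rule this abstract builds on can be written as a single function of the pre/post spike-time difference. The parameter values below are illustrative defaults, not taken from the paper.

```python
import numpy as np

# Standard pairwise STDP: potentiation when the pre-synaptic spike leads
# the post-synaptic spike, depression otherwise. Parameters are illustrative.
def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=0.02, tau_minus=0.02):
    """Weight change for one spike pair; delta_t = t_post - t_pre (s)."""
    if delta_t >= 0:
        return a_plus * np.exp(-delta_t / tau_plus)    # pre leads: LTP
    return -a_minus * np.exp(delta_t / tau_minus)      # post leads: LTD

print(stdp_dw(0.01) > 0, stdp_dw(-0.01) < 0)  # True True
```

Flipping the signs of `a_plus` and `a_minus` reverses the rule's polarity, the manipulation the abstract links to forming versus eliminating functional loops.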
Neural populations exposed to a certain stimulus learn to represent it better. However, the process that leads local, self-organized rules to do so is unclear. We address the question of how can a neural periodic input be learned and use the Differential Hebbian Learning framework, coupled with a homeostatic mechanism to derive two self-consistency equations that lead to increased responses to the same stimulus. Although all our simulations are done with simple Leaky-Integrate and Fire neurons and standard Spiking Time Dependent Plasticity learning rules, our results can be easily interpreted in terms of rates and population codes.
Coarse-graining microscopic models of biological neural networks to obtain mesoscopic models of neural activities is an essential step towards multi-scale models of the brain. Here, we extend a recent theory for mesoscopic population dynamics with static synapses to the case of dynamic synapses exhibiting short-term plasticity (STP). Under the assumption that spike arrivals at synapses have Poisson statistics, we derive analytically stochastic mean-field dynamics for the effective synaptic coupling between finite-size populations undergoing Tsodyks-Markram STP. The novel mean-field equations account for both finite number of synapses and correlations between the neurotransmitter release probability and the fraction of available synaptic resources. Comparisons with Monte Carlo simulations of the microscopic model show that in both feedforward and recurrent networks the mesoscopic mean-field model accurately reproduces stochastic realizations of the total synaptic input into a postsynaptic neuron and accounts for stochastic switches between Up and Down states as well as for population spikes. The extended mesoscopic population theory of spiking neural networks with STP may be useful for a systematic reduction of detailed biophysical models of cortical microcircuits to efficient and mathematically tractable mean-field models.
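The microscopic ingredient of the mean-field theory above, a Tsodyks-Markram synapse, can be sketched event-by-event for a single synapse. The update ordering and parameter values below follow one common convention and are illustrative; the paper works with population-level stochastic versions of these dynamics.

```python
import numpy as np

# Single-synapse Tsodyks-Markram short-term plasticity, event-driven:
# u facilitates toward U at each spike, x (available resources) depletes
# by the released fraction u*x and recovers between spikes.
def tm_release(spike_times, U=0.2, tau_f=0.6, tau_d=0.3):
    """Return the fraction of synaptic resources released at each spike."""
    u, x = U, 1.0
    t_prev, releases = None, []
    for t in spike_times:
        if t_prev is not None:
            dt = t - t_prev
            u = U + (u - U) * np.exp(-dt / tau_f)       # u decays back to U
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_d)   # x recovers to 1
        u = u + U * (1.0 - u)    # facilitation jump at spike arrival
        r = u * x                # released fraction of resources
        releases.append(r)
        x -= r                   # resource depletion
        t_prev = t
    return releases

print(tm_release([0.0, 0.05, 0.10, 0.15]))
```

The mean-field model in the abstract additionally tracks correlations between the release probability `u` and the resource fraction `x` across a finite population, which this single-synapse sketch ignores.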
Latency reduction of postsynaptic spikes is a well-known effect of Spike Timing-Dependent Plasticity (STDP). We extend this notion to long postsynaptic spike trains, showing that, for a fixed input spike train, STDP reduces the number of postsynaptic spikes and concentrates the remaining ones. We then study the consequences of this phenomenon for coding, finding that this mechanism improves the neural code by increasing the signal-to-noise ratio and lowering the metabolic cost of frequent stimuli. Finally, we illustrate that the reduction of postsynaptic latencies can lead to the emergence of predictions.
Rhythmic electrical activity in the brain emerges from regular non-trivial interactions between millions of neurons. Neurons are intricate cellular structures that transmit excitatory (or inhibitory) signals to other neurons, often non-locally, depending on the graded input from other neurons. Modelling this often requires extensive detail, which poses several issues for systems larger than clusters of neurons, such as the whole brain. Approaching large populations of neurons with interconnected single-neuron models accumulates exponentially many complexities, yielding a simulation that is not mathematically tractable and that obscures the primary interactions required for emergent electrodynamical patterns in brain rhythms. A statistical mechanics approach with non-local interactions may circumvent these issues while maintaining mathematical tractability. Neural field theory is a population-level approach to modelling large sections of neural tissue based on these principles. Herein we review key stages of the history and development of neural field theory and contemporary uses of this branch of mathematical neuroscience. We elucidate a mathematical framework in which neural field models can be derived, highlighting the many significant inherited assumptions in the current literature, so that their validity may be considered in light of further developments in both mathematical and experimental neuroscience.
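The population-level approach this review describes is typified by the Amari-type neural field equation, $\partial_t u = -u + \int w(x-y)\,f(u(y))\,dy + I(x)$. A forward-Euler discretisation on a ring makes the idea concrete; the Mexican-hat kernel, sigmoid gain, and input are illustrative choices, not taken from the review.

```python
import numpy as np

# Amari-type neural field on a periodic 1-D domain:
#   du/dt = -u + (w * f(u)) + I,  with convolutional connectivity w.
n = 128
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
dx = x[1] - x[0]

def kernel(d, a_e=1.0, s_e=0.3, a_i=0.5, s_i=1.0):
    # Mexican-hat connectivity: short-range excitation, broad inhibition
    return a_e * np.exp(-d**2 / (2 * s_e**2)) - a_i * np.exp(-d**2 / (2 * s_i**2))

# pairwise distances on the ring, then quadrature weights
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, 2 * np.pi - d)
W = kernel(d) * dx

f = lambda u: 1.0 / (1.0 + np.exp(-5.0 * (u - 0.2)))  # sigmoidal firing rate
I = 0.3 * np.exp(-x**2 / 0.1)                          # localised external input

u = np.zeros(n)
dt = 0.05
for _ in range(400):
    u += dt * (-u + W @ f(u) + I)   # forward-Euler time stepping

print(np.argmax(u))  # activity peak sits at the input centre (index 64)
```

Replacing millions of coupled single-neuron equations with one integro-differential equation for the activity field `u` is exactly the tractability gain the review attributes to neural field theory.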
