Rhythmic electrical activity in the brain emerges from regular, non-trivial interactions between millions of neurons. Neurons are intricate cellular structures that transmit excitatory (or inhibitory) signals to other neurons, often non-locally, depending on the graded input they receive. Modelling this in full detail is demanding, which poses several problems when scaling beyond small clusters of neurons to large systems such as the whole brain. Building large populations from interconnected single-neuron models leads to an accumulation of exponentially many complexities, yielding simulations that are neither mathematically tractable nor transparent about the primary interactions underlying the emergent electrodynamical patterns seen in brain rhythms. A statistical-mechanics approach with non-local interactions can circumvent these issues while maintaining mathematical tractability. Neural field theory is a population-level approach to modelling large sections of neural tissue based on these principles. Herein we review key stages in the history and development of neural field theory and contemporary uses of this branch of mathematical neuroscience. We elucidate a mathematical framework in which neural field models can be derived, highlighting the many significant inherited assumptions in the current literature, so that their validity may be considered in light of further developments in both mathematical and experimental neuroscience.
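For orientation, a canonical single-population neural field model of the Amari/Wilson-Cowan type (a standard form that the review's own derivation and assumptions may refine or generalise) is the integro-differential equation
\[
\tau \frac{\partial u(\mathbf{x},t)}{\partial t} = -u(\mathbf{x},t) + \int_{\Omega} w(\mathbf{x},\mathbf{x}')\, f\!\big(u(\mathbf{x}',t)\big)\, \mathrm{d}\mathbf{x}' + I(\mathbf{x},t),
\]
where $u(\mathbf{x},t)$ is the mean activity (or membrane potential) of the population at cortical position $\mathbf{x}$, $w(\mathbf{x},\mathbf{x}')$ is the non-local synaptic connectivity kernel, $f$ is a sigmoidal firing-rate function, $I$ is external input, and $\tau$ is a synaptic or membrane time constant.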
Topographic maps are a brain structure connecting pre-synaptic and post-synaptic brain regions. Topographic development is dependent on Hebbian-based plasticity mechanisms working in conjunction with spontaneous patterns of neural activity generated
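As a point of reference (this generic form is not necessarily the specific plasticity mechanism studied in this work), a Hebbian-type rule updates the synaptic weight $w_{ij}$ from pre-synaptic neuron $j$ to post-synaptic neuron $i$ in proportion to correlated activity,
\[
\Delta w_{ij} = \eta\, y_i\, x_j,
\]
where $x_j$ and $y_i$ are the pre- and post-synaptic activity levels and $\eta$ is a learning rate; in practice a decay or normalisation term (as in Oja's rule) is typically added to keep the weights bounded.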
The theory of communication through coherence (CTC) proposes that brain oscillations reflect changes in the excitability of neurons, and that successful communication between two oscillating neural populations therefore depends not only on the strength of the anatomical connections between them but also on the phase relation between their oscillations.
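As an illustrative schematic (not a formulation taken from the CTC literature itself), suppose the excitability of the receiving population is modulated periodically, $E(t) = E_0\,[1 + m\cos(\omega t + \phi_r)]$, and a volley of pre-synaptic spikes arrives at phase $\phi_s$ of that cycle. Its effective impact then scales roughly as $1 + m\cos(\phi_s - \phi_r)$: the same anatomical connection transmits strongly when input arrives near the excitability peak and weakly half a cycle later, which is the phase dependence that CTC emphasises.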
In the last few years, deep learning has led to very good performance on a variety of problems, such as visual recognition, speech recognition and natural language processing. Among the different types of deep neural networks, convolutional neural networks have been the most extensively studied.
Over the past few years, adversarial training has become an extremely active research topic and has been successfully applied to various Artificial Intelligence (AI) domains. As a potentially crucial technique for the development of the next generation
Noise-induced population bursting has been widely recognized as playing an important role in information processing. We constructed a mathematical model of a random, sparse neural network in which bursting can be induced from the resting state by the global noise.
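For concreteness (a generic sketch under assumed ingredients, not the authors' specific model), a sparse random network driven by noise can be written as a system of leaky integrate-and-fire neurons,
\[
\tau \frac{dV_i}{dt} = -(V_i - V_{\mathrm{rest}}) + \sum_{j} J_{ij}\, s_j(t) + \sigma\, \xi_i(t),
\]
where $J_{ij}$ is a sparse random coupling matrix (each entry non-zero with small probability $p$), $s_j(t)$ is the spike train of neuron $j$, $\xi_i(t)$ is white noise of strength $\sigma$, and $V_i$ is reset after crossing a firing threshold; whether the noise can switch such a network from the quiescent resting state into collective bursting then depends on $p$, the coupling strength, and $\sigma$.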