
Bistable firing pattern in a neural network model

Posted by: Paulo Protachevicz
Publication date: 2018
Research field: Biological Physics
Paper language: English





Excessively high neural synchronisation has been associated with epileptic seizures, one of the most common brain diseases worldwide. A better understanding of neural synchronisation mechanisms can thus help control or even treat epilepsy. In this paper, we study neural synchronisation in a random network where nodes are neurons with excitatory and inhibitory synapses, and neural activity for each node is provided by the adaptive exponential integrate-and-fire model. In this framework, we verify that decreasing the influence of inhibition can generate synchronisation out of a pattern of desynchronised spikes. The transition from desynchronised spikes to synchronous bursts of activity, induced by varying the synaptic coupling, emerges via a hysteresis loop due to bistability, in which abnormal (excessively synchronous) regimes exist. We verify that, for parameters in the bistable regime, a square current pulse can trigger excessively high (abnormal) synchronisation, a process that can reproduce features of epileptic seizures. Then, we show that it is possible to suppress such abnormal synchronisation by applying a small-amplitude external current to less than 10% of the neurons in the network. Our results demonstrate that external electrical stimulation not only can trigger synchronous behaviour but, more importantly, can be used as a means to reduce abnormal synchronisation and thus effectively control or treat epileptic seizures.
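The abstract does not reproduce the model equations or parameter values. For orientation, here is a minimal single-neuron sketch of the adaptive exponential integrate-and-fire (AdEx) model that provides the activity of each node, using illustrative parameter values taken from the AdEx literature rather than from the paper.

```python
import numpy as np

# Minimal adaptive exponential integrate-and-fire (AdEx) neuron, Euler integration.
# Parameter values are an illustrative "adapting" set from the AdEx literature,
# not necessarily those used in the paper.
C, g_L, E_L = 200.0, 12.0, -70.0        # pF, nS, mV
V_T, Delta_T = -50.0, 2.0               # mV
a, tau_w, b = 2.0, 300.0, 60.0          # nS, ms, pA
V_reset, V_peak = -58.0, 0.0            # mV

def simulate_adex(I_ext, dt=0.1, t_max=1000.0):
    """Integrate the AdEx equations for a constant input current I_ext (pA)."""
    n = int(t_max / dt)
    V, w = E_L, 0.0
    spikes, trace = [], np.empty(n)
    for i in range(n):
        dV = (-g_L * (V - E_L)
              + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
              - w + I_ext) / C
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                  # spike: reset membrane, increment adaptation
            V = V_reset
            w += b
            spikes.append(i * dt)
        trace[i] = V
    return np.array(spikes), trace

spike_times, V_trace = simulate_adex(I_ext=500.0)
print(f"{len(spike_times)} spikes in 1 s of simulated time")
```

In the network model, each neuron would additionally receive excitatory and inhibitory synaptic currents from its neighbours; the square pulse and the small-amplitude suppressive current described in the abstract would enter the same way, as time-dependent terms added to I_ext.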


Read also

Brain networks are remarkably cost-efficient, yet the fundamental physical and dynamical mechanisms underlying this economical optimisation of network structure and activity are not clear. Here we study the intricate, cost-efficient interplay between structure and dynamics in biologically plausible spatial modular neuronal network models. We find that critical avalanche states arising from excitation-inhibition balance, under a modular network topology with lower wiring cost, can also achieve lower firing costs, yet with strongly enhanced response sensitivity to stimuli. We derive mean-field equations that govern the macroscopic network dynamics through a novel approximate theory. The mechanism of low firing cost and stronger response in the form of critical avalanches is explained as proximity to a Hopf bifurcation of the modules as their connection density increases. Our work reveals a generic mechanism underlying the cost-efficient modular organization and critical dynamics widely observed in neural systems, providing insights into brain-inspired efficient computational designs.
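The mean-field equations derived in that work are not reproduced in the abstract. As a generic illustration of how increasing a module's recurrent connection strength can carry it through a Hopf bifurcation into oscillatory dynamics, the sketch below scans the recurrent excitatory coupling of a minimal Wilson-Cowan-style excitatory-inhibitory rate model; all parameters are illustrative assumptions unrelated to that paper's network.

```python
import numpy as np

# Minimal Wilson-Cowan-style excitatory-inhibitory rate model. Increasing the
# recurrent excitatory coupling w_ee (a crude stand-in for connection density)
# carries the fixed point through a Hopf bifurcation, after which the module
# sustains oscillations. Parameter values are illustrative only.
w_ei, w_ie, P, Q = 12.0, 10.0, -1.0, -5.0
f = lambda x: 1.0 / (1.0 + np.exp(-x))        # logistic rate function

def oscillation_amplitude(w_ee, dt=0.01, t_max=200.0):
    """Integrate the rates and return the std of E(t) after the transient."""
    n = int(t_max / dt)
    E, I = 0.1, 0.1
    trace = np.empty(n)
    for i in range(n):
        dE = -E + f(w_ee * E - w_ei * I + P)
        dI = -I + f(w_ie * E + Q)
        E += dt * dE
        I += dt * dI
        trace[i] = E
    return trace[n // 2:].std()               # ~0 below the bifurcation, finite above

for w_ee in np.linspace(6.0, 14.0, 9):
    print(f"w_ee = {w_ee:4.1f}   oscillation amplitude = {oscillation_amplitude(w_ee):.3f}")
```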
Eve Armstrong, 2017
Central pattern generators (CPGs) appear to have evolved multiple times throughout the animal kingdom, indicating that their design imparts a significant evolutionary advantage. Insight into how this design is achieved is hindered by the difficulty inherent in examining relationships among electrophysiological properties of the constituent cells of a CPG and their functional connectivity. That is: experimentally it is challenging to estimate the values of more than two or three of these properties simultaneously. We employ a method of statistical data assimilation (D.A.) to estimate the synaptic weights, synaptic reversal potentials, and maximum ion-channel conductances of the constituent neurons in a multi-modal network model. We then use these estimates to predict the functional mode of activity that the network is expressing. The measurements used are the membrane voltage time series of all neurons in the circuit. We find that these measurements provide sufficient information to yield accurate predictions of the network's associated electrical activity. This experiment can be applied directly in a real laboratory using intracellular recordings from a known biological CPG whose structural mapping is known and which can be completely isolated from the animal. The simulated results in this paper suggest that D.A. might provide a tool for simultaneously estimating tens to hundreds of CPG properties, thereby offering the opportunity to seek possible systematic relationships among these properties and the emergent electrical activity.
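The data-assimilation procedure itself is not spelled out in the abstract. As a toy illustration of the underlying idea, namely recovering model parameters from a membrane-voltage time series, the sketch below fits a passive-membrane model to a synthetic noisy recording by least squares; it is not the variational method of the paper, and all parameter values are made up for the example.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy illustration of parameter estimation from a voltage trace: recover the
# membrane capacitance, leak conductance, and leak reversal of a passive model
# from a synthetic "recording". Not the paper's data-assimilation method.
rng = np.random.default_rng(0)
dt, t_max = 0.1, 300.0                                   # ms
t = np.arange(0.0, t_max, dt)
I_ext = np.where((t > 50.0) & (t < 200.0), 200.0, 0.0)   # step current, pA

def simulate(params):
    """Euler integration of C dV/dt = -g_L (V - E_L) + I(t)."""
    C, g_L, E_L = params
    V = np.empty_like(t)
    V[0] = E_L
    for i in range(1, len(t)):
        dV = (-g_L * (V[i - 1] - E_L) + I_ext[i - 1]) / C
        V[i] = V[i - 1] + dt * dV
    return V

true_params = np.array([200.0, 10.0, -65.0])             # pF, nS, mV
data = simulate(true_params) + rng.normal(0.0, 0.3, size=len(t))

# Estimate the parameters by minimising the residual between model and data.
fit = least_squares(lambda p: simulate(p) - data,
                    x0=[100.0, 5.0, -60.0],
                    bounds=([1.0, 0.1, -100.0], [1000.0, 100.0, 0.0]))
print("estimated C, g_L, E_L:", fit.x)
```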
In this work, we study the dynamic range in a neuronal network modelled by a cellular automaton. We consider deterministic and non-deterministic rules to simulate electrical and chemical synapses. Chemical synapses have an intrinsic time delay and are susceptible to parameter variations guided by Hebbian learning rules. Our results show that chemical synapses can abruptly enhance the sensitivity of the neural network, an effect that becomes even more pronounced when learning rules are applied to the chemical synapses.
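The automaton rules used in that study are not given in the abstract. The sketch below uses a generic Kinouchi-Copelli-style excitable cellular automaton on a random graph to show how a dynamic range is typically measured, as the decibel width of the stimulus interval mapped onto 10-90% of the response curve; the update rule, network, and parameters are assumptions for illustration, not that paper's model.

```python
import numpy as np

# Generic excitable cellular automaton on a random graph and the standard
# dynamic-range measurement. Parameters and rules are illustrative only.
rng = np.random.default_rng(0)
N, k, p_transmit, n_states = 500, 10, 0.1, 3     # neurons, out-degree, coupling, cycle length

# Random directed adjacency list with out-degree k per neuron.
adj = [rng.choice(N, size=k, replace=False) for _ in range(N)]

def mean_activity(h, steps=1000, dt=1.0):
    """Fraction of active neurons per step under Poisson external drive of rate h (per ms)."""
    state = np.zeros(N, dtype=int)               # 0 quiescent, 1 active, 2.. refractory
    active_count = 0
    for _ in range(steps):
        new = np.where(state > 0, (state + 1) % n_states, 0)   # active -> refractory -> rest
        drive = rng.random(N) < (1.0 - np.exp(-h * dt))        # external stimulus
        for i in np.flatnonzero(state == 1):                   # synaptic transmission
            hit = adj[i][rng.random(k) < p_transmit]
            drive[hit] = True
        new[(state == 0) & drive] = 1
        state = new
        active_count += np.count_nonzero(state == 1)
    return active_count / (steps * N)

stimuli = np.logspace(-4, 1, 12)                 # external rates to probe
F = np.array([mean_activity(h) for h in stimuli])

# Dynamic range: decibel width of the stimulus interval mapped onto 10%-90% of the
# response (assumes F increases monotonically with the stimulus).
F10 = F.min() + 0.1 * (F.max() - F.min())
F90 = F.min() + 0.9 * (F.max() - F.min())
h10, h90 = np.interp(F10, F, stimuli), np.interp(F90, F, stimuli)
print(f"dynamic range = {10 * np.log10(h90 / h10):.1f} dB")
```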
Computers have been endowed with a measure of human-like intelligence owing to the rapid development of artificial intelligence technology, represented by neural networks. Facing the challenge of making machines more imaginative, we consider a quantum stochastic neural network (QSNN) and propose a learning algorithm to update the parameters governing the network evolution. The QSNN can be applied to a class of classification problems; we investigate its performance in sentence classification and find that the coherent part of the quantum evolution can accelerate training and improve the accuracy of verse recognition, which can be regarded as a quantum-enhanced associative memory. In addition, the coherent QSNN is found to be more robust against both label noise and device noise, making it a more suitable option for practical implementation.
Noninvasive medical neuroimaging has yielded many discoveries about brain connectivity. Several substantial techniques mapping morphological, structural, and functional brain connectivities have been developed to create a comprehensive road map of neuronal activities in the human brain, namely the brain graph. Owing to its suitability for this non-Euclidean data type, the graph neural network (GNN) provides a clever way of learning deep graph structure, and it is rapidly becoming the state of the art, leading to enhanced performance in various network neuroscience tasks. Here we review current GNN-based methods, highlighting the ways they have been used in several applications related to brain graphs, such as missing brain graph synthesis and disease classification. We conclude by charting a path toward better application of GNN models in the network neuroscience field for neurological disorder diagnosis and population graph integration. The list of papers cited in our work is available at https://github.com/basiralab/GNNs-in-Network-Neuroscience.
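For readers unfamiliar with the building block that most of these methods share, the sketch below implements a single graph-convolution layer of the Kipf-Welling type in plain NumPy, applied to a random stand-in for a brain connectivity matrix; the region count, feature sizes, and weights are arbitrary placeholders, not taken from the review.

```python
import numpy as np

# One graph-convolution layer (Kipf-Welling propagation rule) on a toy
# "brain graph". In practice the adjacency matrix would come from structural
# or functional neuroimaging; here it is random for illustration.
rng = np.random.default_rng(0)
n_rois, in_dim, hid_dim = 68, 16, 32           # e.g. 68 cortical regions of interest

A = rng.random((n_rois, n_rois))
A = (A + A.T) / 2                               # symmetric "connectivity" matrix
X = rng.standard_normal((n_rois, in_dim))       # node features per region

def gcn_layer(A, H, W):
    """H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)      # ReLU

W1 = rng.standard_normal((in_dim, hid_dim)) * 0.1
H1 = gcn_layer(A, X, W1)                        # node embeddings, shape (68, 32)
print(H1.shape)
```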