
Building a Dynamical Network Model from Neural Spiking Data: Application of Poisson Likelihood

Published by: R. Ozgur Doruk
Publication date: 2017
Research field: Biology
Paper language: English





Research has shown that the information transmitted by biological neurons is encoded in the timing of successive action potentials or in their firing rate. In addition, in-vivo operation of the neuron makes measurement difficult, so continuous data collection is restricted. For these reasons, the classical mean-square estimation techniques frequently used in neural network training are very difficult to apply. In such situations, point processes and the related likelihood methods may be beneficial. In this study, we present how such methods can be applied to use the stimulus-response data obtained from a neural process in the mathematical modeling of a neuron. The study is theoretical in nature and is supported by simulations; in addition, it is compared to a similar study performed on the same network model.
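As a rough illustration of the Poisson-likelihood idea, the sketch below fits a toy rate model to binned spike counts by maximizing the Poisson log-likelihood of stimulus-response data. The one-dimensional sinusoidal stimulus, the exponential link, and the parameters w and b are illustrative assumptions, not the network model used in the paper.

```python
# A minimal sketch (not the paper's exact model): fit a simple rate model to
# binned spike counts by maximizing a Poisson likelihood. The stimulus, the
# exponential link, and the parameter names are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# --- synthetic stimulus-response data --------------------------------------
dt = 0.001                                   # bin width in seconds
T = 20.0                                     # recording length in seconds
t = np.arange(0.0, T, dt)
stim = np.sin(2 * np.pi * 1.5 * t)           # assumed 1-D stimulus waveform

w_true, b_true = 2.0, np.log(20.0)           # "true" parameters for simulation
rate_true = np.exp(w_true * stim + b_true)   # firing rate in Hz
counts = rng.poisson(rate_true * dt)         # spike counts per bin

# --- Poisson log-likelihood of the binned spike train ----------------------
def neg_log_lik(theta):
    w, b = theta
    lam = np.exp(w * stim + b) * dt          # expected count per bin
    # Poisson log-likelihood, dropping the constant log(y!) term
    return -np.sum(counts * np.log(lam) - lam)

theta0 = np.array([0.0, 0.0])
res = minimize(neg_log_lik, theta0, method="L-BFGS-B")
print("estimated (w, b):", res.x, " true:", (w_true, b_true))
```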


Read also

We address the problem of efficiently and informatively quantifying how multiplets of variables carry information about the future of the dynamical system they belong to. In particular we want to identify groups of variables carrying redundant or synergistic information, and track how the size and the composition of these multiplets change as the collective behavior of the system evolves. In order to afford a parsimonious expansion of shared information, and at the same time control for lagged interactions and common effect, we develop a dynamical, conditioned version of the O-information, a framework recently proposed to quantify high-order interdependencies via multivariate extension of the mutual information. We thus obtain an expansion of the transfer entropy in which synergistic and redundant effects are separated. We apply this framework to a dataset of spiking neurons from a monkey performing a perceptual discrimination task. The method identifies synergistic multiplets that include neurons previously categorized as containing little relevant information individually.
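For orientation, the following sketch computes the basic (static) O-information of a set of discrete variables from plug-in entropy estimates; the paper works with a dynamical, conditioned extension of this quantity, which this toy does not implement.

```python
# A minimal sketch of the static O-information for discrete variables,
# Omega = (n-2) H(X) + sum_i [H(X_i) - H(X_without_i)],
# estimated with plug-in entropies. Not the dynamical, conditioned version.
import numpy as np
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) of the rows of a 2-D integer array."""
    counts = Counter(map(tuple, samples))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def o_information(X):
    """X: array of shape (n_samples, n_variables) with discrete entries."""
    n = X.shape[1]
    omega = (n - 2) * entropy(X)
    for i in range(n):
        rest = np.delete(X, i, axis=1)
        omega += entropy(X[:, [i]]) - entropy(rest)
    return omega

# Toy example: three redundant binary variables (copies of each other)
rng = np.random.default_rng(1)
z = rng.integers(0, 2, size=(5000, 1))
X_redundant = np.hstack([z, z, z])
print("O-information (redundant triplet):", o_information(X_redundant))  # > 0
```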
As the limits of traditional von Neumann computing come into view, the brain's ability to communicate vast quantities of information using low-power spikes has become an increasing source of inspiration for alternative architectures. Key to the success of these large-scale neural networks is a power-efficient spiking element that is scalable and easily interfaced with traditional control electronics. In this work, we present a spiking element fabricated from superconducting nanowires that has pulse energies on the order of ~10 aJ. We demonstrate that the device reproduces essential characteristics of biological neurons, such as a refractory period and a firing threshold. Through simulations using experimentally measured device parameters, we show how nanowire-based networks may be used for inference in image recognition, and that the probabilistic nature of nanowire switching may be exploited for modeling biological processes and for applications that rely on stochasticity.
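As a loose illustration only, the toy below simulates a generic spiking element with a firing threshold, a refractory period, and probabilistic switching near threshold; it is not a model of the superconducting nanowire device, and every parameter value is invented.

```python
# Toy spiking element: leaky integration, firing threshold, refractory period,
# and probabilistic switching near threshold. Illustrative only.
import numpy as np

rng = np.random.default_rng(2)

dt = 0.1            # time step (arbitrary units)
tau = 5.0           # integration time constant
threshold = 1.0     # nominal firing threshold
refractory = 2.0    # refractory period after each spike
noise = 0.05        # width of the stochastic switching region
drive = 1.2         # constant input drive (above threshold at steady state)

v, last_spike, spike_times = 0.0, -np.inf, []

for step in range(5000):
    t = step * dt
    if t - last_spike < refractory:
        v = 0.0                                  # held at reset while refractory
        continue
    v += dt / tau * (drive - v)                  # leaky integration of the drive
    # probabilistic switching: firing probability rises sharply near threshold
    p_fire = dt / (1.0 + np.exp(-(v - threshold) / noise))
    if rng.random() < p_fire:
        spike_times.append(t)
        last_spike = t
        v = 0.0

print(len(spike_times), "spikes; first few:", [round(s, 1) for s in spike_times[:5]])
```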
We extend the scope of the dynamical theory of extreme values to cover phenomena that do not happen instantaneously, but evolve over a finite, albeit unknown at the onset, time interval. We consider complex dynamical systems, composed of many individual subsystems linked by a network of interactions. As a specific example of the general theory, a model of neural network, introduced to describe the electrical activity of the cerebral cortex, is analyzed in detail: on the basis of this analysis we propose a novel definition of neuronal cascade, a physiological phenomenon of primary importance. We derive extreme value laws for the statistics of these cascades, both from the point of view of exceedances (that satisfy critical scaling theory) and of block maxima.
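For reference, the two classical extreme-value viewpoints mentioned here, block maxima and threshold exceedances, can be sketched as follows; the data are plain i.i.d. draws rather than neuronal cascades, and the scipy-based fits are only meant to show the mechanics.

```python
# Block maxima (fit with a generalized extreme value law) versus threshold
# exceedances (fit with a generalized Pareto law) on toy i.i.d. data.
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=100_000)   # toy "activity" time series

# Block maxima: split the series into blocks and keep each block's maximum
block = 1000
maxima = x[: len(x) // block * block].reshape(-1, block).max(axis=1)
c, loc, scale = genextreme.fit(maxima)
print("GEV fit to block maxima: shape=%.3f loc=%.3f scale=%.3f" % (c, loc, scale))

# Exceedances: keep the amounts by which the series exceeds a high threshold
u = np.quantile(x, 0.99)
excesses = x[x > u] - u
c2, loc2, scale2 = genpareto.fit(excesses, floc=0.0)
print("GPD fit to exceedances:  shape=%.3f scale=%.3f" % (c2, scale2))
```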
For the nervous system to work at all, a delicate balance of excitation and inhibition must be achieved. However, when such a balance is sought by global strategies, only few modes remain balanced close to instability, and all other modes are strongly stable. Here we present a simple model of neural tissue in which this balance is sought locally by neurons following `anti-Hebbian' behavior: all degrees of freedom achieve a close balance of excitation and inhibition and become critical in the dynamical sense. At long timescales, the modes of our model oscillate around the instability line, so an extremely complex breakout dynamics ensues in which different modes of the system oscillate between prominence and extinction. We show the system develops various anomalous statistical behaviours and hence becomes self-organized critical in the statistical sense.
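As a minimal illustration of a local anti-Hebbian rule (a generic textbook rule, not the specific neural-tissue model analyzed in the paper), the toy below lets a lateral weight between two units adapt until their output correlation is cancelled, i.e. until the pair is locally balanced.

```python
# Anti-Hebbian decorrelation toy: the lateral inhibitory weight keeps growing
# while the two outputs still co-fluctuate and stops once <y1*y2> ~ 0.
import numpy as np

rng = np.random.default_rng(4)
eta = 0.01              # learning rate
w = 0.0                 # strength of the lateral inhibition from unit 1 to unit 2

for step in range(20_000):
    x = rng.normal(size=2)
    x[1] += 0.8 * x[0]          # correlated inputs
    y1 = x[0]
    y2 = x[1] - w * y1          # unit 2 is inhibited through the lateral weight
    # anti-Hebbian on the signed connection (-w): its change is -eta * y1 * y2
    w += eta * y1 * y2
    # drift is eta * (0.8 - w), so w settles where the outputs are decorrelated

print("learned lateral weight:", round(w, 3), "(target ~ 0.8)")
```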
Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50–2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics like finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly simulate a model of a local cortical microcircuit consisting of eight neuron types. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.
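The following sketch simulates a finite-size population of identical neurons whose escape rate depends on the time since their last spike, and reads off the fluctuating population activity; the exponential-recovery hazard and all parameter values are illustrative assumptions, not the paper's mesoscopic equations.

```python
# Finite-size population toy: N neurons with a spike-history-dependent escape
# rate (refractoriness), population activity A_N(t) = spikes per step / (N*dt).
import numpy as np

rng = np.random.default_rng(5)

N = 500                 # population size (the abstract considers 50 to 2000)
dt = 0.001              # time step in seconds
steps = 5000
rho0 = 30.0             # asymptotic firing rate in Hz
t_ref = 0.004           # absolute refractory period in seconds
tau_rec = 0.020         # recovery time constant of the hazard

def hazard(age):
    """Escape rate (Hz) as a function of time since the last spike."""
    h = rho0 * (1.0 - np.exp(-(age - t_ref) / tau_rec))
    return np.where(age < t_ref, 0.0, np.clip(h, 0.0, None))

age = rng.uniform(0.0, 0.1, size=N)   # time since last spike, per neuron
activity = np.empty(steps)

for k in range(steps):
    p_spike = 1.0 - np.exp(-hazard(age) * dt)   # per-neuron spike probability
    fired = rng.random(N) < p_spike
    activity[k] = fired.sum() / (N * dt)        # population activity in Hz
    age = np.where(fired, 0.0, age + dt)

print("mean population rate: %.1f Hz, std: %.1f Hz" % (activity.mean(), activity.std()))
```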