
Surround Inhibition Mechanism by Deep Learning

Added by Mamoru Sugamoto
Publication date: 2019
Fields: Biology, Physics
Language: English





In the sensation of tones, visual stimuli, and other inputs, the surround inhibition mechanism (also called lateral inhibition) is crucial. The mechanism enhances the signal of the strongest tone, color, or other stimulus by reducing and inhibiting the surrounding signals, which carry less important information. This surround inhibition mechanism is well studied in the physiology of sensory systems. A neural network with two hidden layers in addition to the input and output layers is constructed, with 60 neurons (units) in each of the four layers. The label (correct answer) is prepared from an input signal by applying the Hartline mechanism seven times, that is, by sending inhibitory signals from the neighboring neurons and then amplifying all the signals. The implications obtained by deep learning with this neural network are compared with the standard physiological understanding of the surround inhibition mechanism.
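As a concrete illustration of the label-generation step, the sketch below applies a Hartline-style lateral-inhibition update seven times to a 60-unit input signal: each unit is inhibited by its immediate neighbours and all signals are then rescaled ("amplified"). The neighbourhood size, inhibition weight, and renormalisation rule are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

N_UNITS = 60          # 60 neurons (units) per layer, as described above
N_STEPS = 7           # seven applications of the inhibition step

def hartline_step(signal, inh_weight=0.2):
    """One lateral-inhibition step: subtract a fraction of each neighbour's
    activity, clip negatives, then amplify so total activity is preserved
    (an assumed amplification rule)."""
    inhibited = signal.copy()
    inhibited[1:]  -= inh_weight * signal[:-1]   # inhibition from left neighbour
    inhibited[:-1] -= inh_weight * signal[1:]    # inhibition from right neighbour
    inhibited = np.clip(inhibited, 0.0, None)
    total = inhibited.sum()
    return inhibited * (signal.sum() / total) if total > 0 else inhibited

def make_label(signal):
    """Label (correct answer) obtained from an input by seven Hartline steps."""
    out = signal.astype(float)
    for _ in range(N_STEPS):
        out = hartline_step(out)
    return out

# Example: a broad bump of activity becomes sharper after seven steps,
# mimicking the enhancement of the strongest stimulus.
x = np.exp(-0.5 * ((np.arange(N_UNITS) - 30) / 8.0) ** 2)
y = make_label(x)
print(np.argmax(y), round(y.max() / x.max(), 2))
```

The resulting label sharpens the strongest peak of the input; this is the target that the 60-60-60-60 fully connected network described above would be trained to reproduce in any standard deep learning framework.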



Related research

The estimation of causal network architectures in the brain is fundamental for understanding cognitive information processes. However, access to the dynamic processes underlying cognition is limited to indirect measurements of the hidden neuronal activity, for instance through fMRI data. Thus, estimating the network structure of the underlying process is challenging. In this article, we embed an adaptive importance sampler called the Adaptive Path Integral Smoother (APIS) into the Expectation-Maximization algorithm to obtain point estimates of causal connectivity. We demonstrate on synthetic data that this procedure finds not only the correct network structure but also the direction of effective connections from random initializations of the connectivity matrix. In addition, motivated by contradictory claims in the literature, we examine the effect of the neuronal timescale on the sensitivity of the BOLD signal to changes in the connectivity and on the maximum likelihood solutions of the connectivity. We conclude with two warnings: first, connectivity estimates obtained under the assumption of slow dynamics can be extremely biased if the data were generated by fast neuronal processes; second, the faster the timescale, the less sensitive the BOLD signal is to changes in the incoming connections to a node. Hence, connectivity estimation at realistic neuronal timescales requires extremely high-quality data and seems infeasible for many practical data sets.
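For orientation, the sketch below generates the kind of synthetic data this setting involves: hidden linear neuronal dynamics governed by a connectivity matrix A at timescale tau, observed through a slow hemodynamic convolution. The kernel shape and all parameter values are illustrative assumptions, and the APIS/EM estimation procedure itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_steps, dt = 3, 2000, 0.01           # step size in seconds (assumed)
tau = 0.1                                      # neuronal timescale in seconds (assumed)
A = np.array([[-1.0, 0.0, 0.0],                # illustrative effective connectivity:
              [ 0.8, -1.0, 0.0],               # node 0 -> node 1,
              [ 0.0, 0.9, -1.0]])              # node 1 -> node 2

# Euler-Maruyama integration of dx = (A x / tau) dt + sigma dW
sigma = 0.5
x = np.zeros((n_steps, n_nodes))
for t in range(1, n_steps):
    drift = (A @ x[t - 1]) / tau
    x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_nodes)

# Crude gamma-shaped hemodynamic kernel peaking near 5 s (an assumption,
# not the canonical double-gamma HRF used in fMRI software).
t_kernel = np.arange(0, 20, dt)
hrf = (t_kernel ** 5) * np.exp(-t_kernel)
hrf /= hrf.sum()

bold = np.column_stack([np.convolve(x[:, i], hrf)[:n_steps] for i in range(n_nodes)])
print(bold.shape)   # (2000, 3): slow BOLD-like observations of the fast hidden dynamics
```

Shrinking tau makes the hidden dynamics faster relative to the sluggish kernel, which is the regime where, as argued above, the BOLD-like observation becomes insensitive to changes in a node's incoming connections.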
Fast-spiking (FS) interneurons in the brain are self-innervated by powerful inhibitory GABAergic autaptic connections. Using computational modelling, we investigate how autaptic inhibition regulates the firing response of such interneurons. Our results indicate that autaptic inhibition both raises the current threshold for action potential generation and modulates the input-output gain of FS interneurons. The autaptic transmission delay is identified as a key parameter that controls the firing patterns and determines the multistability regions of FS interneurons. Furthermore, we observe that neuronal noise influences the firing regulation of FS interneurons by autaptic inhibition and extends their dynamic range for encoding inputs. Importantly, autaptic inhibition modulates noise-induced irregular firing of FS interneurons, such that coherent firing appears at an optimal level of autaptic inhibition. Our results reveal the functional roles of autaptic inhibition in taming the firing dynamics of FS interneurons.
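A minimal sketch of the ingredient discussed above, under strong simplifying assumptions: a leaky integrate-and-fire unit whose own spikes open a delayed inhibitory autapse. The parameter values (delay, weight, reversal potential, drive) are illustrative, not those of the paper's detailed FS-neuron model.

```python
import numpy as np

dt = 0.1                     # time step, ms
T = 500.0                    # total simulated time, ms
steps = int(T / dt)

# Leaky integrate-and-fire parameters (assumed values, mV and ms)
tau_m, v_rest, v_thresh, v_reset = 10.0, -65.0, -50.0, -65.0
I_ext = 20.0                 # constant drive, expressed in mV (R*I lumped together)

# Autapse parameters (assumed): the transmission delay is the key knob discussed above
aut_delay_ms, aut_weight, tau_syn, E_inh = 3.0, 1.0, 5.0, -80.0
delay_steps = int(aut_delay_ms / dt)

v = v_rest
g_aut = 0.0                                  # autaptic inhibitory conductance
spiked = np.zeros(steps, dtype=bool)
spike_times = []

for t in range(steps):
    # a spike emitted delay_steps ago opens the inhibitory autapse now
    if t >= delay_steps and spiked[t - delay_steps]:
        g_aut += aut_weight
    g_aut -= dt * g_aut / tau_syn            # exponential synaptic decay
    I_aut = g_aut * (E_inh - v)              # inhibitory current (negative while v > E_inh)
    v += dt * (-(v - v_rest) + I_ext + I_aut) / tau_m
    if v >= v_thresh:                        # threshold crossing -> emit a spike, then reset
        spiked[t] = True
        spike_times.append(t * dt)
        v = v_reset

rate = 1000.0 * len(spike_times) / T
print(f"{len(spike_times)} spikes, ~{rate:.0f} Hz with autaptic inhibition")
```

Sweeping aut_delay_ms and aut_weight in this toy model changes the firing rate and regularity, which is the qualitative effect of autaptic inhibition described above.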
Spike time response curves (STRCs) are used to study the influence of synaptic stimuli on the firing times of a neuronal oscillator without the assumption of weak coupling. They allow us to approximate the dynamics of the synchronous state in networks of neurons through a discrete map. Linearization about the fixed point of the discrete map can then be used to predict the stability of patterns of synchrony in the network. A general theory for taking into account the contribution of higher-order STRC terms in the approximation of the discrete map for coupled neuronal oscillators in synchrony is still lacking. Here we present a general framework to account for higher-order STRC corrections in the approximation of the discrete map and to determine the domain of the 1:1 phase-locked state in a network of two interacting neurons. We begin by demonstrating that the effect of synaptic stimuli delivered through a shunting synapse to a neuron firing in the gamma frequency band (20-80 Hz) lasts for three consecutive firing cycles. We then show that the discrete map derived by taking into account the higher-order STRC contributions successfully predicts the domain of the synchronous 1:1 phase-locked state in a network of two heterogeneous interneurons coupled through a shunting synapse.
Most information dynamics and statistical causal analysis frameworks rely on the common intuition that causal interactions are intrinsically pairwise: every cause variable has an associated effect variable, so that a causal arrow can be drawn between them. However, analyses that depict interdependencies as directed graphs fail to discriminate the rich variety of modes of information flow that can coexist within a system. This, in turn, creates problems for attempts to operationalise the concepts of dynamical complexity or 'integrated information'. To address this shortcoming, we combine concepts of partial information decomposition and integrated information, and obtain what we call Integrated Information Decomposition, or $\Phi$ID. We show how $\Phi$ID paves the way for more detailed analyses of interdependencies in multivariate time series, and sheds light on collective modes of information dynamics that have not been reported before. Additionally, $\Phi$ID reveals that what is typically referred to as integration is actually an aggregate of several heterogeneous phenomena. Furthermore, $\Phi$ID can be used to formulate new, tailored measures of integrated information, as well as to understand and alleviate the limitations of existing measures.
We assess electrical brain dynamics before, during, and after one hundred human epileptic seizures with different anatomical onset locations, using statistical and spectral properties of functionally defined networks. We observe a concave-like temporal evolution of the characteristic path length and cluster coefficient, indicative of a movement from a more random toward a more regular and then back toward a more random functional topology. Surprisingly, synchronizability was significantly decreased during the seizure state but increased already prior to seizure end. Our findings underline the high relevance of studying complex systems from the viewpoint of complex networks, which may help to gain deeper insights into the complicated dynamics underlying epileptic seizures.
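The three network quantities mentioned above can be computed from a functional network with standard graph tools. The sketch below uses networkx on a stand-in small-world graph rather than an EEG-derived network, and takes the Laplacian eigenvalue ratio lambda_2 / lambda_max as the synchronizability measure (one common convention, not necessarily the one used in the study).

```python
import networkx as nx
import numpy as np

# Stand-in functional network; the study derives its networks from seizure recordings.
G = nx.connected_watts_strogatz_graph(n=60, k=6, p=0.3, seed=1)

path_length = nx.average_shortest_path_length(G)   # characteristic path length
clustering = nx.average_clustering(G)               # mean cluster coefficient

# Synchronizability from the graph Laplacian spectrum:
# lambda_2 / lambda_max (larger = easier to synchronize, under this convention).
L = nx.laplacian_matrix(G).toarray().astype(float)
eigvals = np.sort(np.linalg.eigvalsh(L))
synchronizability = eigvals[1] / eigvals[-1]

print(f"L = {path_length:.2f}, C = {clustering:.2f}, S = {synchronizability:.3f}")
```

Tracking these quantities in sliding windows across a seizure recording would yield the kind of temporal evolution reported above.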
