
Fluctuation-response Relation Unifies Dynamical Behaviors in Neural Fields

Posted by C.C. Alan Fung
Publication date: 2014
Research field: Physics
Paper language: English





Anticipation is a strategy used by neural fields to compensate for transmission and processing delays during the tracking of dynamical information, and can be achieved by slow, localized, inhibitory feedback mechanisms such as short-term synaptic depression, spike-frequency adaptation, or inhibitory feedback from other layers. Based on the translational symmetry of the mobile network states, we derive generic fluctuation-response relations, providing unified predictions that link their tracking behaviors in the presence of external stimuli to the intrinsic dynamics of the neural fields in their absence.
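
To make the setting concrete, below is a minimal simulation sketch, assuming a one-dimensional continuous attractor neural field on a ring with divisive global inhibition and short-term synaptic depression; all parameter names and values (a, k, beta, tau_d, v, ...) are illustrative choices, not those of the paper. Depending on the depression strength and stimulus speed, the bump tracking a moving stimulus can lead (anticipate) or lag it.

import numpy as np

N = 128
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N

def ring_dist(d):
    # signed distance on the ring, wrapped to [-pi, pi)
    return (d + np.pi) % (2 * np.pi) - np.pi

a = 0.5                                  # width of coupling / bump
J = np.exp(-0.5 * (ring_dist(np.subtract.outer(x, x)) / a) ** 2)

tau, tau_d, beta = 1.0, 50.0, 0.01       # membrane and depression time scales
k, A, v, dt = 0.05, 0.5, 0.002, 0.1      # inhibition, stimulus amplitude/speed, step

U = np.exp(-0.5 * (x / a) ** 2)          # bump initially centered at x = 0
p = np.ones(N)                           # synaptic resources (1 = fully recovered)
z = 0.0                                  # stimulus position

for _ in range(20000):
    r = np.maximum(U, 0.0) ** 2
    r /= 1.0 + k * r.sum() * dx          # divisive global inhibition
    I_ext = A * np.exp(-0.5 * (ring_dist(x - z) / a) ** 2)
    U += dt / tau * (-U + J @ (p * r) * dx + I_ext)
    p += dt * ((1.0 - p) / tau_d - beta * p * r)
    z = ring_dist(z + v * dt)            # stimulus moves at constant speed

# population-vector decode of the bump center
bump = np.angle(np.sum(np.maximum(U, 0.0) ** 2 * np.exp(1j * x)))
print("displacement (bump - stimulus): %+.4f rad" % ring_dist(bump - z))

Sweeping beta and v and recording this displacement is the kind of tracking experiment that the fluctuation-response relations are meant to predict from the intrinsic, stimulus-free dynamics.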




Read also

We discuss fluctuation-induced forces in a system described by a continuous Landau-Ginzburg model with a quenched disorder field, defined in a $d$-dimensional slab geometry $\mathbb{R}^{d-1}\times[0,L]$. A series representation for the quenched free energy in terms of the moments of the partition function is presented. In each moment an order-parameter-like quantity can be defined, with a particular correlation length of the fluctuations. For a specific strength of the non-thermal control parameter, there appears a moment of the partition function in which the fluctuations associated with the order-parameter-like quantity become long-ranged. In this situation, these fluctuations become sensitive to the boundaries. In the Gaussian approximation, using the spectral zeta-function method, we evaluate a functional determinant for each moment of the partition function. The analytic structure of each spectral zeta-function, depending on the dimension of the space, is discussed in a unified way for the Dirichlet and Neumann Laplacians as well as for periodic boundary conditions. Considering the moment of the partition function with the largest correlation length of the fluctuations, we evaluate the induced force between the boundaries for Dirichlet boundary conditions. We prove that the sign of the fluctuation-induced force in this case depends in a non-trivial way on the strength of the non-thermal control parameter.
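
As a pointer to how the spectral zeta-function step works, here is the textbook version for a single Gaussian mode confined to the slab with Dirichlet walls, where $m$ denotes the mode's mass (the inverse correlation length of the given moment; the symbols $m$ and $A$ are our illustrative notation, not necessarily the paper's):

$$\zeta(s) = \sum_{n=1}^{\infty} \int \frac{d^{d-1}k}{(2\pi)^{d-1}} \left[ k^{2} + \left(\frac{n\pi}{L}\right)^{2} + m^{2} \right]^{-s}, \qquad \frac{F}{A} = -\frac{1}{2}\,\zeta'(0), \qquad \frac{f}{A} = -\frac{\partial}{\partial L}\frac{F}{A}.$$

Here $F/A$ is the free energy per unit transverse area, obtained from the zeta-regularized functional determinant via $\ln\det(-\Delta + m^{2}) = -\zeta'(0)$, and $f$ is the induced force between the walls. In the long-ranged limit $m \to 0$ the $L$-dependent part of $F/A$ scales as $L^{-(d-1)}$, giving a Casimir-like force $f \propto L^{-d}$; a finite $m$ suppresses the force exponentially in $mL$, which is why only the moment with the largest correlation length is sensitive to the boundaries.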
Neural networks are able to extract information from the timing of spikes. Here we provide new results on the behavior of the simplest neuronal model able to decode information embedded in temporal spike patterns, the so-called tempotron. Using statistical-physics techniques we compute the capacity for the case of sparse, time-discretized input and discrete synapses, showing that the device saturates the information-theoretic bounds, with output-spike statistics consistent with those of the inputs. We also derive two simple and highly efficient learning algorithms which are able to learn a number of associations close to the theoretical limit.
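
For concreteness, here is a minimal sketch of the tempotron's decision rule, with an assumed double-exponential postsynaptic kernel and made-up pattern statistics (the paper's actual settings may differ): classification depends only on whether the peak of the summed postsynaptic potential crosses threshold.

import numpy as np

def psp_kernel(t, tau_m=15.0, tau_s=3.75):
    # double-exponential postsynaptic potential, zero for t <= 0 (times in ms)
    t = np.maximum(np.asarray(t, dtype=float), 0.0)
    return np.exp(-t / tau_m) - np.exp(-t / tau_s)

def tempotron_fires(spike_times, w, theta=1.0, T=500.0, dt=0.5):
    # spike_times: one array of spike times per afferent; w: synaptic weights
    ts = np.arange(0.0, T, dt)
    V = np.zeros_like(ts)
    for w_i, times_i in zip(w, spike_times):
        for t_sp in times_i:
            V += w_i * psp_kernel(ts - t_sp)
    return V.max() > theta               # fire iff the peak potential crosses theta

rng = np.random.default_rng(0)
n_afferents = 20
pattern = [rng.uniform(0, 500, size=rng.poisson(3)) for _ in range(n_afferents)]
w = rng.normal(0.0, 0.1, size=n_afferents)
print(tempotron_fires(pattern, w))

A learning rule then adjusts w on error trials; the capacity computed in the abstract counts how many such input-output associations the device can satisfy.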
We study by numerical simulation the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons, as a function of the network's level of dilution and asymmetry. The network dilution measures the fraction of neuron pairs that are connected, and the network asymmetry measures to what extent the underlying connectivity matrix is asymmetric. For each given neural network, we study the dynamical evolution of all the different initial conditions, thus characterizing the full dynamical landscape without imposing any learning rule. Because the dynamics is deterministic, each trajectory converges to an attractor, which can be either a fixed point or a limit cycle. These attractors form the set of all the possible limit behaviors of the neural network. For each network, we then determine the convergence times, the limit-cycle lengths, the number of attractors, and the sizes of the attractors' basins. We show that there are two network structures that maximize the number of possible limit behaviors. The first optimal network structure is fully connected and symmetric; the second, in contrast, is highly sparse and asymmetric. The latter is similar to what is observed in various biological neuronal circuits. These observations lead us to hypothesize that, independently of any given learning model, an efficient and effective biological network that stores a number of limit behaviors close to its maximum capacity tends to develop a connectivity structure similar to one of the optimal networks we found.
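
The exhaustive enumeration described above is straightforward to code; below is a minimal sketch for a toy network (the size N = 8, Gaussian couplings, the dilution level c, and the sign(0) convention are illustrative assumptions, not the paper's settings):

import numpy as np
from itertools import product

N, c = 8, 0.5                            # toy size; c = fraction of connected couples
rng = np.random.default_rng(1)
W = rng.normal(size=(N, N))
np.fill_diagonal(W, 0.0)
W *= rng.random((N, N)) < c              # dilution; W is generically asymmetric
# (use W = (W + W.T) / 2 to probe the symmetric case)

def step(s):
    # synchronous deterministic update of +/-1 neurons
    return np.where(W @ s >= 0, 1, -1)

attractors = set()
for bits in product((-1, 1), repeat=N):  # all 2**N initial conditions
    s = np.array(bits)
    seen, t = {}, 0
    while tuple(s) not in seen:          # deterministic, so a repeat closes a cycle
        seen[tuple(s)] = t
        s = step(s)
        t += 1
    start = seen[tuple(s)]               # convergence time; t - start = cycle length
    cycle = [state for state, ti in seen.items() if ti >= start]
    attractors.add(min(cycle))           # canonical label: smallest state in the cycle

print("number of attractors:", len(attractors))

Convergence times, cycle lengths, and basin sizes all fall out of the same loop; scanning the dilution c and the degree of symmetry of W yields the landscape statistics the abstract compares.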
D. Bolle, T. Verbeiren (1999)
The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.
We systematically study and compare damage spreading in random Boolean and threshold networks under small external perturbations (damage), a problem relevant to many biological networks. We identify a new characteristic connectivity $K_s$ at which the average number of damaged nodes after a large number of dynamical updates is independent of the total number of nodes $N$. We estimate the critical connectivity for finite $N$ and show that it systematically deviates from the annealed approximation. Extending the approach followed in a previous study, we present new results indicating that internal dynamical correlations tend to increase the probability not only of small but also of very large damage events, leading to a broad, fat-tailed distribution of damage sizes. These findings indicate that the descriptive and predictive value of averaged order parameters for finite-size networks - even for biologically highly relevant sizes up to several thousand nodes - is limited.
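
A hedged sketch of the damage-spreading measurement itself (random threshold network with +/-1 couplings, threshold zero, and synchronous updates are our illustrative choices): flip one node in a replica of the state and track the Hamming distance between the two trajectories.

import numpy as np

def make_network(N, K, rng):
    # each node reads K randomly chosen inputs with random +/-1 couplings
    inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
    weights = rng.choice([-1, 1], size=(N, K))
    return inputs, weights

def step(s, inputs, weights):
    # synchronous threshold update: sign of the weighted input sum
    h = np.sum(weights * s[inputs], axis=1)
    return np.where(h >= 0, 1, -1)

rng = np.random.default_rng(0)
N, K, T = 1000, 3, 200
inputs, weights = make_network(N, K, rng)

s = rng.choice([-1, 1], size=N)
s_damaged = s.copy()
s_damaged[0] *= -1                       # one-node perturbation (the "damage")

for _ in range(T):
    s = step(s, inputs, weights)
    s_damaged = step(s_damaged, inputs, weights)

print("damage after %d steps: %d nodes" % (T, np.sum(s != s_damaged)))

Averaging the final Hamming distance over many random networks and perturbations, as a function of K and N, is how quantities such as the characteristic connectivity $K_s$ and the damage-size distribution are estimated.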