
Retrieval and Chaos in Extremely Diluted Non-Monotonic Neural Networks

Publication date: 2002
Fields: Physics, Biology
Language: English





We discuss in this paper the dynamical properties of extremely diluted, non-monotonic neural networks. Assuming parallel updating and the Hebb prescription for the synaptic connections, a flow equation for the macroscopic overlap is derived. A rich dynamical phase diagram is obtained, showing a stable retrieval phase as well as period-two cycles and chaotic behavior. Numerical simulations are performed and show good agreement with the analytical results. Furthermore, the simulations give additional insight into the microscopic dynamical behavior during the chaotic phase: the freezing of individual neuron states is shown to be related to the structure of the chaotic attractors.
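As a rough illustration of the kind of parallel dynamics discussed above, the sketch below iterates a mean-field overlap map of the standard Derrida-Gardner-Zippelius form for an extremely diluted network with Hebbian couplings. The reversed-wedge transfer function, the parameter values, and the Monte Carlo averaging are illustrative assumptions, not the exact equations of the paper.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's exact equations): the standard
# mean-field flow equation for the macroscopic overlap m_t in an extremely
# diluted network with parallel updates and Hebbian couplings,
#     m_{t+1} = < F(m_t + sqrt(alpha) * z) >_z ,   z ~ N(0, 1),
# evaluated here by Monte Carlo. F is an assumed non-monotonic
# ("reversed-wedge") transfer function and alpha the storage ratio.

rng = np.random.default_rng(0)

def F(h, theta):
    """Non-monotonic transfer: sign(h) for |h| < theta, reversed sign beyond."""
    return np.where(np.abs(h) < theta, np.sign(h), -np.sign(h))

def overlap_step(m, alpha, theta, n_samples=200_000):
    """One parallel-update step of the overlap, averaged over the Gaussian crosstalk noise."""
    z = rng.standard_normal(n_samples)
    return float(F(m + np.sqrt(alpha) * z, theta).mean())

# Iterate the map: a fixed point with large m signals retrieval, a stable
# alternation a period-two cycle, and an aperiodic orbit suggests chaos.
m, alpha, theta = 0.8, 0.1, 1.2
trajectory = []
for t in range(50):
    m = overlap_step(m, alpha, theta)
    trajectory.append(m)
print(trajectory[-6:])
```

Depending on the storage ratio and the non-monotonicity threshold, the iterated overlap settles onto a fixed point (retrieval), alternates between two values, or wanders aperiodically, mirroring the phases named in the abstract.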



Related research

The retrieval behavior and thermodynamic properties of symmetrically diluted Q-Ising neural networks are derived and studied in replica-symmetric mean-field theory, generalizing earlier works on either the fully connected or the symmetrically extremely diluted network. Capacity-gain parameter phase diagrams are obtained for the $Q=3$, $Q=4$ and $Q=\infty$ state networks with uniformly distributed patterns of low activity, in order to search for the effects of a gradual dilution of the synapses. It is shown that enlarged regions of continuous changeover into a region of optimal performance are obtained for finite stochastic noise and small but finite connectivity. The de Almeida-Thouless lines of stability are obtained for arbitrary connectivity, and the resulting phase diagrams are used to draw conclusions on the behavior of symmetrically diluted networks with other pattern distributions of either high or low activity.
R. Zillmer, R. Livi, A. Politi (2006)
The dynamical behaviour of a weakly diluted, fully inhibitory network of pulse-coupled spiking neurons is investigated. Upon increasing the coupling strength, a transition from a regular to a stochastic-like regime is observed. In the weak-coupling phase, a periodic dynamics is rapidly approached, with all neurons firing at the same rate and mutually phase-locked. The strong-coupling phase is characterized by an irregular pattern, even though the maximum Lyapunov exponent is negative. The paradox is resolved by drawing an analogy with the phenomenon of "stable chaos", i.e. by observing that the stochastic-like behaviour is limited to a transient that is exponentially long in the system size. Remarkably, the transient dynamics turns out to be stationary.
We introduce a model of generalized Hebbian learning and retrieval in oscillatory neural networks modeling cortical areas such as hippocampus and olfactory cortex. Recent experiments have shown that synaptic plasticity depends on spike timing, especially on synapses from excitatory pyramidal cells, in hippocampus and in sensory and cerebellar cortex. Here we study how such plasticity can be used to form memories and input representations when the neural dynamics are oscillatory, as is common in the brain (particularly in the hippocampus and olfactory cortex). Learning is assumed to occur in a phase of neural plasticity, in which the network is clamped to external teaching signals. By suitable manipulation of the nonlinearity of the neurons or of the oscillation frequencies during learning, the model can be made, in a retrieval phase, either to categorize new inputs or to map them, in a continuous fashion, onto the space spanned by the imprinted patterns. We identify the first of these possibilities with the function of olfactory cortex and the second with the observed response characteristics of place cells in hippocampus. We investigate both kinds of networks analytically and by computer simulations, and we link the models with experimental findings, exploring, in particular, how the spike timing dependence of the synaptic plasticity constrains the computational function of the network and vice versa.
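As a point of reference for the spike-timing dependence mentioned above, the sketch below evaluates a generic pair-based STDP window: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise. It is a textbook illustration with assumed amplitudes and time constants, not the learning rule defined in this work.

```python
import numpy as np

# Minimal sketch (illustrative, not this paper's model): a pair-based
# spike-timing-dependent plasticity (STDP) window. The synapse is potentiated
# when the presynaptic spike precedes the postsynaptic one (dt > 0) and
# depressed otherwise. Amplitudes and time constants are assumed values.

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike pair separated by dt = t_post - t_pre (ms)."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0.0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))

# Example: potentiation for pre-before-post, depression for post-before-pre.
print(stdp_dw(+10.0))   # > 0
print(stdp_dw(-10.0))   # < 0
```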
D. Bolle, T. Verbeiren (1999)
The optimal capacity of graded-response perceptrons storing biased and spatially correlated patterns with non-monotonic input-output relations is studied. It is shown that only the structure of the output patterns is important for the overall performance of the perceptrons.
Water ice and spin ice are important model systems in which theory can directly account for the zero-point entropy associated with quenched configurational disorder. Spin ice differs from water ice in the important respect that its fundamental constituents, the spins of the magnetic ions, can be removed through replacement with non-magnetic ions while keeping the lattice structure intact. In order to investigate the interplay of frustrated interactions and quenched disorder, we have performed systematic heat capacity measurements on spin ice materials diluted in this way by up to 90%. Investigations of both Ho and Dy spin ices reveal that the zero-point entropy depends non-monotonically on dilution and approaches the value of R ln 2 in the limit of high dilution. The data are in good agreement with a generalization of Pauling's theory for the entropy of ice.
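For context on the entropy scales quoted above, the short sketch below evaluates Pauling's estimate of the zero-point entropy of undiluted spin ice, (R/2) ln(3/2) per mole of magnetic ions, alongside the free-spin value R ln 2 given as the high-dilution limit. It is a back-of-the-envelope check of those two reference values, not the generalized dilution theory used in the paper.

```python
import numpy as np

# Reference entropy scales per mole of magnetic ions (Pauling's original
# counting, not the generalized dilution theory referred to in the abstract).
R = 8.314  # gas constant, J/(mol K)

# Pauling counting for undiluted spin ice: each Ising spin has 2 states, and
# each tetrahedron (N/2 of them) admits 6 of its 16 configurations
# ("two-in/two-out"), so S/N = k_B [ln 2 + (1/2) ln(6/16)] = (k_B/2) ln(3/2).
s_pauling = 0.5 * R * np.log(1.5)

# An uncorrelated (nearly free) Ising spin retains the full ln 2 entropy,
# the high-dilution limit quoted in the abstract.
s_free_spin = R * np.log(2.0)

print(f"Pauling zero-point entropy : {s_pauling:.2f} J/(mol K)")   # ~1.69
print(f"Free-spin R ln 2 limit     : {s_free_spin:.2f} J/(mol K)")  # ~5.76
```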
