The thermodynamic and retrieval properties of the Ashkin-Teller neural network model storing an infinite number of patterns are examined in the replica-symmetric mean-field approximation. In particular, for linked patterns, temperature-capacity phase diagrams are derived for different values of the two-neuron and four-neuron coupling strengths. This model can be considered a particular non-trivial generalisation of the Hopfield model and exhibits a number of interesting new features. Some aspects of replica-symmetry breaking are discussed.
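For orientation, a minimal sketch of the kind of Hamiltonian involved, with Hebb-rule couplings built from two embedded pattern families \xi^\mu and \eta^\mu; the normalisation and the coupling labels J_2 (two-neuron) and J_4 (four-neuron) are illustrative assumptions, not quotations from the paper:

    H = -\frac{1}{2N} \sum_{i \neq j} \sum_{\mu=1}^{p} \left[ J_2\, \xi_i^\mu \xi_j^\mu\, s_i s_j
        + J_2\, \eta_i^\mu \eta_j^\mu\, \sigma_i \sigma_j
        + J_4\, \xi_i^\mu \eta_i^\mu \xi_j^\mu \eta_j^\mu\, s_i \sigma_i\, s_j \sigma_j \right]

Here each site i carries two Ising spins s_i, \sigma_i = \pm 1. Setting J_4 = 0 decouples the system into two independent Hopfield models, which is the sense in which the model generalises Hopfield; linked patterns correspond to choosing \eta^\mu dependent on \xi^\mu.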
An Ashkin-Teller neural network, allowing for two types of neurons, is considered in the case of low loading as a function of the strengths of the respective couplings between these neurons. The storage and retrieval of embedded patterns built from the two types of neurons, with different degrees of (in)dependence, are studied. In particular, thermodynamic properties, including the existence and stability of Mattis states, are discussed. Furthermore, the dynamic behaviour is examined by deriving flow equations for the macroscopic overlap. It is found that for linked patterns the model shows better retrieval properties than a corresponding Hopfield model.
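To make the retrieval dynamics concrete, below is a minimal zero-temperature sketch of parallel dynamics for such a two-spin-type network at low loading, tracking the macroscopic overlaps with one stored pattern pair. The Hebbian couplings, the local-field expressions and the names J2, J4 are assumptions made for this sketch, not the paper's exact equations.

    import numpy as np

    # Zero-temperature parallel dynamics for an Ashkin-Teller-type network
    # at low loading; couplings and normalisations are illustrative.
    rng = np.random.default_rng(0)
    N, P = 500, 3                  # neurons per spin type, stored pattern pairs
    J2, J4 = 1.0, 0.5              # two-neuron and four-neuron coupling strengths

    xi = rng.choice([-1.0, 1.0], size=(P, N))
    eta = rng.choice([-1.0, 1.0], size=(P, N))   # independent; eta = xi would be fully linked

    # Hebb-rule couplings for the two spin types and for the four-spin term.
    J = (J2 / N) * xi.T @ xi
    K = (J2 / N) * eta.T @ eta
    L = (J4 / N) * (xi * eta).T @ (xi * eta)
    for M in (J, K, L):
        np.fill_diagonal(M, 0.0)

    def corrupt(pattern, flip):
        # Flip a fraction `flip` of the spins of a stored pattern.
        out = pattern.copy()
        idx = rng.choice(N, size=int(flip * N), replace=False)
        out[idx] *= -1.0
        return out

    s, sigma = corrupt(xi[0], 0.2), corrupt(eta[0], 0.2)

    for t in range(8):
        print(f"t={t}: m_s={s @ xi[0] / N:+.3f}, m_sigma={sigma @ eta[0] / N:+.3f}")
        # Each spin type feels its own Hebb field plus the four-neuron
        # field mediated by the product spin s_i * sigma_i.
        tau = s * sigma
        h_s = J @ s + sigma * (L @ tau)
        h_sigma = K @ sigma + s * (L @ tau)
        s, sigma = np.sign(h_s + 1e-12), np.sign(h_sigma + 1e-12)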
We show that, for a particular choice of the coupling parameters, the Ashkin-Teller spin-glass neural network model with the Hebb learning rule and one condensed pattern yields the same thermodynamic properties as the four-state anisotropic Potts-glass neural network model. This equivalence is not seen at the level of the Hamiltonians.
The dynamics and the stationary states of an exactly solvable three-state layered feed-forward neural network model with asymmetric synaptic connections, finite dilution and low pattern activity are studied, extending recent work on a recurrent network. Detailed phase diagrams are obtained for the stationary states and for the time evolution of the retrieval overlap with a single pattern. It is shown that the network develops instabilities at low thresholds and that network performance improves gradually with increasing threshold up to an optimal stage. The robustness against synaptic noise is checked, and the effects of dilution and of a variable threshold on the information content of the network are also established.
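A hedged sketch of a single layer-to-layer step in a diluted three-state feed-forward network of this general type, showing how the gain threshold enters; all names, normalisations and parameter values here are illustrative assumptions.

    import numpy as np

    # One feed-forward step of a three-state (s in {-1,0,+1}) layered network
    # with diluted asymmetric Hebb couplings and gain threshold theta.
    rng = np.random.default_rng(1)
    N, P, a, theta, c = 1000, 5, 0.4, 0.3, 0.5   # width, patterns, activity, threshold, connectivity

    # Independent ternary patterns for two consecutive layers.
    tern = lambda: rng.choice([-1.0, 0.0, 1.0], size=(P, N), p=[a / 2, 1 - a, a / 2])
    xi_in, xi_out = tern(), tern()

    # Diluted, asymmetric Hebb couplings between the layers.
    mask = rng.random((N, N)) < c
    J = (1.0 / (c * a * N)) * (xi_out.T @ xi_in) * mask

    s = xi_in[0].copy()                 # input layer clamped to pattern 0
    h = J @ s                           # local fields on the next layer
    s_next = np.where(np.abs(h) > theta, np.sign(h), 0.0)   # three-state gain

    m = s_next @ xi_out[0] / (a * N)    # retrieval overlap on the next layer
    print(f"overlap after one layer: {m:+.3f} (theta={theta})")

Raising theta suppresses weakly driven neurons, while too low a threshold lets noise-driven neurons switch on, which is consistent with the threshold dependence described above.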
The parallel dynamics of the fully connected Blume-Emery-Griffiths neural network model is studied at arbitrary temperature. Employing a probabilistic signal-to-noise approach, a recursive scheme is found that determines the time evolution of the distribution of the local fields and, hence, the evolution of the order parameters. This approach is compared with the generating-functional method, which allows one to calculate any physically relevant quantity as a function of time. Explicit analytic formulae are given for both methods for the first few time steps of the dynamics. Up to the third time step the results are identical. Some arguments are presented as to why the results differ beyond the third time step for certain values of the model parameters. Furthermore, fixed-point equations are derived in the stationary limit. Numerical simulations confirm our theoretical findings.
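As a complement, the following is a minimal simulation sketch of the kind such analytic recursions can be checked against: synchronous heat-bath updates of a fully connected network of three-state neurons s_i in {-1,0,+1} with bilinear and biquadratic Hebb terms. The specific forms of J_ij and K_ij and the order parameters below are common conventions assumed for illustration, not transcriptions of the paper's formulas.

    import numpy as np

    # Synchronous (parallel) finite-temperature dynamics of a fully connected
    # Blume-Emery-Griffiths-type network; all forms below are illustrative.
    rng = np.random.default_rng(2)
    N, P, a, beta = 400, 2, 0.5, 5.0   # size, patterns, pattern activity, inverse temperature

    # Ternary patterns with mean activity a (fraction of non-zero components).
    xi = rng.choice([-1.0, 0.0, 1.0], size=(P, N), p=[a / 2, 1 - a, a / 2])

    J = (1.0 / (a * N)) * xi.T @ xi                          # bilinear Hebb term
    K = (1.0 / (a**2 * N)) * (xi**2 - a).T @ (xi**2 - a)     # centred biquadratic term
    np.fill_diagonal(J, 0.0)
    np.fill_diagonal(K, 0.0)

    s = xi[0].copy()                                  # start in pattern 0 ...
    flip = rng.choice(N, size=N // 5, replace=False)
    s[flip] = rng.choice([-1.0, 0.0, 1.0], size=flip.size)   # ... with 20% of sites corrupted

    states = np.array([-1.0, 0.0, 1.0])
    for t in range(8):
        m = s @ xi[0] / (a * N)        # retrieval overlap with pattern 0
        q = np.mean(s**2)              # neural activity
        print(f"t={t}: m={m:+.3f}, activity={q:.3f}")
        h = J @ s                      # field coupling to s_i
        theta = K @ s**2               # field coupling to s_i^2
        # Parallel heat-bath update: every neuron is redrawn simultaneously
        # from P(s_i = k) ~ exp(beta * (h_i * k + theta_i * k^2)).
        logits = beta * (np.outer(h, states) + np.outer(theta, states**2))
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        u = rng.random(N)[:, None]
        s = states[(u < probs.cumsum(axis=1)).argmax(axis=1)]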
The optimal capacity of a diluted Blume-Emery-Griffiths neural network is studied as a function of the pattern activity and the embedding stability using the Gardner entropy approach. Annealed dilution is considered, cutting some of the couplings referring to the ternary patterns themselves and some of the couplings related to the active patterns, either simultaneously (synchronous dilution) or independently (asynchronous dilution). Through the de Almeida-Thouless criterion it is found that the replica-symmetric solution is locally unstable as soon as there is dilution. The distribution of the couplings shows the typical gap, with a width depending on the amount of dilution, but this gap persists even in cases where a particular type of coupling plays no role in the learning process.