Recent advances in deep learning and neural networks have led to an increased interest in the application of generative models in statistical and condensed matter physics. In particular, restricted Boltzmann machines (RBMs) and variational autoencoders (VAEs) as specific classes of neural networks have been successfully applied in the context of physical feature extraction and representation learning. Despite these successes, there is only limited understanding of their representational properties and limitations. To better understand the representational characteristics of RBMs and VAEs, we study their ability to capture physical features of the Ising model at different temperatures. This approach allows us to quantitatively assess learned representations by comparing sample features with corresponding theoretical predictions. Our results suggest that the considered RBMs and convolutional VAEs are able to capture the temperature dependence of magnetization, energy, and spin-spin correlations. The samples generated by RBMs are more evenly distributed across temperature than those generated by VAEs. We also find that convolutional layers in VAEs are important to model spin correlations, whereas RBMs achieve similar or even better performance without convolutional filters.
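The quantitative assessment described above hinges on extracting physical observables from generated configurations. A minimal sketch of how magnetization, energy per spin, and nearest-neighbor spin-spin correlation could be measured on a batch of 2D Ising samples (the array layout and periodic boundary conditions are assumptions for illustration, not the paper's code):

```python
import numpy as np

def ising_observables(samples, J=1.0):
    """Per-site observables for a batch of 2D Ising configurations.

    samples: array of shape (batch, L, L) with entries in {-1, +1};
    periodic boundary conditions are assumed.
    Returns batch-averaged |magnetization|, energy per spin, and
    nearest-neighbor spin-spin correlation.
    """
    samples = np.asarray(samples, dtype=float)
    right = np.roll(samples, -1, axis=2)  # right neighbor of each spin
    down = np.roll(samples, -1, axis=1)   # lower neighbor of each spin
    bonds = samples * right + samples * down  # two bonds counted per site

    m = np.abs(samples.mean(axis=(1, 2)))  # |m| per configuration
    e = -J * bonds.mean(axis=(1, 2))       # energy per spin: -2J for the ground state
    c = (samples * right).mean(axis=(1, 2))  # nearest-neighbor correlation
    return m.mean(), e.mean(), c.mean()
```

Evaluating these observables on samples generated at each temperature and comparing against exact or Monte Carlo results is one concrete way to realize the comparison with theoretical predictions mentioned in the abstract.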
We investigate the behavior of the Ising model on two connected Barabasi-Albert scale-free networks. We extend previous analysis and show that a first-order temperature-driven phase transition occurs in such a system. The transition between antiparallel
The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature
We study the critical behavior of the diluted 2D Ising model in the presence of disorder correlations which decay algebraically with distance as $\sim r^{-a}$. Mapping the problem onto 2D Dirac fermions with correlated disorder, we calculate the critical p
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved
We study the dynamic and metastable properties of the fully connected Ising $p$-spin model with a finite number of variables. We define trapping energies, trapping times, and self-correlation functions, and we analyse their statistical properties in comparison to the predictions of trap models.
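The self-correlation functions mentioned above can be estimated directly from recorded spin trajectories. A minimal sketch of the two-time correlation $C(t_w, t_w + t) = \frac{1}{N}\sum_i s_i(t_w)\, s_i(t_w + t)$, assuming trajectories are stored as a (timesteps, N) array of $\pm 1$ spins (the array layout is an assumption for illustration, not the paper's code):

```python
import numpy as np

def self_correlation(traj, t_w, t):
    """Two-time spin self-correlation C(t_w, t_w + t).

    traj: array of shape (timesteps, N) with entries in {-1, +1},
    one row per recorded time step of the dynamics.
    Returns (1/N) * sum_i s_i(t_w) * s_i(t_w + t).
    """
    traj = np.asarray(traj, dtype=float)
    return float(np.mean(traj[t_w] * traj[t_w + t]))
```

In a trap-model comparison one would average this quantity over many dynamical histories and waiting times $t_w$; the function above computes the single-trajectory estimate.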