A disordered spin glass model in which both static and dynamical properties depend on macroscopic magnetizations is presented. These magnetizations interact via random couplings and, therefore, the typical quenched realization of the system exhibits macroscopic frustration. The model is solved using a revisited replica approach, and the broken-symmetry solution turns out to coincide with the symmetric one. Some dynamical aspects of the model are also discussed, showing how it could be a useful tool for describing properties of real systems such as natural ecosystems or human social systems.
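A minimal sketch of a Hamiltonian of the kind described, assuming macroscopic groups of spins whose magnetizations $M_a$ are coupled by symmetric Gaussian random couplings $J_{ab}$ (the specific form studied in the paper may differ):
$$ H = -\sum_{a<b} J_{ab}\, M_a M_b , \qquad M_a = \sum_{i \in a} s_i , $$
with the $J_{ab}$ drawn independently from a zero-mean Gaussian. The random signs of the $J_{ab}$ cannot all be satisfied simultaneously, which is what produces frustration at the macroscopic level.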
Spin-spin correlations are calculated in frustrated hierarchical Ising models that exhibit chaotic renormalization-group behavior. The spin-spin correlations, as a function of distance, behave chaotically. The far correlations, but not the near correlations, are sensitive to small changes in temperature or frustration, with temperature changes having a larger effect. On the other hand, the calculated free energy, internal energy, and entropy are smooth functions of temperature. The recursion-matrix calculation of thermodynamic densities in a chaotic band is demonstrated. The leading Lyapunov exponents are calculated as a function of frustration.
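The leading Lyapunov exponent of an iterated recursion is typically estimated by averaging log|f'| along an orbit after discarding a transient. The sketch below illustrates this standard numerical procedure on a stand-in one-dimensional map (the logistic map); the actual coupling recursion of the frustrated hierarchical model is not reproduced here.

# Hypothetical sketch: leading Lyapunov exponent of an iterated map,
# estimated by averaging log|f'(x)| along the orbit. The logistic map is
# a stand-in for the renormalization-group coupling recursion.
import numpy as np

def lyapunov_exponent(f, df, x0, n_transient=1000, n_iter=100000):
    """Average log|df/dx| along the orbit after discarding a transient."""
    x = x0
    for _ in range(n_transient):
        x = f(x)
    total = 0.0
    for _ in range(n_iter):
        total += np.log(abs(df(x)))
        x = f(x)
    return total / n_iter

r = 3.9  # control parameter (stand-in for temperature or frustration)
f = lambda x: r * x * (1.0 - x)
df = lambda x: r * (1.0 - 2.0 * x)
print(lyapunov_exponent(f, df, x0=0.4))  # a positive value signals chaos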
Networks with power-law connectivity, commonly referred to as scale-free networks, are an important class of complex networks. A heterogeneous mean-field approximation has previously been proposed for the Ising model on Barabási-Albert scale-free networks with classical spins on the nodes, and it predicts a critical temperature that scales logarithmically with network size. For finite sizes there is no criticality for such a system and hence no true phase transition in the sense of singular behavior. Further, in the thermodynamic limit, the mean-field prediction of an infinite critical temperature may exclude a true phase transition even in that limit. Nevertheless, with an eye on potential applications of the model to biological systems, which are generally finite, one may still seek approximations that describe the relevant observables quantitatively. Here we present an alternative approximate formulation for the Ising model on a Barabási-Albert network. Using the classical definition of magnetization, we show that the Ising model on such a network can be well approximated by a homogeneous Ising model with long-range interactions, in which each node couples to all other spins with a strength determined by the mean degree of the Barabási-Albert network. In this effective long-range Ising model of a Barabási-Albert network, the critical temperature is directly proportional to the number of preferentially attached links added to grow the network. The proposed model describes the magnetization of the majority of sites, those with average or below-average degree, better than the heterogeneous mean-field approximation does. The long-range Ising model is the only homogeneous description of Barabási-Albert networks that we know of.
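The effective long-range picture amounts to a Curie-Weiss-type self-consistency for the magnetization, with the coupling scaled by the mean degree <k> of the network; for a Barabási-Albert network grown with m links per added node, <k> is approximately 2m. A minimal sketch, assuming unit microscopic coupling J and this normalization (both assumptions, not taken from the paper):

# Hypothetical sketch: magnetization of a homogeneous long-range Ising model
# with effective coupling set by the mean degree <k> ~ 2*m_links of a
# Barabási-Albert network. The normalization of the coupling is an assumption.
import numpy as np

def magnetization(T, m_links, J=1.0, tol=1e-10, max_iter=10000):
    """Solve M = tanh(J * <k> * M / T) by fixed-point iteration."""
    k_mean = 2.0 * m_links          # mean degree of a BA network
    M = 1.0                          # start from the fully ordered state
    for _ in range(max_iter):
        M_new = np.tanh(J * k_mean * M / T)
        if abs(M_new - M) < tol:
            break
        M = M_new
    return M

# in this sketch T_c = J * <k> = 2 * J * m_links, i.e. proportional to m_links
for T in (1.0, 2.0, 3.0, 5.0, 8.0):
    print(T, magnetization(T, m_links=2))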
We discuss universal and non-universal critical exponents of a three-dimensional Ising system in the presence of weak quenched disorder. Experimental, computational, and theoretical results are reviewed. Special attention is paid to results obtained by the field-theoretical renormalization group approach. Different renormalization schemes are considered, with emphasis on the analysis of the divergent series obtained.
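Padé-Borel resummation is one of the standard tools for extracting numbers from such divergent series. The sketch below applies it to a toy factorially divergent series (not a series from the reviewed works) and compares the result with the exact Borel sum.

# Hypothetical sketch: Padé-Borel resummation of the toy divergent series
# sum_k (-1)^k (2k)!/k! x^k, whose Borel transform is 1/sqrt(1+4u).
# It only illustrates the technique, not a series from the reviewed works.
import numpy as np
from math import factorial
from scipy.interpolate import pade
from scipy.integrate import quad

def pade_borel_sum(coeffs, x, m):
    """Borel-transform the coefficients, build a Padé approximant of the
    Borel transform with denominator order m, then Laplace-integrate back."""
    borel = [c / factorial(k) for k, c in enumerate(coeffs)]
    p, q = pade(borel, m)
    integrand = lambda t: np.exp(-t) * p(x * t) / q(x * t)
    value, _ = quad(integrand, 0.0, np.inf)
    return value

coeffs = [(-1) ** k * factorial(2 * k) // factorial(k) for k in range(8)]
x = 0.1
print(pade_borel_sum(coeffs, x, m=3))
# exact Borel sum of the toy series, for comparison
print(quad(lambda t: np.exp(-t) / np.sqrt(1.0 + 4.0 * x * t), 0.0, np.inf)[0])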
It is well established that neural networks with deep architectures perform better than shallow networks for many tasks in machine learning. In statistical physics, while there has been recent interest in representing physical data with generative modelling, the focus has been on shallow neural networks. A natural question to ask is whether deep neural networks hold any advantage over shallow networks in representing such data. We investigate this question by using unsupervised, generative graphical models to learn the probability distribution of a two-dimensional Ising system. Deep Boltzmann machines, deep belief networks, and deep restricted Boltzmann networks are trained on thermal spin configurations from this system, and compared to the shallow architecture of the restricted Boltzmann machine. We benchmark the models, focussing on the accuracy of generating energetic observables near the phase transition, where these quantities are most difficult to approximate. Interestingly, after training the generative networks, we observe that the accuracy essentially depends only on the number of neurons in the first hidden layer of the network, and not on other model details such as network depth or model type. This is evidence that shallow networks are more efficient than deep networks at representing physical probability distributions associated with Ising systems near criticality.
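The shallow baseline in such studies is a restricted Boltzmann machine trained by contrastive divergence on thermal spin configurations. Below is a minimal numpy sketch of CD-1 training on binary data; the architecture, hyperparameters, and the random placeholder data are illustrative assumptions, not the benchmarked setup.

# Hypothetical sketch: a restricted Boltzmann machine trained with one-step
# contrastive divergence (CD-1) on binary spin configurations (spins mapped
# from {-1,+1} to {0,1}). Sizes and learning rate are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.a = np.zeros(n_visible)   # visible biases
        self.b = np.zeros(n_hidden)    # hidden biases

    def sample_h(self, v):
        p = sigmoid(v @ self.W + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        p = sigmoid(h @ self.W.T + self.a)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0, lr=0.05):
        ph0, h0 = self.sample_h(v0)
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
        self.a += lr * (v0 - v1).mean(axis=0)
        self.b += lr * (ph0 - ph1).mean(axis=0)

# data: shape (n_samples, L*L) with entries in {0, 1}; in practice these would
# be thermal configurations of an L x L Ising model (random placeholder here)
L = 8
data = (rng.random((1000, L * L)) < 0.5).astype(float)

rbm = RBM(n_visible=L * L, n_hidden=32)
for epoch in range(20):
    for batch in np.array_split(rng.permutation(data), 20):
        rbm.cd1_update(batch)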
Recent advances in deep learning and neural networks have led to an increased interest in the application of generative models in statistical and condensed matter physics. In particular, restricted Boltzmann machines (RBMs) and variational autoencoders (VAEs) as specific classes of neural networks have been successfully applied in the context of physical feature extraction and representation learning. Despite these successes, however, there is only limited understanding of their representational properties and limitations. To better understand the representational characteristics of RBMs and VAEs, we study their ability to capture physical features of the Ising model at different temperatures. This approach allows us to quantitatively assess learned representations by comparing sample features with corresponding theoretical predictions. Our results suggest that the considered RBMs and convolutional VAEs are able to capture the temperature dependence of magnetization, energy, and spin-spin correlations. The samples generated by RBMs are more evenly distributed across temperature than those generated by VAEs. We also find that convolutional layers in VAEs are important to model spin correlations whereas RBMs achieve similar or even better performance without convolutional filters.
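Assessing such generative models typically means measuring physical observables on the generated samples and comparing them with theory. The sketch below computes magnetization, energy per spin, and a two-point spin correlation from a batch of configurations; the normalization conventions and the random placeholder samples are generic choices, not the paper's exact definitions.

# Hypothetical sketch: observables measured on a batch of L x L configurations
# with spins in {-1,+1}; conventions are generic choices.
import numpy as np

def observables(configs, J=1.0):
    """configs: array of shape (n_samples, L, L) with entries +/-1."""
    m = np.abs(configs.mean(axis=(1, 2))).mean()
    # nearest-neighbour energy per spin with periodic boundaries
    e = -J * (configs * np.roll(configs, 1, axis=1)
              + configs * np.roll(configs, 1, axis=2)).mean()
    # two-point correlation along one lattice axis, averaged over samples and sites
    L = configs.shape[1]
    corr = [(configs * np.roll(configs, r, axis=1)).mean() for r in range(L // 2)]
    return m, e, np.array(corr)

rng = np.random.default_rng(1)
samples = rng.choice([-1, 1], size=(500, 16, 16))   # placeholder for generated samples
print(observables(samples))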