We present an analysis of neural network-based machine learning schemes for phases and phase transitions in theoretical condensed matter research, focusing on neural networks with a single hidden layer. Such shallow neural networks were previously found to be efficient in classifying phases and locating phase transitions of various basic model systems. To rationalize how the classification emerges and to identify any underlying physical quantities, one can examine the weight matrices and convolutional filter kernels that result from the learning process of such shallow networks. Furthermore, we demonstrate how the learning-by-confusing scheme can be used, in combination with a simple threshold-value classification method, to diagnose the learning parameters of neural networks. In particular, we study the classification process of both fully connected and convolutional neural networks for the two-dimensional Ising model with extended domain-wall configurations included in the low-temperature regime. Moreover, we consider the two-dimensional XY model and contrast the performance of the learning-by-confusing scheme and of convolutional neural networks trained on bare spin configurations with the case of samples preprocessed with respect to vortex configurations. We discuss these findings in relation to similar recent investigations and possible further applications.
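The learning-by-confusing idea combined with a simple threshold classifier can be illustrated on synthetic data. The sketch below is a minimal toy version, not the paper's setup: a noisy scalar "order parameter" stands in for Ising configurations, and all sample counts, noise levels, and grid values are illustrative. For each trial transition point, the samples are relabeled and the best single-threshold classifier is refit; the accuracy curve shows the characteristic W-shape, with near-perfect accuracy at the scan endpoints (trivial labelings) and a peak at the trial point closest to the true critical temperature.

```python
import numpy as np

def best_threshold_accuracy(x, y):
    """Accuracy of the best single-threshold classifier on scalar
    feature x with binary labels y (both orientations are tried)."""
    order = np.argsort(x)
    ys = y[order]
    n = len(ys)
    ones_below = np.cumsum(ys)            # ones among the i smallest x
    total_ones = ones_below[-1]
    i = np.arange(1, n)
    # predict 0 below the cut, 1 above; the flipped orientation is 1 - acc
    acc = (i - 2 * ones_below[:-1] + total_ones) / n
    trivial = max(total_ones / n, 1 - total_ones / n)
    return max(trivial, acc.max(), (1 - acc).max())

rng = np.random.default_rng(0)
Tc_true = 2.27                            # 2D Ising critical temperature
temps = np.linspace(1.0, 4.0, 31)
# toy scalar "order parameter": ~1 below Tc, ~0 above, plus noise
x = np.concatenate([rng.normal(1.0 if T < Tc_true else 0.0, 0.05, 200)
                    for T in temps])
T_of_x = np.repeat(temps, 200)

# learning by confusion: scan a trial transition point, relabel, refit
accs = [best_threshold_accuracy(x, (T_of_x >= T_trial).astype(int))
        for T_trial in temps]
# accs is near 1 at both scan endpoints and peaks at the trial
# temperature closest to the true Tc, giving the W-shaped curve
```

In a real application the threshold classifier would be replaced by the neural network retrained for each trial labeling; the scan logic is unchanged.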
We study a phase transition in parameter learning of Hidden Markov Models (HMMs). We do this by generating sequences of observed symbols from given discrete HMMs with uniformly distributed transition probabilities and a noise level encoded in the output probabilities. Using the Baum-Welch (BW) algorithm, an expectation-maximization algorithm from the field of machine learning, we then estimate the parameters of each investigated realization of an HMM. We study HMMs with n = 4, 8, and 16 states. By varying the amount of accessible learning data and the noise level, we observe a phase-transition-like change in the performance of the learning algorithm. For larger HMMs and more learning data, the learning behavior improves dramatically below a certain threshold in the noise strength; for noise levels above the threshold, learning is not possible. Furthermore, we use an overlap parameter applied to the results of a maximum-a-posteriori (Viterbi) algorithm to investigate the accuracy of the hidden-state estimation around the phase transition.
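The BW re-estimation loop for a discrete HMM can be sketched as follows. This is a generic textbook implementation on an illustrative two-state toy model, not the n = 4–16 setups studied here: each iteration runs the scaled forward-backward recursions and re-estimates the transition matrix A, emission matrix B, and initial distribution pi, and the log-likelihood (the sum of the log scaling factors) is guaranteed not to decrease.

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """Scaled forward-backward pass; returns alpha, beta, scalings c."""
    T, n = len(obs), A.shape[0]
    alpha = np.zeros((T, n)); beta = np.zeros((T, n)); c = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    return alpha, beta, c

def baum_welch_step(A, B, pi, obs):
    """One EM re-estimation of (A, B, pi); also returns log-likelihood."""
    T, n = len(obs), A.shape[0]
    alpha, beta, c = forward_backward(A, B, pi, obs)
    gamma = alpha * beta                       # P(state_t = i | obs)
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((n, n))                      # expected transition counts
    for t in range(T - 1):
        xi += alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1]) / c[t + 1]
    A_new = xi / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.zeros_like(B)
    for k in range(B.shape[1]):
        B_new[:, k] = gamma[obs == k].sum(axis=0)
    B_new /= gamma.sum(axis=0)[:, None]
    return A_new, B_new, gamma[0], np.log(c).sum()

# generate observations from a known toy HMM, then learn from scratch
rng = np.random.default_rng(1)
A_true = np.array([[0.9, 0.1], [0.2, 0.8]])
B_true = np.array([[0.8, 0.2], [0.3, 0.7]])
obs, s = [], rng.choice(2, p=[0.5, 0.5])
for _ in range(500):
    obs.append(rng.choice(2, p=B_true[s]))
    s = rng.choice(2, p=A_true[s])
obs = np.array(obs)

A = rng.dirichlet(np.ones(2), size=2)          # random normalized init
B = rng.dirichlet(np.ones(2), size=2)
pi = np.array([0.5, 0.5])
lls = []
for _ in range(50):
    A, B, pi, ll = baum_welch_step(A, B, pi, obs)
    lls.append(ll)
```

The monotone log-likelihood sequence `lls` is the quantity whose learning-data and noise dependence exhibits the transition discussed in the abstract.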
Preferential attachment is a central paradigm in the theory of complex networks. In this contribution we consider various generalizations of preferential attachment including for example node removal and edge rewiring. We demonstrate that generalized preferential attachment networks can undergo a topological phase transition. This transition separates networks having a power-law tail degree distribution from those with an exponential tail. The appearance of the phase transition is closely related to the breakdown of the continuous variable description of the network dynamics.
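The baseline growth rule can be sketched in a few lines. The code below implements only the plain Barabási-Albert-style model (no node removal or edge rewiring, i.e. none of the generalizations that drive the topological transition discussed here), with illustrative sizes: each new node attaches m edges to existing nodes chosen with probability proportional to their current degree, via a degree-weighted target list.

```python
import random

def preferential_attachment(n_nodes, m, seed=0):
    """Grow a network in which each new node attaches m edges to
    existing nodes chosen with probability proportional to degree."""
    rng = random.Random(seed)
    edges, targets = [], []       # targets: node i appears deg(i) times
    # seed the growth with a small complete core of m + 1 nodes
    for i in range(m + 1):
        for j in range(i):
            edges.append((i, j))
            targets += [i, j]
    for new in range(m + 1, n_nodes):
        chosen = set()
        while len(chosen) < m:    # degree-proportional sampling
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets += [new, t]
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return deg, edges

deg, edges = preferential_attachment(2000, 2)
```

The resulting degree sequence develops the power-law tail (a few hubs far above the mean degree); adding removal or rewiring steps to the growth loop is what can push the tail over to an exponential form.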
Machine learning (ML) and artificial intelligence (AI) have the remarkable ability to classify, recognize, and characterize complex patterns and trends in large data sets. Here, we adopt a subclass of machine learning methods, namely deep learning, and develop a general-purpose AI tool, dPOLY, for analyzing molecular dynamics trajectories and predicting phases and phase transitions in polymers. An unsupervised deep neural network is used within this framework to map a molecular dynamics trajectory undergoing thermophysical treatment such as cooling, heating, drying, or compression to a lower dimension. A supervised deep neural network is subsequently developed based on the lower-dimensional data to characterize the phases and phase transitions. As a proof of concept, we employ this framework to study the coil-globule transition of a model polymer system. We conduct coarse-grained molecular dynamics simulations to collect molecular dynamics trajectories of a single polymer chain over a wide range of temperatures and use the dPOLY framework to predict polymer phases. The dPOLY framework accurately predicts the critical temperatures for the coil-globule transition for a wide range of polymer sizes. This method is generic and can be extended to capture various other phase transitions and dynamical crossovers in polymers and other soft materials.
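The two-stage pipeline (unsupervised reduction, then supervised phase labeling) can be sketched on synthetic data. In this toy stand-in, none of which comes from dPOLY itself, PCA replaces the unsupervised deep network, a nearest-centroid rule replaces the supervised one, random Gaussian feature vectors replace MD frames, and the transition temperature Tc = 1.0 is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
Tc_true, temps = 1.0, np.linspace(0.5, 1.5, 21)
frames = 50        # synthetic "MD frames" per temperature
dim = 10           # per-frame feature dimension

# toy per-frame features: globule (T < Tc) and coil clusters
X = np.vstack([rng.normal(-1.0 if T < Tc_true else 1.0, 0.2, (frames, dim))
               for T in temps])
T_of = np.repeat(temps, frames)

# stage 1 (unsupervised): project to 1D via PCA (SVD of centered data)
Xc = X - X.mean(axis=0)
z = Xc @ np.linalg.svd(Xc, full_matrices=False)[2][0]

# stage 2 (supervised): label frames from the temperature extremes,
# classify the rest by the nearest class centroid in the reduced space
lo, hi = T_of <= temps[2], T_of >= temps[-3]
c_glob, c_coil = z[lo].mean(), z[hi].mean()
pred_coil = np.abs(z - c_coil) < np.abs(z - c_glob)

# transition estimate: lowest temperature classified as coil on average
coil_frac = np.array([pred_coil[T_of == T].mean() for T in temps])
Tc_est = temps[np.argmax(coil_frac > 0.5)]
```

The design point carried over from the abstract is that only the extremes of the control parameter need labels; the transition location is read off from where the predicted phase flips.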
Recent progress in the development of efficient computational algorithms to price financial derivatives is summarized. A first algorithm is based on a path integral approach to option pricing, while a second algorithm makes use of a neural network parameterization of option prices. The accuracy of the two methods is established from comparisons with the results of the standard procedures used in quantitative finance.
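A minimal version of the first approach can be cross-checked against the Black-Scholes closed form: Monte Carlo averaging of the discounted payoff over discretized geometric-Brownian-motion price paths is a direct numerical evaluation of the pricing path integral. The sketch below is generic, with illustrative parameter values, and is not the specific algorithm summarized here.

```python
import math
import numpy as np

def bs_call(S0, K, r, sigma, T):
    """Black-Scholes European call price (closed form)."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def mc_call(S0, K, r, sigma, T, n_paths=100_000, n_steps=50, seed=0):
    """Average the discounted payoff over simulated log-price paths:
    a Monte Carlo evaluation of the option-pricing path integral."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    # exact GBM increments per step, so no time-discretization bias
    logret = ((r - 0.5 * sigma**2) * dt
              + sigma * math.sqrt(dt) * rng.standard_normal((n_paths, n_steps)))
    ST = S0 * np.exp(logret.sum(axis=1))
    return math.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

bs = bs_call(100, 100, 0.05, 0.2, 1.0)
mc = mc_call(100, 100, 0.05, 0.2, 1.0)
```

The two estimates agree to within the Monte Carlo standard error, which is the kind of accuracy comparison against standard quantitative-finance procedures the abstract refers to.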
This Letter presents a neural estimator for entropy production, or NEEP, that estimates entropy production (EP) from trajectories of relevant variables without detailed information on the system dynamics. For steady state, we rigorously prove that the estimator, which can be built up from different choices of deep neural networks, provides stochastic EP by optimizing the objective function proposed here. We verify the NEEP with the stochastic processes of the bead-spring and discrete flashing ratchet models, and also demonstrate that our method is applicable to high-dimensional data and can provide coarse-grained EP for Markov systems with unobservable states.
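The core of the estimator can be illustrated without a deep network. In the toy sketch below, which uses a biased three-state ring rather than the Letter's bead-spring or flashing-ratchet models, an antisymmetric tabular output h(s, s') = G[s, s'] - G[s', s] stands in for the neural network, and the objective maximized over observed transition pairs is J(h) = <h> - <exp(-h)>; at the optimum, h recovers the empirical log-ratio ln p(s -> s') / p(s' -> s), whose average is the entropy production per step.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
# biased ring: forward hop with prob 0.8 -> broken detailed balance
P_step = np.zeros((n, n))
for i in range(n):
    P_step[i, (i + 1) % n] = 0.8
    P_step[i, (i - 1) % n] = 0.2

# sample a long stationary trajectory and count transition pairs
s, counts = 0, np.zeros((n, n))
for _ in range(200_000):
    s_next = rng.choice(n, p=P_step[s])
    counts[s, s_next] += 1
    s = s_next
P = counts / counts.sum()        # empirical pair distribution p(s, s')

# NEEP-style objective: antisymmetric output h = G - G^T, maximize
# J(h) = <h> - <exp(-h)> over observed pairs by gradient ascent
G = np.zeros((n, n))
for _ in range(3000):
    h = G - G.T
    grad_h = P * (1.0 + np.exp(-h))
    G += 0.5 * (grad_h - grad_h.T)
h = G - G.T
ep_rate = (P * h).sum()          # entropy production per step
```

Replacing the table G by a network taking (s, s') as input gives the trainable estimator; the objective and the antisymmetrization are the essential ingredients.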