Recent results have demonstrated that samplers constructed with flow-based generative models are a promising new approach for configuration generation in lattice field theory. In this paper, we present a set of methods to construct flow models for targets with multiple separated modes (i.e. theories with multiple vacua). We demonstrate the application of these methods to modeling two-dimensional real scalar field theory in its symmetry-broken phase. In this context we investigate the performance of different flow-based sampling algorithms, including a composite sampling algorithm where flow-based proposals are occasionally augmented by applying updates using traditional algorithms such as Hamiltonian (Hybrid) Monte Carlo (HMC).
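The core flow-based sampling step described above is an independence-Metropolis chain: proposals are drawn i.i.d. from a model distribution q whose density can be evaluated exactly, and accepted against the target p. The sketch below is a toy illustration of that accept/reject logic, not the paper's implementation: a broad Gaussian stands in for a trained flow, and the bimodal target mimics two separated vacua. All names here (`flow_mcmc`, `log_p`, `log_q`, `sample_q`) are hypothetical.

```python
import numpy as np

def flow_mcmc(log_p, sample_q, log_q, n_steps, rng):
    """Independence Metropolis: proposals drawn i.i.d. from q (the flow),
    accepted with probability min(1, p(x')q(x) / (p(x)q(x')))."""
    x = sample_q(rng)
    chain, accepts = [], 0
    for _ in range(n_steps):
        x_new = sample_q(rng)
        log_ratio = (log_p(x_new) - log_p(x)) + (log_q(x) - log_q(x_new))
        if np.log(rng.uniform()) < log_ratio:
            x, accepts = x_new, accepts + 1
        chain.append(x)
    return np.array(chain), accepts / n_steps

# Toy bimodal target (two "vacua", unnormalized log-density).
log_p = lambda x: np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)
# A broad Gaussian proposal stands in for a trained flow model.
log_q = lambda x: -0.5 * (x / 3.0) ** 2 - np.log(3.0 * np.sqrt(2 * np.pi))
sample_q = lambda rng: 3.0 * rng.standard_normal()

rng = np.random.default_rng(0)
chain, acc_rate = flow_mcmc(log_p, sample_q, log_q, 5000, rng)
```

Because proposals are independent of the current state, a well-trained flow can hop directly between modes that local updaters such as HMC cross only rarely; this is the motivation for the composite sampler the abstract mentions.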
We define a class of machine-learned flow-based sampling algorithms for lattice gauge theories that are gauge-invariant by construction. We demonstrate the application of this framework to U(1) gauge theory in two spacetime dimensions, and find that near critical points in parameter space the approach is orders of magnitude more efficient at sampling topological quantities than more traditional sampling procedures such as Hybrid Monte Carlo and heat bath.
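The "topological quantities" referred to above include the topological charge of 2D U(1) gauge fields. As a hedged illustration (not the paper's code), the standard geometric definition can be computed directly from link angles: sum the plaquette angles after wrapping each into [-pi, pi), and divide by 2*pi. On a periodic lattice the result is an integer up to floating-point error. The array layout `(2, L, L)` for link angles is an assumption of this sketch.

```python
import numpy as np

def topological_charge(theta):
    """theta: array of shape (2, L, L) holding U(1) link angles theta_mu(x).
    Returns Q = (1 / 2pi) * sum over plaquettes of the wrapped plaquette angle."""
    t0, t1 = theta  # links in direction mu=0 and mu=1
    # Plaquette angle: theta_0(x) + theta_1(x+e0) - theta_0(x+e1) - theta_1(x),
    # with periodic boundary conditions implemented via np.roll.
    plaq = t0 + np.roll(t1, -1, axis=0) - np.roll(t0, -1, axis=1) - t1
    wrapped = np.remainder(plaq + np.pi, 2 * np.pi) - np.pi  # into [-pi, pi)
    return wrapped.sum() / (2 * np.pi)

rng = np.random.default_rng(1)
theta = rng.uniform(-np.pi, np.pi, size=(2, 8, 8))
Q = topological_charge(theta)
```

Because Q changes only in integer jumps, local update algorithms can get "stuck" in a topological sector near criticality; flow-based proposals, which generate whole configurations at once, do not suffer this freezing in the same way.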
Algorithms based on normalizing flows are emerging as promising machine learning approaches to sampling complicated probability distributions in a way that can be made asymptotically exact. In the context of lattice field theory, proof-of-principle studies have demonstrated the effectiveness of this approach for scalar theories, gauge theories, and statistical systems. This work develops approaches that enable flow-based sampling of theories with dynamical fermions, which is necessary for the technique to be applied to lattice field theory studies of the Standard Model of particle physics and many condensed matter systems. As a practical demonstration, these methods are applied to the sampling of field configurations for a two-dimensional theory of massless staggered fermions coupled to a scalar field via a Yukawa interaction.
This notebook tutorial demonstrates a method for sampling Boltzmann distributions of lattice field theories using a class of machine learning models known as normalizing flows. The ideas and approaches proposed in arXiv:1904.12072, arXiv:2002.02428, and arXiv:2003.06413 are reviewed and a concrete implementation of the framework is presented. We apply this framework to a lattice scalar field theory and to U(1) gauge theory, explicitly encoding gauge symmetries in the flow-based approach to the latter. This presentation is intended to be interactive and working with the attached Jupyter notebook is recommended.
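The basic building block of the normalizing flows reviewed in this tutorial is the affine coupling layer: half the sites (selected by a mask) are frozen and used to condition an elementwise affine transformation of the other half, which makes both the inverse and the Jacobian determinant cheap to compute. The sketch below is a minimal NumPy illustration of that idea, with a fixed linear map standing in for the learned neural networks s and t; the names `coupling_forward`, `coupling_inverse`, and `s_t_net` are hypothetical.

```python
import numpy as np

def coupling_forward(x, mask, s_t_net):
    """RealNVP-style affine coupling: masked ('frozen') sites pass through
    unchanged and condition an affine update of the unmasked ('active') sites."""
    s, t = s_t_net(x * mask)                    # scale/shift from frozen sites only
    y = mask * x + (1 - mask) * (x * np.exp(s) + t)
    log_det = ((1 - mask) * s).sum()            # log |det Jacobian| is just a sum
    return y, log_det

def coupling_inverse(y, mask, s_t_net):
    s, t = s_t_net(y * mask)                    # frozen sites are identical in x and y
    return mask * y + (1 - mask) * (y - t) * np.exp(-s)

# A fixed linear "network" stands in for the learned s,t (hypothetical).
rng = np.random.default_rng(2)
W_s, W_t = 0.1 * rng.normal(size=(2, 8, 8))
s_t_net = lambda frozen: (frozen @ W_s, frozen @ W_t)

mask = np.array([1.0, 0.0] * 4)                 # checkerboard mask over 8 sites
x = rng.normal(size=8)
y, log_det = coupling_forward(x, mask, s_t_net)
x_rec = coupling_inverse(y, mask, s_t_net)
```

Stacking several such layers with alternating masks yields an expressive, exactly invertible map with a tractable density, which is what allows the Metropolis accept/reject step to correct the model distribution to the target Boltzmann distribution.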
Machine learning has been a fast-growing field of research in several areas dealing with large datasets. We report recent attempts to use Renormalization Group (RG) ideas in the context of machine learning. We examine coarse-graining procedures for perceptron models designed to identify the digits of the MNIST data. We discuss the correspondence between principal component analysis (PCA) and RG flows across the transition for worm configurations of the 2D Ising model. Preliminary results regarding the logarithmic divergence of the leading PCA eigenvalue were presented at the conference and have since been improved. More generally, we discuss the relationship between PCA and observables in Monte Carlo simulations and the possibility of reducing the number of learning parameters in supervised learning based on RG-inspired hierarchical ansatzes.
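The PCA analysis discussed above amounts to diagonalizing the covariance matrix of an ensemble of configurations and tracking the leading eigenvalue. The sketch below is a generic illustration on synthetic data (not the paper's analysis): an ensemble with one dominant collective direction, standing in for e.g. the magnetization mode of Ising configurations. The names `leading_pca` and `configs` are hypothetical.

```python
import numpy as np

def leading_pca(configs):
    """PCA over an ensemble of flattened configurations (rows).
    Returns the leading eigenvalue of the sample covariance matrix
    and the fraction of total variance it explains."""
    X = configs - configs.mean(axis=0)          # center each feature
    cov = X.T @ X / (len(X) - 1)
    evals = np.linalg.eigvalsh(cov)             # ascending order
    return evals[-1], evals[-1] / evals.sum()

# Synthetic ensemble: isotropic noise plus one dominant collective mode.
rng = np.random.default_rng(3)
direction = np.ones(16) / 4.0                   # unit vector (norm 1)
configs = rng.normal(size=(500, 16)) + 5.0 * rng.normal(size=(500, 1)) * direction
lam1, frac = leading_pca(configs)
```

Near a phase transition the leading eigenvalue grows with correlation length, which is what makes its scaling (e.g. the logarithmic divergence mentioned above for the 2D Ising case) a useful RG diagnostic.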
We show that the partition function of all classical spin models, including all discrete standard statistical models and all abelian discrete Lattice Gauge Theories (LGTs), can be expressed as a special instance of the partition function of the 4D Z_2 LGT. In this way, all classical spin models with apparently very different features are unified in a single complete model, and a physical relation between all models is established. As applications of this result, we present a new method to do mean field theory for abelian discrete LGTs with d>3, and we show that the computation of the partition function of the 4D Z_2 LGT is a computationally hard (#P-hard) problem. We also extend our results to abelian continuous models, where we show the approximate completeness of the 4D Z_2 LGT. All results are proven using quantum information techniques.
Daniel C. Hackett, Chung-Chun Hsieh, Michael S. Albergo (2021). "Flow-based sampling for multimodal distributions in lattice field theory".