RG inspired Machine Learning for lattice field theory


Abstract

Machine learning has been a fast-growing field of research in several areas dealing with large datasets. We report recent attempts to use Renormalization Group (RG) ideas in the context of machine learning. We examine coarse-graining procedures for perceptron models designed to identify the digits of the MNIST data. We discuss the correspondence between principal component analysis (PCA) and RG flows across the transition for worm configurations of the 2D Ising model. Preliminary results regarding the logarithmic divergence of the leading PCA eigenvalue were presented at the conference and have since been improved. More generally, we discuss the relationship between PCA and observables in Monte Carlo simulations and the possibility of reducing the number of learning parameters in supervised learning based on RG-inspired hierarchical ansatzes.
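To make the PCA/RG connection concrete, the sketch below shows how a leading PCA eigenvalue can be extracted from Monte Carlo configurations of the 2D Ising model and tracked across the transition. This is an illustrative example, not the authors' code: it uses ordinary spin configurations sampled with a simple Metropolis update rather than the worm configurations studied in the paper, and all parameters (lattice size `L`, temperatures, sweep counts) are arbitrary choices for demonstration.

```python
# Minimal sketch (assumed setup, not the paper's implementation): sample 2D Ising
# spin configurations with a Metropolis update and track the leading PCA
# eigenvalue of the configuration ensemble as the temperature crosses T_c.
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(spins, beta):
    """One Metropolis sweep over an L x L lattice with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn          # energy change for flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

def sample_configs(L, beta, n_samples, n_therm=200, n_skip=10):
    """Return flattened spin configurations as rows of a data matrix."""
    spins = rng.choice([-1.0, 1.0], size=(L, L))
    for _ in range(n_therm):                 # thermalization
        metropolis_sweep(spins, beta)
    data = []
    for _ in range(n_samples):
        for _ in range(n_skip):              # decorrelation between measurements
            metropolis_sweep(spins, beta)
        data.append(spins.flatten().copy())
    return np.array(data)

def leading_pca_eigenvalue(data):
    """Largest eigenvalue of the sample covariance matrix (leading PCA component)."""
    centered = data - data.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)
    return s[0] ** 2 / (data.shape[0] - 1)

if __name__ == "__main__":
    L = 16
    Tc = 2.0 / np.log(1.0 + np.sqrt(2.0))    # exact 2D Ising critical temperature
    for T in [1.5, 2.0, Tc, 2.5, 3.0]:
        configs = sample_configs(L, beta=1.0 / T, n_samples=100)
        lam = leading_pca_eigenvalue(configs)
        print(f"T = {T:.3f}  leading PCA eigenvalue = {lam:.2f}")
```

Under these assumptions, the leading eigenvalue peaks near the critical temperature, which is the kind of behavior whose volume dependence (the logarithmic divergence mentioned above) the paper analyzes.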
