Statistical Predictions in String Theory and Deep Generative Models


Abstract

Generative models in deep learning allow for sampling from probability distributions that approximate data distributions. We propose using generative models to make approximate statistical predictions in the string theory landscape. For vacua admitting a Lagrangian description, this can be thought of as learning random tensor approximations of couplings. As a concrete proof-of-principle, we demonstrate in a large ensemble of Calabi-Yau manifolds that Kähler metrics evaluated at points in Kähler moduli space are well-approximated by ensembles of matrices produced by a deep convolutional Wasserstein GAN. Accurate approximations of the Kähler metric eigenspectra are achieved with far fewer than $h^{11}$ Gaussian draws. Accurate extrapolation to values of $h^{11}$ outside the training set is achieved via a conditional GAN. Together, these results implicitly suggest the existence of strong correlations in the data, as might be expected if Reid's fantasy is correct.
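The claim that eigenspectra can be captured with far fewer than $h^{11}$ Gaussian draws is, in spirit, a statement about low effective rank. A minimal sketch of that idea (not the paper's actual GAN model; the matrix sizes, the Wishart-type construction, and the threshold are illustrative assumptions) builds a symmetric positive semi-definite "metric-like" matrix from only $k \ll N$ Gaussian draws per row:

```python
import numpy as np

# Illustrative sketch only: an N x N symmetric PSD matrix assembled from
# k << N Gaussian draws per row, mimicking a metric whose eigenspectrum
# is governed by far fewer random numbers than its dimension.
rng = np.random.default_rng(0)

N = 50   # stand-in for h^{11} (hypothetical size, not from the paper)
k = 5    # number of Gaussian draws per row, far fewer than N

G = rng.standard_normal((N, k))
A = G @ G.T / k                 # Wishart-type matrix: symmetric, PSD, rank <= k

eigvals = np.linalg.eigvalsh(A)
# At most k eigenvalues can be significantly nonzero, since rank(A) <= k.
n_nonzero = int(np.sum(eigvals > 1e-8))
print(n_nonzero)
```

Here the full $N \times N$ spectrum is determined by $N k$ draws rather than the $N(N+1)/2$ independent entries of a generic symmetric matrix, which is the sense in which few draws can suffice when the ensemble is strongly correlated.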
