
Efficient sampling of constrained high-dimensional theoretical spaces with machine learning

Posted by Jacob Hollingsworth
Publication date: 2021
Paper language: English





Models of physics beyond the Standard Model often contain a large number of parameters. These form a high-dimensional space that is computationally intractable to fully explore. Experimental constraints project onto a subspace of viable parameters, but mapping these constraints to the underlying parameters is also typically intractable. Instead, physicists often resort to scanning small subsets of the full parameter space and testing for experimental consistency. We propose an alternative approach that uses generative models to significantly improve the computational efficiency of sampling high-dimensional parameter spaces. To demonstrate this, we sample the constrained and phenomenological Minimal Supersymmetric Standard Models subject to the requirement that the sampled points are consistent with the measured Higgs boson mass. Our method achieves orders of magnitude improvements in sampling efficiency compared to a brute force search.
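As a rough illustration of the two-stage idea (a minimal sketch under toy assumptions, not the authors' implementation): seed points are found by brute force, a generative model is fit to them, and new proposals are drawn from the model. Here a Gaussian mixture stands in for a deep generative model, and the hypothetical `passes_constraint` function replaces an expensive spectrum calculation plus the Higgs-mass consistency check.

```python
# Sketch of generative-model-assisted sampling of a constrained space.
# All names and shapes here are invented for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
DIM = 10                          # toy stand-in for a 10+ parameter model
w = np.ones(DIM) / np.sqrt(DIM)

def passes_constraint(points):
    # Toy constraint: a thin slab, mimicking the narrow viable subspace
    # selected by requiring the measured Higgs boson mass.
    return np.abs(points @ w - 1.0) < 0.01

# Stage 1: brute-force uniform scan; almost every point is rejected.
scan = rng.uniform(-3.0, 3.0, size=(200_000, DIM))
accepted = scan[passes_constraint(scan)]

# Stage 2: fit a generative model to the accepted points and propose
# from it; a Gaussian mixture stands in for a deep generative model.
model = GaussianMixture(n_components=4, random_state=0).fit(accepted)
proposals, _ = model.sample(200_000)

print(f"brute-force efficiency : {passes_constraint(scan).mean():.3%}")
print(f"model-guided efficiency: {passes_constraint(proposals).mean():.3%}")
```

Because the mixture concentrates its proposals near the viable slab, the second-stage acceptance rate is far higher than that of the uniform scan, which is the effect the abstract quantifies on realistic models.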




Read also

In recent years, there has been growing interest in using quantum computers to solve combinatorial optimization problems. In this work, we developed a generic, machine learning-based framework for mapping continuous-space inverse design problems into surrogate quadratic unconstrained binary optimization (QUBO) problems by employing a binary variational autoencoder and a factorization machine. The factorization machine is trained as a low-dimensional, binary surrogate model for the continuous design space and sampled using various QUBO samplers. Using the D-Wave Advantage hybrid sampler and simulated annealing, we demonstrate that by repeatedly resampling and retraining the factorization machine, our framework finds designs whose figures of merit exceed those of its training set. We showcase the framework's performance on two inverse design problems by optimizing (i) thermal emitter topologies for thermophotovoltaic applications and (ii) diffractive meta-gratings for highly efficient beam steering. This technique can be further scaled to leverage future developments in quantum optimization to solve advanced inverse design problems for science and engineering applications.
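The structural point that makes this mapping possible is that a second-order factorization machine over binary variables is itself a QUBO. The sketch below illustrates this with random placeholder weights in place of a trained FM, and a tiny hand-rolled simulated annealer in place of the D-Wave Advantage hybrid sampler; none of it is the authors' code.

```python
# FM -> QUBO illustration: f(x) = w0 + sum_i w_i x_i + sum_{i<j} <v_i,v_j> x_i x_j
# over binary x_i is a QUBO, since x_i^2 = x_i for x_i in {0, 1}.
import numpy as np

rng = np.random.default_rng(1)
n, k = 16, 4                           # number of bits, FM latent dimension
w0 = 0.0                               # placeholder "trained" FM parameters
w = rng.normal(size=n)
V = rng.normal(size=(n, k))

# Assemble the QUBO matrix: diagonal = linear terms, upper triangle = <v_i, v_j>.
Q = np.triu(V @ V.T, k=1) + np.diag(w)

def energy(x):
    return w0 + x @ Q @ x

def anneal(steps=20_000, t0=2.0, t1=0.01):
    x = rng.integers(0, 2, size=n)
    e = energy(x)
    best_x, best_e = x.copy(), e
    for s in range(steps):
        t = t0 * (t1 / t0) ** (s / steps)   # geometric cooling schedule
        i = rng.integers(n)
        x[i] ^= 1                           # propose a single bit flip
        e_new = energy(x)
        if e_new <= e or rng.random() < np.exp((e - e_new) / t):
            e = e_new                       # accept the flip
            if e < best_e:
                best_x, best_e = x.copy(), e
        else:
            x[i] ^= 1                       # reject: undo the flip
    return best_x, best_e

x_best, e_best = anneal()
print("best bitstring:", x_best, "energy:", float(e_best))
```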
The latest techniques from Neural Networks and Support Vector Machines (SVM) are used to investigate geometric properties of Complete Intersection Calabi-Yau (CICY) threefolds, a class of manifolds that facilitate string model building. An advanced neural network classifier and SVM are employed to (1) learn Hodge numbers, reporting a remarkable improvement over previous efforts, (2) query for favourability, and (3) predict discrete symmetries, a highly imbalanced problem for which both the Synthetic Minority Oversampling Technique (SMOTE) and permutations of the CICY matrix are used to decrease the class imbalance and improve performance. In each case study, we employ a genetic algorithm to optimise the hyperparameters of the neural network. We demonstrate that our approach provides quick diagnostic tools capable of shortlisting quasi-realistic string models based on compactification over smooth CICYs, and further supports the paradigm that classes of problems in algebraic geometry can be machine learned.
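For the imbalanced symmetry-prediction task, the SMOTE step might look like the following sketch, where random features stand in for flattened CICY configuration matrices and all shapes and class fractions are invented for illustration (the paper's complementary augmentation by CICY-matrix permutations is not shown).

```python
# Sketch of rebalancing a rare class with SMOTE before training a classifier.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 12 * 15))          # placeholder "CICY matrices"
y = (rng.random(2000) < 0.03).astype(int)     # ~3% minority class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # oversample

clf = RandomForestClassifier(random_state=0).fit(X_bal, y_bal)
print("minority fraction before/after SMOTE:", y_tr.mean(), y_bal.mean())
print("balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))
```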
Systematic classification of Z2xZ2 orbifold compactifications of the heterotic string was pursued using its free fermion formulation. The method entails random generation of string vacua and analysis of their entire spectra, and led to the discovery of spinor-vector duality and three-generation exophobic string vacua. The classification was performed for string vacua with unbroken SO(10) GUT symmetry, and progressively extended to models in which the SO(10) symmetry is broken to the SO(6)xSO(4), SU(5)xU(1), SU(3)xSU(2)xU(1)^2 and SU(3)xU(1)xSU(2)^2 subgroups. Obtaining a sizeable number of phenomenologically viable vacua in the last two cases requires identification of fertility conditions. Adapting machine learning tools to identify these fertility conditions will be useful when the frequency of viable models becomes exceedingly small in the total space of vacua.
We reformulate entanglement wedge reconstruction in the language of operator-algebra quantum error correction with infinite-dimensional physical and code Hilbert spaces. Von Neumann algebras are used to characterize observables in a boundary subregion and its entanglement wedge. Assuming that the infinite-dimensional von Neumann algebras associated with an entanglement wedge and its complement may both be reconstructed in their corresponding boundary subregions, we prove that the relative entropies measured with respect to the bulk and boundary observables are equal. We also prove the converse: when the relative entropies measured in an entanglement wedge and its complement equal the relative entropies measured in their respective boundary subregions, entanglement wedge reconstruction is possible. Along the way, we show that the bulk and boundary modular operators act on the code subspace in the same way. For holographic theories with a well-defined entanglement wedge, this result provides a well-defined notion of holographic relative entropy.
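In one common notation (chosen here for illustration; the abstract itself fixes no symbols), the central equality reads:

```latex
% a = entanglement-wedge (bulk) algebra, A = its boundary subregion;
% rho and sigma are two states restricted to the corresponding algebras.
\[
  S_{\text{bulk}}\!\left(\rho_a \,\|\, \sigma_a\right)
  \;=\;
  S_{\text{bdy}}\!\left(\rho_A \,\|\, \sigma_A\right)
\]
```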
Factor models are a class of powerful statistical models that have been widely used to deal with dependent measurements arising frequently in applications ranging from genomics and neuroscience to economics and finance. As data are collected at an ever-growing scale, statistical machine learning faces new challenges: high dimensionality, strong dependence among observed variables, heavy-tailed variables and heterogeneity. High-dimensional robust factor analysis serves as a powerful toolkit to conquer these challenges. This paper gives a selective overview of recent advances in high-dimensional factor models and their applications to statistics, including Factor-Adjusted Robust Model selection (FarmSelect) and Factor-Adjusted Robust Multiple testing (FarmTest). We show that classical methods, especially principal component analysis (PCA), can be tailored to many new problems and provide powerful tools for statistical estimation and inference. We highlight PCA and its connections to matrix perturbation theory, robust statistics, random projection, false discovery rate, etc., and illustrate through several applications how insights from these fields yield solutions to modern challenges. We also present far-reaching connections between factor models and popular statistical learning problems, including network analysis and low-rank matrix recovery.
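As a minimal sketch of the PCA connection (a standard textbook construction, not code from the paper): for data generated as X = F @ L.T + U with K latent factors, the top-K principal components of the sample covariance estimate the loading space up to rotation.

```python
# PCA-based factor estimation on synthetic data; all sizes are illustrative.
import numpy as np

rng = np.random.default_rng(3)
n, p, K = 500, 100, 3                 # samples, variables, true factors
L = rng.normal(size=(p, K))           # true loadings
F = rng.normal(size=(n, K))           # latent factor scores
X = F @ L.T + 0.5 * rng.normal(size=(n, p))     # observed data + noise

Xc = X - X.mean(axis=0)               # center the data
cov = Xc.T @ Xc / n
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
L_hat = eigvecs[:, -K:] * np.sqrt(eigvals[-K:]) # estimated loadings
F_hat = Xc @ eigvecs[:, -K:]                    # estimated factor scores

# The estimate spans the true loading space only up to rotation, so compare
# column spaces via principal angles (cosines = singular values of q1.T q2).
q1, _ = np.linalg.qr(L)
q2, _ = np.linalg.qr(L_hat)
cosines = np.linalg.svd(q1.T @ q2, compute_uv=False)
print("largest principal angle (rad):",
      np.arccos(np.clip(cosines, -1.0, 1.0)).max())
```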
