We describe how simple machine learning methods successfully predict geometric properties from Hilbert series (HS). Regressors predict embedding weights in projective space to $\sim 1$ mean absolute error, whilst classifiers predict dimension and Gorenstein index to $>90\%$ accuracy with $\sim 0.5\%$ standard error. Binary random forest classifiers distinguish whether the underlying HS describes a complete intersection with accuracies exceeding $95\%$. Neural networks (NNs) identify HS from a Gorenstein ring to the same order of accuracy, whilst fake HS proved trivial for NNs to distinguish from those associated to the three-dimensional Fano varieties considered.
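The binary classification setup described above can be sketched as follows. This is an illustrative toy, not the authors' pipeline: Hilbert series coefficients are computed from random generator weight systems via the product formula $\prod_w 1/(1-t^w)$, and the label (here, the parity of the total weight, a stand-in for a genuine geometric property such as being a complete intersection) is learned by a random forest on the truncated coefficient vector.

```python
# Hedged sketch: random forest classification of truncated Hilbert series
# coefficient vectors. Dataset, labels, and weight ranges are illustrative
# assumptions, not taken from the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def hilbert_coeffs(weights, n_terms=20):
    """First n_terms coefficients of prod_w 1/(1 - t^w)."""
    coeffs = np.zeros(n_terms)
    coeffs[0] = 1.0
    for w in weights:                      # multiply in 1/(1 - t^w)
        for i in range(w, n_terms):
            coeffs[i] += coeffs[i - w]
    return coeffs

# Synthetic dataset: random weight systems; the toy label is the parity
# of the total weight, recoverable from the series coefficients.
X, y = [], []
for _ in range(2000):
    weights = rng.integers(1, 5, size=4).tolist()
    X.append(hilbert_coeffs(weights))
    y.append(sum(weights) % 2)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
```

In practice the inputs would be coefficient (or weight) data extracted from the actual Fano variety database, and the labels the geometric invariants of interest.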
Classical and exceptional Lie algebras and their representations are among the most important tools in the analysis of symmetry in physical systems. In this letter we show how the computation of tensor products and branching rules of irreducible repr
The many ways in which machine and deep learning are transforming the analysis and simulation of data in particle physics are reviewed. The main methods based on boosted decision trees and various types of neural networks are introduced, and cutting-
We give three determinantal expressions for the Hilbert series as well as the Hilbert function of a Pfaffian ring, and a closed form product formula for its multiplicity. An appendix outlining some basic facts about degeneracy loci and applications t
Let $X$ be a compact Kähler manifold and $L \to X$ a quantizing holomorphic Hermitian line bundle. To immersed Lagrangian submanifolds $\Lambda$ of $X$ satisfying a Bohr-Sommerfeld condition we associate sequences $\{ |\Lambda, k\rangle \}_{k=1}^{\infty}$, whe
Sets of zero-dimensional ideals in the polynomial ring $k[x,y]$ that share the same leading term ideal with respect to a given term ordering are known to be affine spaces called Gröbner cells. Conca-Valla and Constantinescu parametrize such Gröbner c