Including prior knowledge is important for effective machine learning models in physics and is usually achieved by explicitly adding loss terms or by constraining model architectures. Prior knowledge embedded in the physics computation itself has drawn far less attention. We show that solving the Kohn-Sham equations while training neural networks for the exchange-correlation functional provides an implicit regularization that greatly improves generalization. Training at only two separations suffices to learn the entire one-dimensional H$_2$ dissociation curve to within chemical accuracy, including the strongly correlated region. Our models also generalize to unseen types of molecules and overcome self-interaction error.
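The mechanism can be made concrete: the network only ever appears inside a self-consistent Kohn-Sham solve, so its gradients are taken through the entire iteration rather than through a direct density-to-energy map. Below is a minimal sketch of this idea in JAX, assuming a one-dimensional soft-Coulomb model; the pointwise MLP standing in for the exchange-correlation potential, the grid, and the density-mixing scheme are all illustrative assumptions, not the paper's implementation.

```python
import jax
import jax.numpy as jnp

n_grid, L = 128, 10.0
x = jnp.linspace(-L / 2, L / 2, n_grid)
dx = x[1] - x[0]

# Kinetic energy from a second-order finite-difference Laplacian.
lap = (jnp.diag(-2.0 * jnp.ones(n_grid))
       + jnp.diag(jnp.ones(n_grid - 1), 1)
       + jnp.diag(jnp.ones(n_grid - 1), -1)) / dx**2
T = -0.5 * lap

def v_ext(R):
    # Soft-Coulomb potential of two protons at +/- R/2 (1D H2 surrogate).
    return (-1.0 / jnp.sqrt((x - R / 2) ** 2 + 1.0)
            - 1.0 / jnp.sqrt((x + R / 2) ** 2 + 1.0))

def hartree(n):
    # Soft-Coulomb Hartree potential of the density n.
    return jnp.sum(n[None, :] / jnp.sqrt((x[:, None] - x[None, :]) ** 2 + 1.0),
                   axis=1) * dx

def v_xc(params, n):
    # Tiny pointwise MLP standing in for the learned XC potential (illustrative).
    h = jnp.tanh(n[:, None] * params["w1"] + params["b1"])
    return (h @ params["w2"] + params["b2"]).squeeze(-1)

def ks_density(params, R, n_iter=20, mix=0.3):
    n = jnp.zeros(n_grid)
    for _ in range(n_iter):  # unrolled SCF loop: gradients flow through each step
        H = T + jnp.diag(v_ext(R) + hartree(n) + v_xc(params, n))
        _, phi = jnp.linalg.eigh(H)
        phi0 = phi[:, 0] / jnp.sqrt(dx)          # grid-normalized lowest orbital
        n = mix * 2.0 * phi0**2 + (1 - mix) * n  # two electrons, linear mixing
    return n

def loss(params, R, n_ref):
    # Density loss evaluated only on the output of the full KS solve.
    return jnp.sum((ks_density(params, R) - n_ref) ** 2) * dx

key1, key2 = jax.random.split(jax.random.PRNGKey(0))
params = {"w1": 0.1 * jax.random.normal(key1, (16,)), "b1": jnp.zeros(16),
          "w2": 0.1 * jax.random.normal(key2, (16, 1)), "b2": jnp.zeros(1)}
grad_fn = jax.grad(loss)  # differentiates through the whole KS iteration
```

Because `ks_density` is an ordinary unrolled JAX computation, `jax.grad` differentiates through every diagonalization and mixing step; this is where the implicit regularization described in the abstract enters.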
Last year, at least 30,000 scientific papers used the Kohn-Sham scheme of density functional theory to solve electronic structure problems in a wide variety of scientific fields, ranging from materials science to biochemistry to astrophysics. Machine …
A Kohn-Sham (KS) inversion determines a KS potential and orbitals corresponding to a given electron density, a procedure that has applications in developing and evaluating functionals used in density functional theory. Despite the utility of KS …
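One widely used family of inversion schemes updates the trial potential in proportion to the density error, raising the potential wherever the current KS density overshoots the target. The sketch below, in the same JAX style as above, assumes a 1D finite-difference grid, spin-restricted double occupation, and a fixed step size `alpha`; the update rule and its parameters are illustrative choices (a simple variant of density-difference updates), not a specific published algorithm.

```python
import jax.numpy as jnp

def invert_ks(n_target, x, n_elec=2, alpha=0.05, n_iter=500):
    # Hypothetical helper: find a local potential v(x) whose non-interacting
    # ground-state density matches n_target on the grid x (1D, spin-restricted).
    dx = x[1] - x[0]
    n_grid = x.shape[0]
    lap = (jnp.diag(-2.0 * jnp.ones(n_grid))
           + jnp.diag(jnp.ones(n_grid - 1), 1)
           + jnp.diag(jnp.ones(n_grid - 1), -1)) / dx**2
    T = -0.5 * lap
    v = jnp.zeros(n_grid)                            # initial guess for v_s(x)
    for _ in range(n_iter):
        _, phi = jnp.linalg.eigh(T + jnp.diag(v))
        occ = phi[:, : n_elec // 2] / jnp.sqrt(dx)   # doubly occupied orbitals
        n = 2.0 * jnp.sum(occ**2, axis=1)
        v = v + alpha * (n - n_target)  # raise v where the density overshoots
    return v, n
```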
In high-temperature density functional theory simulations (from tens of eV to keV), the total number of Kohn-Sham orbitals is a critical quantity for obtaining accurate results. To establish the relationship between the number of orbitals and the level of occupation …
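The occupation side of that relationship is Fermi-Dirac statistics: at temperature $k_BT$, orbitals well above the chemical potential retain occupations of order $e^{-(\varepsilon-\mu)/k_BT}$, so the number of orbitals with non-negligible occupation grows roughly linearly with temperature. A toy illustration (the eigenvalue ladder, the fixed chemical potential, and the occupation cutoff are assumptions for demonstration only):

```python
import jax.numpy as jnp

def fermi_dirac(eps, mu, kT):
    # Fermi-Dirac occupation of a level eps at chemical potential mu.
    return 1.0 / (jnp.exp((eps - mu) / kT) + 1.0)

def orbitals_needed(eps, mu, kT, cutoff=1e-6):
    # Count levels whose occupation still exceeds a numerical cutoff.
    return int(jnp.sum(fermi_dirac(eps, mu, kT) > cutoff))

eps = jnp.arange(0.0, 2000.0, 0.5)   # toy eigenvalue ladder (eV), an assumption
for kT in (10.0, 100.0, 1000.0):     # tens of eV up to ~1 keV, as in the abstract
    print(f"kT = {kT:7.1f} eV -> {orbitals_needed(eps, mu=0.0, kT=kT)} orbitals")
```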
A detailed account of the Kohn-Sham algorithm from quantum chemistry, formulated rigorously in the very general setting of convex analysis on Banach spaces, is given here. Starting from a Levy-Lieb-type functional, its convex and lower semi-continuous …
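For orientation, the constrained-search objects referred to here can be written down directly. The display below gives the textbook Levy-Lieb functional, the ground-state energy as its Legendre-Fenchel conjugate, and the Lieb functional, which is the convex, lower semi-continuous envelope (biconjugate) of the Levy-Lieb functional; this is standard background, not the paper's specific Banach-space construction.

```latex
% Textbook constrained-search definitions (background material only).
F_{\mathrm{LL}}[\rho] = \min_{\Psi \to \rho} \langle \Psi \,|\, \hat{T} + \hat{W} \,|\, \Psi \rangle ,
\qquad
E[v] = \inf_{\rho} \Bigl( F_{\mathrm{LL}}[\rho] + \int v\,\rho \Bigr),
\qquad
F[\rho] = \sup_{v} \Bigl( E[v] - \int v\,\rho \Bigr) = F_{\mathrm{LL}}^{**}[\rho].
```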
Machine translation (MT) systems translate text between different languages by automatically learning in-depth knowledge of bilingual lexicons, grammar, and semantics from training examples. Although neural machine translation (NMT) has led the field …