
Machine learning at the atomic-scale

Posted by Félix Musil
Publication date: 2020
Research field: Physics
Paper language: English

Statistical learning algorithms are finding more and more applications in science and technology. Atomic-scale modeling is no exception, with machine learning becoming commonplace as a tool to predict energy, forces and properties of molecules and condensed-phase systems. This short review summarizes recent progress in the field, focusing in particular on the problem of representing an atomic configuration in a mathematically robust and computationally efficient way. We also discuss some of the regression algorithms that have been used to construct surrogate models of atomic-scale properties. We then show examples of how the optimization of the machine-learning models can both incorporate and reveal insights into the physical phenomena that underlie structure-property relations.
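
As a concrete illustration of the surrogate-model workflow this review surveys, the sketch below fits a kernel ridge regression model to per-structure descriptor vectors. The Gaussian kernel, the regularization strength alpha, and the random toy descriptors are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=0.5):
    """Pairwise Gaussian (RBF) kernel between rows of X and rows of Y."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def fit_krr(X_train, y_train, alpha=1e-6, gamma=0.5):
    """Kernel ridge regression: solve (K + alpha * I) w = y for the weights w."""
    K = gaussian_kernel(X_train, X_train, gamma)
    return np.linalg.solve(K + alpha * np.eye(len(y_train)), y_train)

def predict_krr(X_train, w, X_new, gamma=0.5):
    """Predict the property of new structures from the fitted weights."""
    return gaussian_kernel(X_new, X_train, gamma) @ w

# Toy usage: each row stands in for the descriptor vector of one structure.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                      # 50 structures, 8 features each
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)   # stand-in target property
w = fit_krr(X, y)
predictions = predict_krr(X, w, X[:5])
```

In practice the descriptors would be one of the representations discussed in the review, and the kernel and regularizer would be tuned by cross-validation; the linear-algebra core stays the same.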


Read also

The most successful and popular machine learning models of atomic-scale properties derive their transferability from a locality ansatz. The properties of a large molecule or a bulk material are written as a sum over contributions that depend on the configurations within finite atom-centered environments. The obvious downside of this approach is that it cannot capture non-local, non-additive effects such as those arising due to long-range electrostatics or quantum interference. We propose a solution to this problem by introducing non-local representations of the system that are remapped as feature vectors that are defined locally and are equivariant in O(3). We consider in particular one form that has the same asymptotic behavior as the electrostatic potential. We demonstrate that this framework can capture non-local, long-range physics by building a model for the electrostatic energy of randomly distributed point-charges, for the unrelaxed binding curves of charged organic molecular dimers, and for the electronic dielectric response of liquid water. By combining a representation of the system that is sensitive to long-range correlations with the transferability of an atom-centered additive model, this method outperforms current state-of-the-art machine-learning schemes, and provides a conceptual framework to incorporate non-local physics into atomistic machine learning.
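
Schematically, such a construction builds a smooth atom density, passes it through a Coulomb-like integral transform, and re-expands the resulting field around each atom. The notation below is a hedged paraphrase of this idea, not the paper's exact definition:

$$
\rho(\mathbf{r}) = \sum_{j} g(\mathbf{r}-\mathbf{r}_{j}), \qquad
V_{i}(\mathbf{r}) = \int \frac{\rho(\mathbf{r}')}{\left|\mathbf{r}_{i}+\mathbf{r}-\mathbf{r}'\right|}\,\mathrm{d}\mathbf{r}'.
$$

Expanding $V_i$ on an atom-centered basis yields feature vectors that are defined locally and are O(3)-equivariant, yet decay as $1/r$ and therefore share the asymptotic behavior of the electrostatic potential.
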
The applications of machine learning techniques to chemistry and materials science become more numerous by the day. The main challenge is to devise representations of atomic systems that are at the same time complete and concise, so as to reduce the number of reference calculations that are needed to predict the properties of different types of materials reliably. This has led to a proliferation of alternative ways to convert an atomic structure into an input for a machine-learning model. We introduce an abstract definition of chemical environments that is based on a smoothed atomic density, using a bra-ket notation to emphasize basis set independence and to highlight the connections with some popular choices of representations for describing atomic systems. The correlations between the spatial distribution of atoms and their chemical identities are computed as inner products between these feature kets, which can be given an explicit representation in terms of the expansion of the atom density on orthogonal basis functions, that is equivalent to the smooth overlap of atomic positions (SOAP) power spectrum, but also in real space, corresponding to $n$-body correlations of the atom density. This formalism lays the foundations for a more systematic tuning of the behavior of the representations, by introducing operators that represent the correlations between structure, composition, and the target properties. It provides a unifying picture of recent developments in the field and indicates a way forward towards more effective and computationally affordable machine-learning schemes for molecules and materials.
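
To make the density-based notation concrete, the standard SOAP construction expands a Gaussian-smoothed neighbor density in radial functions and spherical harmonics, then contracts the coefficients into rotational invariants; the smoothing width $\sigma$ and the basis $\{R_n, Y_{lm}\}$ are conventional choices rather than ones prescribed by the abstract:

$$
\rho_{i}(\mathbf{r}) = \sum_{j \in i} \exp\!\left(-\frac{\left|\mathbf{r}-\mathbf{r}_{ij}\right|^{2}}{2\sigma^{2}}\right)
= \sum_{nlm} c^{(i)}_{nlm}\, R_{n}(r)\, Y_{lm}(\hat{\mathbf{r}}),
\qquad
p^{(i)}_{nn'l} = \sum_{m} c^{(i)}_{nlm} \bigl(c^{(i)}_{n'lm}\bigr)^{*}.
$$

The invariants $p^{(i)}_{nn'l}$ are unchanged under rotations and correspond to 3-body correlations of the atom density, i.e. the power-spectrum member of the $n$-body hierarchy mentioned above.
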
Two types of approaches to modeling molecular systems have demonstrated high practical efficiency. Density functional theory (DFT), the most widely used quantum chemical method, is a physical approach predicting energies and electron densities of molecules. Recently, numerous papers on machine learning (ML) of molecular properties have also been published. ML models greatly outperform DFT in terms of computational costs, and may even reach comparable accuracy, but they are missing physicality - a direct link to Quantum Physics - which limits their applicability. Here, we propose an approach that combines the strengths of DFT and ML, namely, physicality and low computational cost. By generalizing the famous Hohenberg-Kohn theorems, we derive general equations for exact electron densities and energies that can naturally guide applications of ML in Quantum Chemistry. Based on these equations, we build a deep neural network that can compute electron densities and energies of a wide range of organic molecules not only much faster, but also closer to exact physical values than curre…
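
For reference, the Hohenberg-Kohn variational principle that the authors generalize reads, in its textbook form,

$$
E_{0} = \min_{\rho}\left\{ F[\rho] + \int v(\mathbf{r})\,\rho(\mathbf{r})\,\mathrm{d}\mathbf{r} \right\},
$$

where $F[\rho]$ is a universal functional of the electron density and $v(\mathbf{r})$ is the external potential; constraining an ML model with relations of this type is what supplies the direct link to Quantum Physics that purely data-driven models lack.
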
J. P. Coe, 2019
The concept of machine learning configuration interaction (MLCI) [J. Chem. Theory Comput. 2018, 14, 5739], where an artificial neural network (ANN) learns on the fly to select important configurations, is further developed so that accurate ab initio potential energy curves can be efficiently calculated. This development includes employing the artificial neural network also as a hash function for the efficient deletion of duplicates on the fly, so that the singles and doubles space does not need to be stored and this barrier to scalability is removed. In addition, configuration state functions are introduced into the approach so that pure spin states are guaranteed, and the transferability of data between geometries is exploited. This improved approach is demonstrated on potential energy curves for the nitrogen molecule, water, and carbon monoxide. The results are compared with full configuration interaction values, when available, and different transfer protocols are investigated. It is shown that, for all of the considered systems, accurate potential energy curves can now be efficiently computed with MLCI. For the potential curves of N$_{2}$ and CO, MLCI can achieve lower errors than stochastically selecting configurations while also using substantially fewer processor hours.
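
The selection loop at the heart of such a scheme is easy to sketch. In the toy version below, bitstring occupations stand in for configurations, a random score stands in for the ANN trained on CI coefficients, and a plain Python set plays the role of hash-based duplicate deletion; all three are illustrative assumptions, not the paper's actual components.

```python
import random

def excite(config, occ, vir):
    """Move one electron from occupied orbital occ to virtual orbital vir."""
    new = list(config)
    new[occ], new[vir] = 0, 1
    return tuple(new)

def singles_and_doubles(config):
    """All configurations reachable by one or two substitutions (toy bitstring model)."""
    occ = [i for i, b in enumerate(config) if b]
    vir = [i for i, b in enumerate(config) if not b]
    singles = {excite(config, o, v) for o in occ for v in vir}
    doubles = {excite(s, o, v)
               for s in singles
               for o in [i for i, b in enumerate(s) if b]
               for v in [i for i, b in enumerate(s) if not b]}
    return (singles | doubles) - {config}

def mlci_iteration(selected, score, threshold=0.9):
    """One cycle: expand the space, hash away duplicates, keep high-scoring candidates."""
    seen = set(selected)          # duplicates are deleted on the fly, never stored
    kept = list(selected)
    for config in selected:
        for cand in singles_and_doubles(config):
            if cand not in seen:
                seen.add(cand)
                if score(cand) > threshold:
                    kept.append(cand)
    return kept

# Toy usage: one selection cycle starting from a Hartree-Fock-like reference.
random.seed(1)
hf = (1, 1, 1, 1, 0, 0, 0, 0)     # 4 electrons in 8 orbitals, spin ignored
selected = mlci_iteration([hf], score=lambda c: random.random())
```

In the real method the score comes from a network retrained after each diagonalization, and the network doubles as the hash function, which is what removes the need to store the singles and doubles space.
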
Machine learning models are poised to make a transformative impact on chemical sciences by dramatically accelerating computational algorithms and amplifying insights available from computational chemistry methods. However, achieving this requires a confluence and coaction of expertise in computer science and physical sciences. This review is written for new and experienced researchers working at the intersection of both fields. We first provide concise tutorials of computational chemistry and machine learning methods, showing how insights involving both can be achieved. We then follow with a critical review of noteworthy applications that demonstrate how computational chemistry and machine learning can be used together to provide insightful (and useful) predictions in molecular and materials modeling, retrosyntheses, catalysis, and drug design.