
SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials

Published by: Simon Lutz Batzner
Publication date: 2021
Research field: Physics
Paper language: English





This work presents Neural Equivariant Interatomic Potentials (NequIP), an SE(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and only act on scalars, NequIP employs SE(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency. NequIP outperforms existing models with up to three orders of magnitude fewer training data, challenging the widely held belief that deep neural networks require massive training sets. The high data efficiency of the method allows for the construction of accurate potentials using high-order quantum chemical level of theory as reference and enables high-fidelity molecular dynamics simulations over long time scales.
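The distinction the abstract draws between invariant and equivariant models can be made concrete with a small numerical check. The sketch below is not the NequIP architecture (which builds tensor-product convolutions over irreducible representations); it is a hypothetical NumPy toy in which a message built from bond vectors, weighted by a smooth radial cutoff of my own choosing, rotates together with the input positions, whereas a model acting only on the scalar distances would discard that directional information.

    import numpy as np

    def radial_weight(r, cutoff=4.0):
        # smooth scalar weight that goes to zero at the cutoff (illustrative choice)
        return 0.5 * (np.cos(np.pi * np.clip(r / cutoff, 0.0, 1.0)) + 1.0)

    def equivariant_message(positions, center=0):
        # Toy l=1 message: sum of unit bond vectors scaled by a radial weight.
        # An invariant model would only see the scalar distances r.
        msgs = []
        for j, xj in enumerate(positions):
            if j == center:
                continue
            d = xj - positions[center]
            r = np.linalg.norm(d)
            msgs.append(radial_weight(r) * d / r)
        return np.sum(msgs, axis=0)

    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
    pos = rng.normal(size=(5, 3))

    out = equivariant_message(pos)
    out_rot = equivariant_message(pos @ Q.T)       # rotate the inputs
    print(np.allclose(out_rot, Q @ out))           # True: the output rotates too

Carrying such geometric-tensor features through every layer, rather than collapsing them to scalars, is what the abstract means by a more information-rich representation of the atomic environment.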




Read also

Machine learning models, trained on data from ab initio quantum simulations, are yielding molecular dynamics potentials with unprecedented accuracy. One limiting factor is the quantity of available training data, which can be expensive to obtain. A quantum simulation often provides all atomic forces, in addition to the total energy of the system. These forces provide much more information than the energy alone. It may appear that training a model to this large quantity of force data would introduce significant computational costs. Actually, training to all available force data should only be a few times more expensive than training to energies alone. Here, we present a new algorithm for efficient force training, and benchmark its accuracy by training to forces from real-world datasets for organic chemistry and bulk aluminum.
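One way to see why force training adds only a small constant factor is that the predicted forces are the negative gradient of the predicted energy with respect to the atomic positions, so they come out of a single extra reverse-mode autograd pass. The following is a minimal PyTorch sketch of such a combined loss under that assumption; the toy pair-potential model, the weight w_f, and all function names are placeholders for illustration, not the algorithm presented in the paper.

    import torch

    def energy_force_loss(model, positions, energy_ref, forces_ref, w_f=10.0):
        # model maps an (N, 3) position tensor to a scalar total energy.
        positions = positions.clone().requires_grad_(True)
        energy_pred = model(positions)
        # forces = -dE/dR, obtained with one extra backward pass
        forces_pred = -torch.autograd.grad(energy_pred, positions, create_graph=True)[0]
        loss_e = (energy_pred - energy_ref) ** 2
        loss_f = ((forces_pred - forces_ref) ** 2).mean()
        return loss_e + w_f * loss_f

    def toy_model(pos):
        # hypothetical stand-in: repulsive 1/r pair energy
        diff = pos.unsqueeze(0) - pos.unsqueeze(1)
        r = diff.norm(dim=-1)
        iu = torch.triu_indices(len(pos), len(pos), offset=1)
        return (1.0 / r[iu[0], iu[1]]).sum()

    pos = torch.randn(4, 3, dtype=torch.float64)
    e_ref = torch.tensor(0.0, dtype=torch.float64)
    f_ref = torch.zeros(4, 3, dtype=torch.float64)
    print(energy_force_loss(toy_model, pos, e_ref, f_ref))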
We investigate the use of invariant polynomials in the construction of data-driven interatomic potentials for material systems. The atomic body-ordered permutation-invariant polynomials (aPIPs) comprise a systematic basis and are constructed to preserve the symmetry of the potential energy function with respect to rotations and permutations. In contrast to kernel-based and artificial neural network models, the explicit decomposition of the total energy as a sum of atomic body-ordered terms keeps the dimensionality of the fit reasonably low, up to just 10 for the 5-body terms. The explainability of the potential is aided by this decomposition, as the low body-order components can be studied and interpreted independently. Moreover, although polynomial basis functions are thought to extrapolate poorly, we show that the low dimensionality combined with careful regularisation actually leads to better transferability than the high-dimensional, kernel-based Gaussian Approximation Potential.
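The central ingredient of aPIPs is a polynomial basis that is unchanged under exchange of like atoms. Below is a minimal sketch of that property, assuming for illustration that the arguments are simply the bond lengths of a small cluster; it demonstrates permutation invariance only, not the authors' body-ordered basis construction.

    from itertools import permutations
    import numpy as np

    def sym_monomial(bond_lengths, exponents):
        # Average the monomial r1^a1 * r2^a2 * ... over all permutations of its
        # arguments, which gives a permutation-invariant polynomial by construction.
        vals = [np.prod(np.asarray(p) ** np.asarray(exponents))
                for p in permutations(bond_lengths)]
        return float(np.mean(vals))

    # shuffling the bond lengths leaves the value unchanged
    print(sym_monomial([1.1, 1.7, 2.3], (1, 2, 0)))
    print(sym_monomial([2.3, 1.1, 1.7], (1, 2, 0)))

Because each body-ordered term depends on only a handful of such variables, the dimensionality of each fit stays low, which is the property the abstract credits for interpretability and transferability.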
Module for ab initio structure evolution (MAISE) is an open-source package for materials modeling and prediction. The code's main feature is the automated generation of neural network (NN) interatomic potentials for use in global structure searches. The systematic construction of Behler-Parrinello-type NN models approximating ab initio energies and forces relies on two approaches introduced in our recent studies. An evolutionary sampling scheme for generating reference structures improves the NN's mapping of regions visited in unconstrained searches, while a stratified training approach enables the creation of standardized NN models for multiple elements. A more flexible NN architecture proposed here expands the applicability of the stratified scheme to an arbitrary number of elements. The full workflow in the NN development is managed with a customizable MAISE-NET wrapper written in Python. The global structure optimization capability in MAISE is based on an evolutionary algorithm applicable to nanoparticles, films, and bulk crystals. A multitribe extension of the algorithm allows for efficient simultaneous optimization of nanoparticles in a given size range. Implemented structure analysis functions include fingerprinting with radial distribution functions and finding space groups with the SPGLIB tool. This work overviews MAISE's available features, constructed models, and confirmed predictions.
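The Behler-Parrinello models that MAISE builds start from per-atom symmetry functions: rotation- and permutation-invariant fingerprints of each atomic environment that are then fed to an elemental neural network. Below is a minimal sketch of one radial (G2) symmetry function; the parameter values eta, rs and rc are arbitrary examples, not the descriptor set shipped with MAISE-NET.

    import numpy as np

    def cutoff_fn(r, rc):
        # Behler-Parrinello cosine cutoff: decays smoothly to zero at r = rc
        return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

    def g2_descriptor(positions, i, eta=1.0, rs=0.0, rc=6.0):
        # Radial G2 symmetry function for atom i: invariant under rotations and
        # under permutations of the neighbours (parameters are illustrative).
        d = np.delete(positions - positions[i], i, axis=0)
        r = np.linalg.norm(d, axis=1)
        return float(np.sum(np.exp(-eta * (r - rs) ** 2) * cutoff_fn(r, rc)))

    pos = np.random.default_rng(1).normal(scale=2.0, size=(6, 3))
    print([round(g2_descriptor(pos, i), 4) for i in range(len(pos))])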
Zun Wang, Chong Wang, Sibo Zhao (2021)
Molecular dynamics is a powerful simulation tool to explore material properties. Most realistic material systems are too large to be simulated with first-principles molecular dynamics. Classical molecular dynamics has lower computational cost but requires accurate force fields to achieve chemical accuracy. In this work, we develop a symmetry-adapted graph neural network framework, named molecular dynamics graph neural networks (MDGNN), to construct force fields automatically for molecular dynamics simulations of both molecules and crystals. This architecture consistently preserves translation, rotation, and permutation invariance in the simulations. We propose a new feature engineering method including higher-order contributions and show that MDGNN accurately reproduces the results of both classical and first-principles molecular dynamics. We also demonstrate that force fields constructed by the model have good transferability. MDGNN therefore provides an efficient and promising option for molecular dynamics simulations of large-scale systems with high accuracy.
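The three invariances MDGNN maintains can be guaranteed in a message-passing layer by letting edges carry only interatomic distances (translation and rotation invariant) and aggregating messages with a sum (permutation invariant). The PyTorch layer below is a generic sketch written to make those choices explicit; it is not the MDGNN architecture and omits its higher-order feature engineering.

    import torch
    import torch.nn as nn

    class InvariantMPLayer(nn.Module):
        # Minimal message-passing layer: distance-only edge features plus sum
        # aggregation give translation, rotation, and permutation invariance.
        def __init__(self, dim=16):
            super().__init__()
            self.edge_mlp = nn.Sequential(nn.Linear(1, dim), nn.SiLU(), nn.Linear(dim, dim))
            self.node_mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.SiLU())

        def forward(self, h, positions, edges):
            # h: (N, dim) node features, positions: (N, 3), edges: (E, 2) index pairs
            src, dst = edges[:, 0], edges[:, 1]
            r = (positions[src] - positions[dst]).norm(dim=-1, keepdim=True)
            msg = self.edge_mlp(r)
            agg = torch.zeros_like(h).index_add_(0, dst, msg)
            return self.node_mlp(torch.cat([h, agg], dim=-1))

    layer = InvariantMPLayer(dim=16)
    h = torch.zeros(5, 16)
    pos = torch.randn(5, 3)
    edges = torch.tensor([[0, 1], [1, 0], [1, 2], [2, 1]])
    print(layer(h, pos, edges).shape)  # torch.Size([5, 16])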
Chi Chen, Zhi Deng, Richard Tran (2017)
In this work, we present a highly accurate spectral neighbor analysis potential (SNAP) model for molybdenum (Mo) developed through the rigorous application of machine learning techniques on large materials data sets. Despite Mo's importance as a structural metal, existing force fields for Mo based on the embedded atom and modified embedded atom methods still do not provide satisfactory accuracy on many properties. We will show that by fitting to the energies, forces, and stress tensors of a large density functional theory (DFT)-computed dataset on a diverse set of Mo structures, a Mo SNAP model can be developed that achieves close to DFT accuracy in the prediction of a broad range of properties, including energies, forces, stresses, elastic constants, melting point, phonon spectra, surface energies, grain boundary energies, etc. We will outline a systematic model development process, which includes a rigorous approach to structural selection based on principal component analysis, as well as a differential evolution algorithm for optimizing the hyperparameters in the model fitting so that both the model error and the property prediction error can be simultaneously lowered. We expect that this newly developed Mo SNAP model will find broad applications in large-scale, long-timescale simulations.
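Since a SNAP model is linear in its bispectrum descriptors, fitting the coefficients to DFT energies, forces, and stresses reduces to a weighted linear least-squares problem; the descriptor hyperparameters and data weights wrapped around that fit are what the authors tune with differential evolution. Below is a minimal sketch of the inner linear fit, using random placeholder matrices instead of real bispectrum components.

    import numpy as np

    def fit_snap_coefficients(A_energy, e_ref, A_force, f_ref, w_force=1.0):
        # Stack energy and force equations into one weighted least-squares problem;
        # A_* are descriptor matrices (placeholders here, not real bispectra).
        A = np.vstack([A_energy, w_force * A_force])
        b = np.concatenate([e_ref, w_force * f_ref])
        coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
        return coeffs

    rng = np.random.default_rng(2)
    A_e, A_f = rng.normal(size=(20, 5)), rng.normal(size=(60, 5))
    beta_true = rng.normal(size=5)
    coeffs = fit_snap_coefficients(A_e, A_e @ beta_true, A_f, A_f @ beta_true)
    print(np.allclose(coeffs, beta_true))  # True on this noise-free toy data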
