
Regularised Atomic Body-Ordered Permutation-Invariant Polynomials for the Construction of Interatomic Potentials

Published by: Cas Van Der Oord
Publication date: 2019
Research field: Physics
Paper language: English





We investigate the use of invariant polynomials in the construction of data-driven interatomic potentials for material systems. The atomic body-ordered permutation-invariant polynomials (aPIPs) comprise a systematic basis and are constructed to preserve the symmetry of the potential energy function with respect to rotations and permutations. In contrast to kernel-based and artificial neural network models, the explicit decomposition of the total energy as a sum of atomic body-ordered terms allows the dimensionality of the fit to be kept reasonably low, up to just 10 for the 5-body terms. This decomposition also aids the explainability of the potential, as the low body-order components can be studied and interpreted independently. Moreover, although polynomial basis functions are thought to extrapolate poorly, we show that the low dimensionality combined with careful regularisation actually leads to better transferability than the high-dimensional, kernel-based Gaussian Approximation Potential.
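The body-ordered decomposition referred to above can be written schematically as follows. This is our own shorthand (the symbols \varepsilon_i and V^{(N)} are not necessarily the paper's notation), intended only to make the dimensionality argument concrete.

% Schematic body-order expansion of the total energy:
% E is the total energy and \varepsilon_i the site energy of atom i.
E = \sum_i \varepsilon_i, \qquad
\varepsilon_i = V^{(1)}
  + \sum_{j} V^{(2)}(r_{ij})
  + \sum_{j<k} V^{(3)}(r_{ij}, r_{ik}, r_{jk})
  + \dots
% Each N-body term V^{(N)} is a polynomial in the interatomic distances
% among the N atoms involved and is symmetric under permutations of the
% neighbours; a 5-body term therefore depends on at most
% \binom{5}{2} = 10 distances, which is the "dimensionality up to 10"
% quoted in the abstract.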


Read also

Module for ab initio structure evolution (MAISE) is an open-source package for materials modeling and prediction. The code's main feature is the automated generation of neural network (NN) interatomic potentials for use in global structure searches. The systematic construction of Behler-Parrinello-type NN models approximating ab initio energy and forces relies on two approaches introduced in our recent studies. An evolutionary sampling scheme for generating reference structures improves the NN's mapping of regions visited in unconstrained searches, while a stratified training approach enables the creation of standardized NN models for multiple elements. A more flexible NN architecture proposed here expands the applicability of the stratified scheme to an arbitrary number of elements. The full workflow of the NN development is managed with a customizable MAISE-NET wrapper written in Python. The global structure optimization capability in MAISE is based on an evolutionary algorithm applicable to nanoparticles, films, and bulk crystals. A multitribe extension of the algorithm allows for efficient simultaneous optimization of nanoparticles in a given size range. Implemented structure analysis functions include fingerprinting with radial distribution functions and finding space groups with the SPGLIB tool. This work overviews MAISE's available features, constructed models, and confirmed predictions.
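As a rough illustration of the kind of Behler-Parrinello descriptors that such NN potentials are built on, the sketch below computes a radial G2 symmetry function for a single atom from its neighbour distances. The function names and parameter values are illustrative only and are not part of MAISE's interface.

import numpy as np

def cutoff(r, rc):
    # Smooth cosine cutoff that decays to zero at the cutoff radius rc.
    return np.where(r < rc, 0.5 * (np.cos(np.pi * r / rc) + 1.0), 0.0)

def g2(distances, eta=0.5, r_s=0.0, rc=6.0):
    # Radial G2 symmetry function: a Gaussian of each neighbour distance,
    # damped by the cutoff and summed over neighbours. By construction it
    # is invariant under rotations, translations and neighbour permutations.
    d = np.asarray(distances, dtype=float)
    return float(np.sum(np.exp(-eta * (d - r_s) ** 2) * cutoff(d, rc)))

# Example: one atom with three neighbours at the given distances (in angstrom).
print(g2([2.1, 2.3, 3.8]))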
This work presents Neural Equivariant Interatomic Potentials (NequIP), an SE(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and act only on scalars, NequIP employs SE(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency. NequIP outperforms existing models with up to three orders of magnitude fewer training data, challenging the widely held belief that deep neural networks require massive training sets. The high data efficiency of the method allows for the construction of accurate potentials using a high-order quantum chemical level of theory as reference and enables high-fidelity molecular dynamics simulations over long time scales.
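The equivariance property that NequIP is built around (force predictions rotating together with the input coordinates) can be illustrated with a deliberately simple pairwise model. The sketch below only demonstrates the SE(3)-equivariance property itself; it does not reproduce NequIP's tensor-valued convolutions or use its actual API.

import numpy as np

def toy_forces(positions):
    # Toy equivariant model: each force is a sum of pairwise contributions
    # along the bond direction, with an arbitrary radial law. Any model of
    # this form rotates its outputs together with its inputs.
    forces = np.zeros_like(positions)
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i != j:
                rij = positions[j] - positions[i]
                r = np.linalg.norm(rij)
                forces[i] += (rij / r) / r**7
    return forces

rng = np.random.default_rng(0)
pos = rng.normal(size=(4, 3))
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random orthogonal matrix
# Equivariance check: F(Q x) == Q F(x) up to floating-point error.
print(np.allclose(toy_forces(pos @ Q.T), toy_forces(pos) @ Q.T))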
Machine learning of atomic-scale properties is revolutionizing molecular modelling, making it possible to evaluate interatomic potentials with first-principles accuracy at a fraction of the cost. The accuracy, speed and reliability of machine-learning potentials, however, depend strongly on the way atomic configurations are represented, i.e. the choice of descriptors used as input for the machine learning method. The raw Cartesian coordinates are typically transformed into fingerprints, or symmetry functions, that are designed to encode, in addition to the structure, important properties of the potential-energy surface such as its invariances with respect to rotation, translation and permutation of like atoms. Here we discuss automatic protocols to select a number of fingerprints out of a large pool of candidates, based on the correlations that are intrinsic to the training data. This procedure can greatly simplify the construction of neural network potentials that strike the best balance between accuracy and computational efficiency, and has the potential to accelerate by orders of magnitude the evaluation of Gaussian Approximation Potentials based on the Smooth Overlap of Atomic Positions kernel. We present applications to the construction of neural network potentials for water and for an Al-Mg-Si alloy, and to the prediction of the formation energies of small organic molecules using Gaussian process regression.
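One simple way to realise the correlation-based screening described above is a greedy filter that keeps a candidate fingerprint only if it is not strongly correlated with the fingerprints already selected. This is a hypothetical illustration of the idea, not the specific protocol used in the paper.

import numpy as np

def select_fingerprints(X, max_abs_corr=0.9):
    # X has one row per training structure and one column per candidate
    # fingerprint. Keep a column only if its absolute Pearson correlation
    # with every previously kept column stays below the threshold.
    corr = np.corrcoef(X, rowvar=False)
    kept = []
    for j in range(X.shape[1]):
        if all(abs(corr[j, k]) < max_abs_corr for k in kept):
            kept.append(j)
    return kept

# Example: 10 informative fingerprints plus 5 near-duplicates that should be dropped.
rng = np.random.default_rng(1)
base = rng.normal(size=(200, 10))
near_dupes = 2.0 * base[:, :5] + 0.01 * rng.normal(size=(200, 5))
print(select_fingerprints(np.hstack([base, near_dupes])))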
We present a permutation-invariant distance between atomic configurations, defined through a functional representation of atomic positions. This distance makes it possible to directly compare different atomic environments with an arbitrary number of particles, without going through a space of reduced dimensionality (i.e. fingerprints) as an intermediate step. Moreover, this distance is naturally invariant under permutations of atoms, avoiding the time-consuming minimization required by other common criteria (such as the root-mean-square distance). Finally, invariance under global rotations is accounted for by a minimization procedure in the space of rotations, solved by Monte Carlo simulated annealing. A formal framework is also introduced, showing that the distance we propose satisfies the properties of a metric on the space of atomic configurations. Two example applications are proposed. The first consists in evaluating the faithfulness of some fingerprints (or descriptors), i.e. their capacity to represent the structural information of a configuration. The second application concerns structural analysis, where our distance proves to be efficient in discriminating different local structures and even classifying their degree of similarity.
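A minimal sketch of the functional-representation idea, under our own choice of representation: each configuration is written as a sum of Gaussians centred on the atomic positions, and the L2 distance between the two resulting fields has a closed form that never references atom ordering, so it is permutation-invariant by construction. The minimization over global rotations (done by simulated annealing in the paper) is omitted here, and the width sigma is an arbitrary choice.

import numpy as np

def overlap(A, B, sigma=0.5):
    # Closed-form L2 inner product of two sums of unit-height Gaussians
    # centred on the rows of A and B (arrays of 3D positions).
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return (np.pi * sigma**2) ** 1.5 * np.exp(-d2 / (4.0 * sigma**2)).sum()

def density_distance(A, B, sigma=0.5):
    # ||rho_A - rho_B||_2; the max() guards against tiny negative values
    # caused by floating-point round-off.
    sq = overlap(A, A, sigma) + overlap(B, B, sigma) - 2.0 * overlap(A, B, sigma)
    return np.sqrt(max(sq, 0.0))

rng = np.random.default_rng(2)
conf = rng.normal(size=(6, 3))
shuffled = conf[rng.permutation(6)]
print(density_distance(conf, shuffled))   # ~0: atom ordering does not matter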
Parameterization of interatomic forcefields is a necessary first step in performing molecular dynamics simulations. This is a non-trivial global optimization problem involving the quantification of multiple empirical variables against one or more properties. We present EZFF, a lightweight Python library for the parameterization of several types of interatomic forcefields, implemented in several molecular dynamics engines, against multiple objectives using genetic-algorithm-based global optimization methods. The EZFF scheme provides unique functionality such as the parameterization of hybrid forcefields composed of multiple forcefield interactions, as well as built-in quantification of uncertainty in forcefield parameters, and can easily be extended to other forcefield functional forms as well as MD engines.
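To make this kind of global optimization concrete, the sketch below fits a two-parameter Lennard-Jones form to synthetic reference energies with a bare-bones genetic algorithm (selection, uniform crossover, mutation). It only illustrates the class of method; it does not use EZFF's actual interface, objectives, or forcefield formats.

import numpy as np

def lj_energy(r, eps, sigma):
    # Lennard-Jones pair energy.
    return 4.0 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

# Synthetic "reference" data standing in for ab initio energies.
r_ref = np.linspace(0.9, 2.5, 40)
e_ref = lj_energy(r_ref, eps=0.25, sigma=1.1)

def fitness(params):
    eps, sigma = params
    return -np.mean((lj_energy(r_ref, eps, sigma) - e_ref) ** 2)

rng = np.random.default_rng(3)
pop = rng.uniform([0.01, 0.8], [1.0, 1.5], size=(50, 2))        # initial population
for _ in range(100):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]                      # keep the 10 fittest
    a = parents[rng.integers(0, 10, size=40)]
    b = parents[rng.integers(0, 10, size=40)]
    children = np.where(rng.random((40, 2)) < 0.5, a, b)         # uniform crossover
    children += rng.normal(scale=0.02, size=children.shape)     # mutation
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(p) for p in pop])]
print(best)   # should approach the reference parameters (0.25, 1.1)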