
MAISE: Construction of neural network interatomic models and evolutionary structure optimization

Published by Samad Hajinazar
Publication date: 2020
Research field: Physics
Paper language: English




Module for ab initio structure evolution (MAISE) is an open-source package for materials modeling and prediction. The code's main feature is the automated generation of neural network (NN) interatomic potentials for use in global structure searches. The systematic construction of Behler-Parrinello-type NN models approximating ab initio energies and forces relies on two approaches introduced in our recent studies. An evolutionary sampling scheme for generating reference structures improves the NN's mapping of regions visited in unconstrained searches, while a stratified training approach enables the creation of standardized NN models for multiple elements. A more flexible NN architecture proposed here expands the applicability of the stratified scheme to an arbitrary number of elements. The full workflow of the NN development is managed with a customizable MAISE-NET wrapper written in Python. The global structure optimization capability in MAISE is based on an evolutionary algorithm applicable to nanoparticles, films, and bulk crystals. A multitribe extension of the algorithm allows for an efficient simultaneous optimization of nanoparticles in a given size range. Implemented structure analysis functions include fingerprinting with radial distribution functions and finding space groups with the SPGLIB tool. This work overviews MAISE's available features, constructed models, and confirmed predictions.
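As a rough illustration of the Behler-Parrinello decomposition such models rely on, the sketch below (plain Python/NumPy, with descriptor and network choices that are placeholders and not MAISE's actual implementation or interface) sums per-atom energies predicted from simple radial symmetry-function descriptors of each atom's local environment.

# Minimal, generic sketch of a Behler-Parrinello-type energy model:
# the total energy is a sum of per-atom contributions, each predicted by a
# small feed-forward NN from symmetry-function descriptors of the local
# environment. All names and sizes are illustrative, not MAISE's API.
import numpy as np

def radial_symmetry_functions(positions, i, etas, r_cut=6.0):
    """G2-type radial descriptors for atom i (one value per eta)."""
    rij = np.linalg.norm(positions - positions[i], axis=1)
    rij = rij[(rij > 1e-8) & (rij < r_cut)]               # neighbors inside cutoff
    fc = 0.5 * (np.cos(np.pi * rij / r_cut) + 1.0)        # smooth cutoff function
    return np.array([np.sum(np.exp(-eta * rij ** 2) * fc) for eta in etas])

def atomic_energy(descriptors, weights):
    """Tiny feed-forward network mapping descriptors to one atomic energy."""
    (w1, b1), (w2, b2) = weights
    hidden = np.tanh(descriptors @ w1 + b1)
    return float(hidden @ w2 + b2)

def total_energy(positions, species, params, etas):
    """E_total = sum_i E_i, with one set of NN weights per chemical species."""
    return sum(
        atomic_energy(radial_symmetry_functions(positions, i, etas), params[s])
        for i, s in enumerate(species)
    )

In a real model, angular symmetry functions, species-resolved descriptors, and trained weights would replace the placeholders above; the point of the sketch is only the sum-over-atoms structure that makes such potentials usable in large-scale structure searches.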



Read also

Recent application of neural networks (NNs) to modeling interatomic interactions has shown the learning machines' encouragingly accurate performance for select elemental and multicomponent systems. In this study, we explore the possibility of building a library of NN-based models by introducing hierarchical NN training. In such a stratified procedure, NNs for multicomponent systems are obtained by sequential training from the bottom up: first unaries, then binaries, and so on. Advantages of constructing NN sets with shared parameters include acceleration of the training process and an intact description of the constituent systems. We use an automated generation of diverse structure sets for NN training on density functional theory-level reference energies. In the test case of Cu, Pd, Ag, Cu-Pd, Cu-Ag, Pd-Ag, and Cu-Pd-Ag systems, NNs trained in the traditional and stratified fashions are found to have essentially identical accuracy for defect energies, phonon dispersions, formation energies, etc. The models' robustness is further illustrated via unconstrained evolutionary structure searches in which the NN is used for the local optimization of crystal unit cells.
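A minimal sketch of the bottom-up training order described in the abstract above, assuming a user-supplied fit routine that can hold the parameters of already-trained lower-order (e.g. elemental) networks fixed; the function and argument names are illustrative, not the paper's or MAISE's API.

# Hedged sketch of stratified (bottom-up) NN training: elemental models are
# fitted first and their parameters are reused, kept frozen, when the
# higher-component models are trained, so the constituent systems keep their
# original description.
def train_stratified(datasets, fit):
    """datasets: dict mapping composition tuples, e.g. ('Cu',), ('Cu', 'Pd'),
    ('Cu', 'Pd', 'Ag'), to reference DFT data; fit(data, frozen) returns the
    newly fitted parameters while leaving the `frozen` parameters unchanged."""
    params = {}
    for comp in sorted(datasets, key=len):                 # unaries, then binaries, ...
        frozen = {c: params[c] for c in params
                  if set(c) < set(comp)}                   # all lower-order models
        params[comp] = fit(datasets[comp], frozen)
    return params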
We investigate the use of invariant polynomials in the construction of data-driven interatomic potentials for material systems. The atomic body-ordered permutation-invariant polynomials (aPIPs) comprise a systematic basis and are constructed to preserve the symmetry of the potential energy function with respect to rotations and permutations. In contrast to kernel-based and artificial neural network models, the explicit decomposition of the total energy as a sum of atomic body-ordered terms makes it possible to keep the dimensionality of the fit reasonably low, up to just 10 for the 5-body terms. The explainability of the potential is aided by this decomposition, as the low body-order components can be studied and interpreted independently. Moreover, although polynomial basis functions are thought to extrapolate poorly, we show that the low dimensionality combined with careful regularisation actually leads to better transferability than the high-dimensional, kernel-based Gaussian Approximation Potential.
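Schematically, and in generic notation rather than the paper's exact formulation, the body-ordered decomposition referred to above can be written as

    E(\{\mathbf{r}_i\}) \approx \sum_i V^{(1)}_{z_i} + \sum_{i<j} V^{(2)}(\mathbf{r}_{ij}) + \sum_{i<j<k} V^{(3)}(\mathbf{r}_{ij}, \mathbf{r}_{ik}) + \dots

where each n-body term V^{(n)} is expanded in polynomials invariant under rotations and under permutations of like atoms, so each term is fitted in only a small number of variables (e.g. up to 10 for the 5-body terms quoted above).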
This work presents Neural Equivariant Interatomic Potentials (NequIP), an SE(3)-equivariant neural network approach for learning interatomic potentials from ab initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and act only on scalars, NequIP employs SE(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency. NequIP outperforms existing models with up to three orders of magnitude fewer training data, challenging the widely held belief that deep neural networks require massive training sets. The high data efficiency of the method allows for the construction of accurate potentials using a high-order quantum chemical level of theory as reference and enables high-fidelity molecular dynamics simulations over long time scales.
Prediction of material properties from first principles is often a computationally expensive task. Recently, artificial neural networks and other machine learning approaches have been successfully employed to obtain accurate models at a low computational cost by leveraging existing example data. Here, we present a software package, Properties from Artificial Neural Network Architectures (PANNA), that provides a comprehensive toolkit for creating neural network models for atomistic systems. Besides the core routines for neural network training, it includes a data parser, a descriptor builder, and a force-field generator suitable for integration within molecular dynamics packages. PANNA offers a variety of activation and cost functions, regularization methods, as well as the possibility of using fully connected networks with a custom size for each atomic species. PANNA benefits from the optimization and hardware flexibility of the underlying TensorFlow engine, which allows it to be used on multiple CPU/GPU/TPU systems, making it possible to develop and optimize neural network models based on large datasets.
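As a hedged illustration of the per-species network idea mentioned above, the following generic TensorFlow/Keras sketch builds one small fully connected network per chemical species, with the total energy understood as the sum of per-atom outputs; the layer sizes, species list, and descriptor length are placeholders, and this is not PANNA's actual interface.

# Generic TensorFlow/Keras sketch of per-species fully connected networks
# mapping fixed-length atomic descriptors to atomic energies; all sizes and
# names are illustrative only.
import tensorflow as tf

def make_species_network(n_descriptors, hidden=(32, 32), activation="tanh"):
    """One small network per chemical species; the total energy is the sum
    of the per-atom outputs from each atom's species network."""
    layers = [tf.keras.Input(shape=(n_descriptors,))]
    layers += [tf.keras.layers.Dense(n, activation=activation) for n in hidden]
    layers += [tf.keras.layers.Dense(1)]                  # one atomic-energy output
    return tf.keras.Sequential(layers)

# Illustrative species set; the descriptor length is a placeholder.
networks = {species: make_species_network(n_descriptors=64)
            for species in ("Cu", "Pd", "Ag")}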
StructOpt, an open-source structure optimization suite, applies genetic algorithm and particle swarm methods to obtain atomic structures that minimize an objective function. The objective function typically consists of the energy and the error between simulated and experimental data, and is used to determine structures that minimize the energy to the extent possible while also being fully consistent with available experimental data. We present example use cases, including the structure of a metastable Pt nanoparticle determined from energetic and scanning transmission electron microscopy data, and the structure of an amorphous-nanocrystal composite determined from energetic and fluctuation electron microscopy data. StructOpt is modular in its construction and is therefore naturally extensible to include new materials simulation modules or new optimization methods, either written by the user or existing in other code packages. It uses the Message Passing Interface (MPI) dynamic process management functionality to allocate resources to computationally expensive codes on the fly, enabling StructOpt to take full advantage of the parallelization tools available in many scientific packages.
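A minimal sketch of the kind of combined objective described above, assuming placeholder callables for the energy and for simulating each experimental signal; the weighting scheme and error metric are illustrative, not StructOpt's actual settings.

# Hedged sketch: the score a genetic algorithm or particle swarm would
# minimize is the energy plus weighted error terms against experimental
# signals (e.g. STEM or FEM data). All names here are placeholders.
import numpy as np

def objective(structure, energy_of, experiments):
    """experiments: list of (weight, simulate, measured) triples, where
    simulate(structure) returns the simulated signal to compare with
    `measured` (both array-like)."""
    score = energy_of(structure)
    for weight, simulate, measured in experiments:
        residual = np.asarray(simulate(structure)) - np.asarray(measured)
        score += weight * float(np.mean(residual ** 2))   # chi-squared-like penalty
    return score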