
3D Deep Learning with voxelized atomic configurations for modeling atomistic potentials in complex solid-solution alloys

Published by Aditya Balu
Publication date: 2018
Research language: English





The need for advanced materials has led to the development of complex, multi-component alloys, or solid-solution alloys. These materials have shown exceptional strength, toughness, ductility, and electrical and electronic properties. The development of such material systems is currently hindered by expensive experiments and computationally demanding first-principles simulations. Atomistic simulations can provide reasonable insights into the properties of such material systems; however, the issue of designing robust potentials remains. In this paper, we explore a deep convolutional neural-network based approach to developing atomistic potentials for such complex alloys, with the aim of investigating materials for insights into controlling their properties. We propose a voxel representation of the atomic configuration of a cell and design a 3D convolutional neural network to learn the interactions of the atoms. Our results highlight the performance of the 3D convolutional neural network and its efficacy in machine-learning the atomistic potential. We also explore the role of voxel resolution and provide insights into the two bounding-box methodologies implemented for voxelization.
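To make the pipeline concrete, here is a minimal sketch (not the authors' code) of the two ingredients described above: voxelizing an atomic configuration into a per-species occupancy grid, and a small 3D convolutional network that regresses a scalar energy from that grid. The cell size, species list, grid resolution and network widths are illustrative assumptions.

```python
# Sketch: voxelized atomic configuration + 3D CNN energy regressor (assumed settings).
import numpy as np
import torch
import torch.nn as nn

def voxelize(positions, species, cell_length, species_list, resolution=32):
    """Map atoms in a cubic cell onto a (n_species, R, R, R) occupancy grid."""
    grid = np.zeros((len(species_list), resolution, resolution, resolution), dtype=np.float32)
    for pos, sp in zip(positions, species):
        idx = np.floor(np.asarray(pos) / cell_length * resolution).astype(int) % resolution
        grid[species_list.index(sp), idx[0], idx[1], idx[2]] += 1.0
    return torch.from_numpy(grid)

class Energy3DCNN(nn.Module):
    def __init__(self, n_species, resolution=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(n_species, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
        )
        feat = 32 * (resolution // 4) ** 3
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(feat, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, x):  # x: (batch, n_species, R, R, R)
        return self.head(self.features(x))

# Toy usage: 4 atoms of two species in a 10 Angstrom cubic cell.
species_list = ["Fe", "Ni"]
pos = np.random.rand(4, 3) * 10.0
spc = ["Fe", "Fe", "Ni", "Ni"]
x = voxelize(pos, spc, 10.0, species_list).unsqueeze(0)  # add batch dimension
energy = Energy3DCNN(len(species_list))(x)
print(energy.shape)  # torch.Size([1, 1])
```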




Read also

We propose a novel active learning scheme for automatically sampling a minimum number of uncorrelated configurations for fitting the Gaussian Approximation Potential (GAP). Our active learning scheme consists of an unsupervised machine learning (ML) scheme coupled to a Bayesian optimization technique that evaluates the GAP model. We apply this scheme to a hafnium dioxide (HfO2) dataset generated from a melt-quench ab initio molecular dynamics (AIMD) protocol. Our results show that the active learning scheme, with no prior knowledge of the dataset, is able to extract a configuration that reaches the required energy-fit tolerance. Further, molecular dynamics (MD) simulations performed using this actively learned GAP model on 6144-atom systems of the amorphous and liquid states elucidate the structural properties of HfO2 with near ab initio precision, at quench rates (i.e. 1.0 K/ps) not accessible via AIMD. The melt and amorphous X-ray structure factors generated from our simulations are in good agreement with experiment. Additionally, the calculated diffusion constants are in good agreement with previous ab initio studies.
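The sampling loop described above can be illustrated with a minimal sketch, assuming per-configuration descriptors, k-means clustering as the unsupervised step, and a scikit-learn Gaussian-process regressor standing in for the GAP fit; the toy data, cluster count and tolerance are assumptions, not values from the paper.

```python
# Sketch: active selection of configurations until an energy-fit tolerance is met.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                                    # per-configuration descriptors (toy)
y = X @ rng.normal(size=8) + 0.05 * rng.normal(size=500)         # reference energies (toy)

tol = 0.05                                                       # assumed energy-fit tolerance
kmeans = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X)

selected = []                                                    # indices of training configurations
for cluster in range(kmeans.n_clusters):
    # add the configuration closest to this cluster's centre, then refit the surrogate
    members = np.where(kmeans.labels_ == cluster)[0]
    dist = np.linalg.norm(X[members] - kmeans.cluster_centers_[cluster], axis=1)
    selected.append(members[np.argmin(dist)])
    model = GaussianProcessRegressor().fit(X[selected], y[selected])
    rmse = np.sqrt(np.mean((model.predict(X) - y) ** 2))
    print(f"{len(selected):3d} configs selected, RMSE = {rmse:.3f}")
    if rmse < tol:                                               # stop once the fit is good enough
        break
```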
F. Rosch, 2007
Molecular dynamics simulations of crack propagation are performed for two extreme cases of complex metallic alloys (CMAs): in a model quasicrystal the structure is determined by clusters of atoms, whereas the model C15 Laves phase is a simple periodic stacking of a unit cell. The simulations reveal that the basic building units of the structures also govern their fracture behaviour. Atoms in the Laves phase play a comparable role to the clusters in the quasicrystal. Although the latter are not rigid units, they have to be regarded as significant physical entities.
We present a novel deep learning (DL) approach to produce highly accurate predictions of macroscopic physical properties of solid-solution binary alloys and magnetic systems. The major idea is to make use of the correlations between different physical properties in alloy systems to improve the prediction accuracy of neural network (NN) models. We use multitasking NN models to simultaneously predict the total energy, charge density and magnetic moment. These physical properties mutually serve as constraints during the training of the multitasking NN, resulting in more reliable DL models because multiple physical properties are correctly learned by a single model. Two binary alloys, copper-gold (CuAu) and iron-platinum (FePt), were studied. Our results show that once the multitasking NNs are trained, they can estimate the material properties for a specific configuration hundreds of times faster than first-principles density functional theory calculations while retaining comparable accuracy. We used a simple measure based on the root-mean-squared error (RMSE) to quantify the quality of the NN models, and found that the inclusion of charge density and magnetic moment as physical constraints leads to more stable models that exhibit improved accuracy and reduced uncertainty in the energy predictions.
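A minimal sketch of the multitasking idea, assuming fixed-length configuration descriptors as input and omitting the charge-density head for brevity; the layer widths, equal loss weighting and toy data are illustrative assumptions, not the authors' model.

```python
# Sketch: shared trunk with separate heads so that energy and magnetic moment
# constrain each other through a joint loss.
import torch
import torch.nn as nn

class MultitaskAlloyNet(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                   nn.Linear(64, 64), nn.ReLU())
        self.energy_head = nn.Linear(64, 1)
        self.moment_head = nn.Linear(64, 1)

    def forward(self, x):
        h = self.trunk(x)
        return self.energy_head(h), self.moment_head(h)

model = MultitaskAlloyNet(n_features=16)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()

x = torch.randn(32, 16)                                    # toy configuration descriptors
e_ref, m_ref = torch.randn(32, 1), torch.randn(32, 1)      # toy DFT reference values

for step in range(100):
    e_pred, m_pred = model(x)
    loss = mse(e_pred, e_ref) + mse(m_pred, m_ref)         # joint multi-property loss
    opt.zero_grad(); loss.backward(); opt.step()
```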
Machine learning of atomic-scale properties is revolutionizing molecular modelling, making it possible to evaluate inter-atomic potentials with first-principles accuracy at a fraction of the cost. The accuracy, speed and reliability of machine-learning potentials, however, depend strongly on the way atomic configurations are represented, i.e. the choice of descriptors used as input to the machine learning method. The raw Cartesian coordinates are typically transformed into fingerprints, or symmetry functions, that are designed to encode, in addition to the structure, important properties of the potential-energy surface such as its invariance with respect to rotation, translation and permutation of like atoms. Here we discuss automatic protocols to select a number of fingerprints out of a large pool of candidates, based on the correlations that are intrinsic to the training data. This procedure can greatly simplify the construction of neural network potentials that strike the best balance between accuracy and computational efficiency, and has the potential to accelerate by orders of magnitude the evaluation of Gaussian Approximation Potentials based on the Smooth Overlap of Atomic Positions kernel. We present applications to the construction of neural network potentials for water and for an Al-Mg-Si alloy, and to the prediction of the formation energies of small organic molecules using Gaussian process regression.
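One simple way to illustrate correlation-driven fingerprint selection (a sketch, not the protocol from the paper) is to greedily keep only fingerprints whose absolute Pearson correlation with every already-kept fingerprint stays below a threshold; the threshold and the random candidate pool below are assumptions.

```python
# Sketch: prune redundant fingerprints using pairwise correlations in the training data.
import numpy as np

def select_uncorrelated(features, threshold=0.9):
    """features: (n_structures, n_fingerprints). Returns indices of retained columns."""
    corr = np.abs(np.corrcoef(features, rowvar=False))
    kept = []
    for j in range(features.shape[1]):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
pool = rng.normal(size=(200, 50))                            # 50 candidate symmetry functions (toy)
pool[:, 10] = pool[:, 0] + 1e-3 * rng.normal(size=200)       # a nearly duplicated fingerprint
print(select_uncorrelated(pool))                             # column 10 is dropped, the rest survive
```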
Machine learning (ML) methods are becoming integral to scientific inquiry in numerous disciplines, such as materials science. In this manuscript, we demonstrate how ML can be used to predict several properties in solid-state chemistry, in particular the heat of formation of a given complex crystallographic phase (here the $\sigma$-phase, $tP30$, $D8_{b}$). Based on an independent and unprecedentedly large first-principles dataset containing about 10,000 $\sigma$-compounds with $n=14$ different elements, we used a supervised learning approach to predict all of the $\sim$500,000 possible configurations, within a mean absolute error of 23 meV/at ($\sim$2 kJ.mol$^{-1}$) on the heat of formation and $\sim$0.06 Ang. on the tetragonal cell parameters. We showed that neural network regression algorithms provide a significant improvement in the accuracy of the predicted output compared to traditional regression techniques. Adding descriptors of a physical nature (atomic radius, number of valence electrons) improves the learning precision. Based on our analysis, the training database, composed only of binary compositions, plays a major role in predicting the configurations of higher-order systems. Our result opens a broad avenue to efficient high-throughput investigations, in which combinatorial binary calculations enable the prediction of a complex multicomponent phase.
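As a rough illustration of the workflow (not the authors' pipeline), the sketch below trains a neural-network regressor on composition vectors augmented with simple physical descriptors such as a mean atomic radius and a mean valence-electron count; all descriptor values and targets are synthetic toy assumptions.

```python
# Sketch: NN regression of a heat of formation from composition + physical descriptors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_samples, n_elements = 2000, 14
composition = rng.dirichlet(np.ones(n_elements), size=n_samples)   # fractional site occupancies
radius = rng.uniform(1.2, 1.6, size=n_elements)                    # toy atomic radii (Ang.)
valence = rng.integers(4, 11, size=n_elements)                     # toy valence-electron counts

X = np.hstack([composition,
               (composition * radius).sum(axis=1, keepdims=True),   # composition-weighted radius
               (composition * valence).sum(axis=1, keepdims=True)]) # composition-weighted valence
y = X @ rng.normal(size=X.shape[1]) + 0.01 * rng.normal(size=n_samples)  # toy heat of formation

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print(f"MAE = {mean_absolute_error(y_te, model.predict(X_te)):.3f} (toy units)")
```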

