
Materials development by interpretable machine learning

Added by Yuma Iwasaki
Publication date: 2019
Field: Physics
Language: English





Machine learning technologies are expected to be great tools for scientific discovery. In particular, materials development, which has driven much innovation through the discovery of new and better functional materials, is one of the most attractive scientific fields. To apply machine learning to actual materials development, collaboration between scientists and machine learning is becoming inevitable. However, such collaboration has so far been restricted by black-box machine learning, whose data-driven models are difficult for scientists to interpret from the viewpoint of materials science and physics. Here, we show a materials development success story achieved through close collaboration between scientists and one type of interpretable (explainable) machine learning called factorized asymptotic Bayesian inference hierarchical mixture of experts (FAB/HMEs). Drawing on materials science and physics, we interpreted the data-driven model constructed by the FAB/HMEs and thereby discovered a surprising correlation and new knowledge about thermoelectric materials. Guided by this, we carried out actual material synthesis that led to the identification of a novel spin-driven thermoelectric material with the largest thermopower to date.
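FAB/HMEs itself is a proprietary algorithm, but the general structure the abstract describes (a gating function that partitions descriptor space, with a sparse linear "expert" in each region that scientists can read off as weighted descriptors) can be sketched with standard tools. The toy below is only an illustration of that structure, not the paper's method; the data, descriptors, and model choices are all invented.

```python
# Minimal stand-in for the idea behind an interpretable mixture of experts:
# a shallow decision tree acts as the gating function, and a sparse linear
# model (Lasso) is fit in each region, so every "expert" stays readable as a
# short list of weighted descriptors.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))              # three invented material descriptors
# Piecewise target: regime is decided by descriptor 0, behavior by 1 or 2.
y = np.where(X[:, 0] > 0, 3.0 + 2.0 * X[:, 1], -3.0 - 1.5 * X[:, 2])

gate = DecisionTreeRegressor(max_depth=1).fit(X, y)  # partitions the space
leaf = gate.apply(X)
experts = {}
for leaf_id in np.unique(leaf):
    mask = leaf == leaf_id
    experts[leaf_id] = Lasso(alpha=0.01).fit(X[mask], y[mask])

for leaf_id, model in experts.items():
    # Sparse, human-readable coefficients for each regime.
    print("expert", leaf_id, np.round(model.coef_, 2))
```

Reading the per-expert coefficients (here, one regime driven by descriptor 1, the other by descriptor 2) is the kind of interpretation step the abstract credits with revealing the correlation that guided synthesis.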





Dane Morgan, Ryan Jacobs (2020)
Advances in machine learning have impacted myriad areas of materials science, ranging from the discovery of novel materials to the improvement of molecular simulations, with likely many more important developments to come. Given the rapid changes in this field, it is challenging to understand both the breadth of opportunities as well as best practices for their use. In this review, we address aspects of both problems by providing an overview of the areas where machine learning has recently had significant impact in materials science, and then provide a more detailed discussion on determining the accuracy and domain of applicability of some common types of machine learning models. Finally, we discuss some opportunities and challenges for the materials community to fully utilize the capabilities of machine learning.
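One concrete practice related to the review's theme of "accuracy and domain of applicability" is checking whether a model's cross-validated score survives outside the chemical families it was trained on. The sketch below uses synthetic data and an invented grouping; it is an illustration of the idea, not a procedure taken from the review.

```python
# Random k-fold CV can overstate accuracy when a model must extrapolate;
# comparing it against a grouped split is one simple probe of the domain of
# applicability.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score, GroupKFold, KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
groups = (X[:, 0] > 0).astype(int)   # pretend: two distinct chemical families
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=300)

model = RandomForestRegressor(n_estimators=100, random_state=0)
random_cv = cross_val_score(model, X, y, cv=KFold(5, shuffle=True, random_state=0))
grouped_cv = cross_val_score(model, X, y, cv=GroupKFold(2), groups=groups)
# The grouped score drops sharply: the model never saw the held-out family.
print("random k-fold R^2:", random_cv.mean())
print("leave-family-out R^2:", grouped_cv.mean())
```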
Lattice constants such as unit cell edge lengths and plane angles are important parameters of the periodic structures of crystal materials. Predicting crystal lattice constants has wide applications in crystal structure prediction and materials property prediction. Previous work has used machine learning models such as neural networks and support vector machines combined with composition features for lattice constant prediction and has achieved a maximum performance for cubic structures with an average $R^2$ of 0.82. Other models tailored for a specific materials family of fixed form, such as ABX3 perovskites, can achieve much higher performance due to the homogeneity of the structures. However, these models trained with small datasets are usually not applicable to generic lattice parameter prediction of materials with diverse compositions. Herein, we report MLatticeABC, a random forest machine learning model with a new descriptor set for lattice unit cell edge length ($a,b,c$) prediction which achieves an $R^2$ score of 0.979 for lattice parameter $a$ of cubic crystals and significant performance improvement for other crystal systems as well. Source code and trained models can be freely accessed at https://github.com/usccolumbia/MLatticeABC
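The pipeline MLatticeABC describes (composition-derived features into a random forest regressor, scored by $R^2$ on a held-out set) can be sketched as follows. The data and features here are synthetic stand-ins; the paper's actual descriptor set and trained models are at the linked repository.

```python
# Hedged sketch of a composition-features -> random forest -> lattice edge
# length regression, on invented data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 8))     # stand-in composition descriptors
# Invented relation between descriptors and cell edge a (in angstroms).
a = 3.0 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=500)

X_tr, X_te, a_tr, a_te = train_test_split(X, a, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, a_tr)
print("held-out R^2:", round(r2_score(a_te, rf.predict(X_te)), 3))
```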
Machine learning approaches, enabled by the emergence of comprehensive databases of materials properties, are becoming a fruitful direction for materials analysis. As a result, a plethora of models have been constructed and trained on existing data to predict properties of new systems. These powerful methods allow researchers to target studies only at interesting materials (neglecting the non-synthesizable systems and those without the desired properties), thus reducing the amount of resources spent on expensive computations and/or time-consuming experimental synthesis. However, using these predictive models is not always straightforward. Often, they require a panoply of technical expertise, creating barriers for general users. AFLOW-ML (AFLOW Machine Learning) overcomes the problem by streamlining the use of the machine learning methods developed within the AFLOW consortium. The framework provides an open RESTful API to directly access the continuously updated algorithms, which can be transparently integrated into any workflow to retrieve predictions of electronic, thermal and mechanical properties. These types of interconnected cloud-based applications are envisioned to be capable of further accelerating the adoption of machine learning methods into materials development.
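Integrating a RESTful prediction service like the one AFLOW-ML provides typically amounts to POSTing a structure file and parsing a JSON response. The endpoint URL and payload field below are placeholders, not the documented AFLOW-ML routes; consult the AFLOW documentation for the real interface.

```python
# Sketch of packaging a structure file as a JSON POST request to a
# prediction API. URL and schema are hypothetical.
import json
import urllib.request

def build_request(poscar_text, url="https://example.org/aflow-ml/prediction"):
    """Wrap a structure file in a JSON POST request (hypothetical schema)."""
    payload = json.dumps({"file": poscar_text}).encode()
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_request("POSCAR contents here")
print(req.get_method())  # a Request with a data body defaults to POST
# urllib.request.urlopen(req) would then return the prediction as JSON.
```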
Hitarth Choubisa (2021)
Machine learning models of materials$^{1-5}$ accelerate discovery compared to ab initio methods: deep learning models now reproduce density functional theory (DFT)-calculated results at one hundred-thousandth of the cost of DFT$^{6}$. To provide guidance in experimental materials synthesis, these need to be coupled with an accurate yet effective search algorithm and training data consistent with experimental observations. Here we report an evolutionary algorithm powered search which uses machine-learned surrogate models trained on high-throughput hybrid functional DFT data benchmarked against experimental bandgaps: Deep Adaptive Regressive Weighted Intelligent Network (DARWIN). The strategy enables efficient search over the materials space of ~10$^8$ ternaries and 10$^{11}$ quaternaries$^{7}$ for candidates with target properties. It provides interpretable design rules, such as our finding that the difference in electronegativity between the halide and the B-site cation is a strong predictor of ternary structural stability. As an example, when we seek UV emission, DARWIN predicts K$_2$CuX$_3$ (X = Cl, Br) as a promising materials family, based on its electronegativity difference. We synthesized and found these materials to be stable, direct bandgap UV emitters. The approach also allows knowledge distillation for use by humans.
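The design rule DARWIN surfaces, ranking ternary candidates by the electronegativity difference between the halide and the B-site cation, is simple enough to sketch directly. The values below are standard Pauling electronegativities; the ranking itself is only an illustration of the reported rule, not the paper's code, and the candidate list is invented.

```python
# Rank hypothetical A2BX3 candidates by the halide/B-site Pauling
# electronegativity gap, the stability predictor reported above.
PAULING = {"Cu": 1.90, "Ag": 1.93, "Cl": 3.16, "Br": 2.96, "I": 2.66}

def halide_b_site_gap(b_site, halide):
    """Electronegativity difference between halide X and B-site cation."""
    return PAULING[halide] - PAULING[b_site]

candidates = [("Cu", "Cl"), ("Cu", "Br"), ("Ag", "I")]
ranked = sorted(candidates, key=lambda c: halide_b_site_gap(*c), reverse=True)
print(ranked)  # larger gap -> predicted more stable under the reported rule
```

Under this rule the K$_2$CuCl$_3$ composition (gap of about 1.26) ranks above its bromide analogue, consistent with the family the abstract highlights.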
We use a machine learning approach to identify the importance of microstructure characteristics in causing magnetization reversal in ideally structured large-grained Nd$_2$Fe$_{14}$B permanent magnets. The embedded Stoner-Wohlfarth method is used as a reduced order model for determining local switching field maps which guide the data-driven learning procedure. The predictor model is a random forest classifier which we validate by comparing with full micromagnetic simulations in the case of small granular test structures. In the course of the machine learning microstructure analysis, the most important features explaining magnetization reversal were found to be the misorientation and the position of the grain within the magnet. The lowest switching fields occur near the top and bottom edges of the magnet. While the dependence of the local switching field on the grain orientation is known from theory, the influence of the position of the grain on the local coercive field strength is less obvious. As a direct result of the machine learning analysis, we show that edge hardening via Dy diffusion leads to higher coercive fields.
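The workflow of fitting a random forest classifier to grain features and then reading off which features drive switching can be sketched as below. The data are synthetic, not the paper's Stoner-Wohlfarth dataset, and the feature set and label rule are invented for illustration.

```python
# Illustrative random-forest feature-importance analysis: classify whether a
# grain switches at low field and inspect which microstructure feature the
# model relies on.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n = 1000
misorientation = rng.uniform(0, 30, n)   # easy-axis misorientation (degrees)
position = rng.uniform(0, 1, n)          # normalized height within the magnet
grain_size = rng.uniform(1, 10, n)       # irrelevant by construction
# Invented rule: low switching field for strongly misoriented grains or
# grains near the top/bottom edges.
low_field = (misorientation > 20) | (position > 0.9) | (position < 0.1)

X = np.column_stack([misorientation, position, grain_size])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, low_field)
print(dict(zip(["misorientation", "position", "grain_size"],
               np.round(clf.feature_importances_, 2))))
```

On data built this way the classifier assigns high importance to misorientation and position and little to grain size, mirroring the kind of conclusion the study draws from its physical dataset.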
