
Bridging microscopy with molecular dynamics and quantum simulations: An AtomAI based pipeline

Added by Ayana Ghosh
Publication date: 2021
Field: Physics
Language: English





Recent advances in (scanning) transmission electron microscopy have enabled routine generation of large volumes of high-veracity structural data on 2D and 3D materials, naturally posing the challenge of using these data as starting inputs for atomistic simulations. In this fashion, theory addresses experimentally observed structures rather than the full range of theoretically possible atomic configurations. The challenge is highly non-trivial, however, due to the extreme disparity between the intrinsic time scales accessible to modern simulations and to microscopy, as well as the latencies of microscopy and simulations per se. Addressing this issue requires, as a first step, bridging the instrumental data flow and the physics-based simulation environment, enabling regions of interest to be selected and explored with physical simulations. Here we report the development of a machine learning workflow that directly bridges the instrument data stream into Python-based molecular dynamics and density functional theory environments, using pre-trained neural networks to convert imaging data into physical descriptors. The workflow identifies pathways to ensure structural stability and to compensate for the observational biases universally present in the data. The approach is demonstrated on a graphene system, reconstructing the optimized geometry and simulating temperature-dependent dynamics, including adsorption of a Cr adatom and graphene healing effects; it is, however, universal and can be applied to other material systems.
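To make the bridge concrete, the following is a minimal sketch of such an image-to-simulation pipeline, assuming AtomAI with a pre-trained segmentation model (the checkpoint name, the `stem_frame.npy` input, and the pixel calibration are hypothetical) and using ASE's toy EMT calculator as a stand-in for the molecular dynamics and density functional theory engines used in the paper:

```python
import numpy as np
import atomai as aoi
from ase import Atoms
from ase.optimize import BFGS
from ase.calculators.emt import EMT  # placeholder for the paper's MD/DFT engines

# 1) Decode a microscope frame into atomic coordinates with a pre-trained
#    network (file names are hypothetical; load_model follows AtomAI's
#    documented pattern).
stem_image = np.load("stem_frame.npy")              # 2D STEM frame
model = aoi.load_model("graphene_segmentor.tar")
nn_output, coords = model.predict(stem_image)       # coords: {frame_idx: array}

# 2) Convert pixel coordinates to Angstroms and build an ASE Atoms object
#    (coordinate-array columns assumed to be row, column, class).
px_to_ang = 0.1                                     # assumed calibration
xy = coords[0][:, :2] * px_to_ang
atoms = Atoms("C" * len(xy),
              positions=np.column_stack([xy, np.zeros(len(xy))]))

# 3) Relax the experimentally observed structure before running dynamics,
#    compensating for observational biases in the raw coordinates.
atoms.calc = EMT()
BFGS(atoms).run(fmax=0.05)
```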



Related research

AtomAI is an open-source software package bridging instrument-specific Python libraries, deep learning, and simulation tools into a single ecosystem. AtomAI allows direct application of deep convolutional neural networks for atomic and mesoscopic image segmentation, converting image and spectroscopy data into class-based local descriptors for downstream tasks such as statistical and graph analysis. For atomically resolved imaging data, the output is the types and positions of atomic species, with an option for subsequent refinement. AtomAI further implements a broad range of image- and spectrum-analysis functions, including invariant variational autoencoders (VAEs): VAEs with rotational and (optionally) translational invariance for unsupervised and class-conditioned disentanglement of categorical and continuous data representations. In addition, AtomAI provides utilities for mapping structure-property relationships via im2spec- and spec2im-type encoder-decoder models. Finally, AtomAI allows seamless connection to first-principles modeling through a Python interface, including molecular dynamics and density functional theory calculations on the inferred atomic positions. While the majority of applications to date have been based on atomically resolved electron microscopy, the flexibility of AtomAI allows straightforward extension toward the analysis of mesoscopic imaging data once the labels and feature-identification workflows are established. The source code and example notebooks are available at https://github.com/pycroscopy/atomai.
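As a flavor of the API, here is a minimal sketch of the segmentation and rVAE workflows described above; array names, training-cycle counts, and the sub-image window size are illustrative, and the call signatures follow AtomAI's documented patterns but should be checked against the repository for the current interface:

```python
import atomai as aoi

# Train a deep fully convolutional network to segment atoms in microscopy
# images; `images`/`labels` (and the *_test splits) are numpy arrays of
# frames and per-pixel class masks.
model = aoi.models.Segmentor(nb_classes=1)          # single atomic species
model.fit(images, labels, images_test, labels_test,
          training_cycles=300, compute_accuracy=True)

# Inference returns class probability maps plus refined atomic coordinates.
nn_output, coordinates = model.predict(new_image)

# Rotationally invariant VAE for unsupervised disentanglement of local
# structural motifs cropped around the detected atoms.
rvae = aoi.models.rVAE(input_dim=(32, 32), latent_dim=2)
rvae.fit(subimages, training_cycles=100)
```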
Tensor cores, along with tensor processing units, represent a new form of hardware acceleration specifically designed for deep neural network calculations in artificial intelligence applications. Tensor cores provide extraordinary computational speed and energy efficiency, but with the caveat that they were designed for tensor contractions (matrix-matrix multiplications) using only low-precision floating point operations. In spite of this, we demonstrate how tensor cores can be applied with high efficiency to the challenging and numerically sensitive problem of quantum-based Born-Oppenheimer molecular dynamics, which requires highly accurate electronic structure optimizations and conservative force evaluations. The interatomic forces are calculated on the fly from an electronic structure that is obtained from a generalized deep neural network, where the computational structure naturally takes advantage of the exceptional processing power of the tensor cores and allows for high performance in excess of 100 Tflops on the tensor cores of a single Nvidia A100 GPU. Stable molecular dynamics trajectories are generated using the framework of extended Lagrangian Born-Oppenheimer molecular dynamics, which combines computational efficiency with long-term stability, even when using approximate charge relaxations and force evaluations that are limited in accuracy by the numerically noisy conditions caused by the low-precision tensor-core floating-point operations. A canonical ensemble simulation scheme is also presented, where the additional numerical noise in the calculated forces is absorbed into a Langevin-like dynamics.
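The computational pattern at the heart of this approach can be mimicked in a few lines: drive a density-matrix purification with matrix-matrix products routed through the tensor cores and let the recursion's self-correcting branches absorb the rounding noise. Below is a toy SP2-style sketch in PyTorch; the matrix size, step count, and TF32 setting are illustrative, and this is not the authors' implementation:

```python
import torch

# Route fp32 matmuls through tensor cores (TF32) on Ampere-class GPUs.
torch.backends.cuda.matmul.allow_tf32 = True
device = "cuda" if torch.cuda.is_available() else "cpu"

def sp2_purify(H, n_occ, steps=40):
    """SP2 recursion: apply X <- X^2 or X <- 2X - X^2 at each step, picking
    the branch that drives trace(X) toward the occupation number n_occ."""
    evals = torch.linalg.eigvalsh(H)
    e_min, e_max = evals[0], evals[-1]
    I = torch.eye(H.shape[0], device=H.device, dtype=H.dtype)
    X = (e_max * I - H) / (e_max - e_min)   # spectrum mapped into [0, 1]
    for _ in range(steps):
        X2 = X @ X                          # the tensor-core-friendly kernel
        err_sq = abs(torch.trace(X2) - n_occ)
        err_2x = abs(2 * torch.trace(X) - torch.trace(X2) - n_occ)
        X = X2 if err_sq < err_2x else 2 * X - X2
    return X

H = torch.randn(512, 512, device=device)
H = 0.5 * (H + H.T)                         # toy symmetric "Hamiltonian"
D = sp2_purify(H, n_occ=256)
print(torch.trace(D))                       # ~n_occ for an idempotent density matrix
```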
A novel version of the Continuous-Time Random Walk (CTRW) model with memory is developed. Here, memory means dependence among an arbitrary number of successive jumps of the process, while waiting times between jumps are treated as i.i.d. random variables. The dependence was found by analyzing empirical histograms of the stochastic process of a single share price on a market at the high-frequency time scale, and justified theoretically by considering the bid-ask bounce mechanism, which contains a delay characteristic of any double-auction market. Our model turns out to be exactly analytically solvable, which enables a direct comparison of its predictions with their empirical counterparts, for instance with the empirical velocity autocorrelation function. This paper thus significantly extends the capabilities of the CTRW formalism.
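For intuition, here is a toy simulation of such a walk with one-step jump memory (anti-correlated successive jumps mimicking bid-ask bounce) and i.i.d. exponential waiting times; the parameters are illustrative rather than values fitted in the paper, and the full model allows dependence over an arbitrary number of jumps:

```python
import numpy as np

rng = np.random.default_rng(0)
n_jumps, p_reverse = 100_000, 0.7        # probability the next jump flips sign

waits = rng.exponential(scale=1.0, size=n_jumps)   # i.i.d. waiting times
jumps = np.empty(n_jumps)
jumps[0] = rng.choice([-1.0, 1.0])
for i in range(1, n_jumps):
    flip = rng.random() < p_reverse                # one-step memory
    jumps[i] = -jumps[i - 1] if flip else jumps[i - 1]

times, price = np.cumsum(waits), np.cumsum(jumps)  # the CTRW trajectory

# Lag-1 jump autocorrelation is negative for p_reverse > 1/2 (bid-ask bounce).
c1 = np.mean(jumps[:-1] * jumps[1:])
print(f"lag-1 autocorrelation: {c1:.3f} (expected: {1 - 2 * p_reverse:.3f})")
```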
Analog forecasting is a nonparametric technique introduced by Lorenz in 1969 which predicts the evolution of states of a dynamical system (or observables defined on the states) by following the evolution of the sample in a historical record of observations that most closely resembles the current initial data. Here, we introduce a suite of forecasting methods which improve traditional analog forecasting by combining ideas from kernel methods developed in harmonic analysis and machine learning with state-space reconstruction for dynamical systems. A key ingredient of our approach is to replace single-analog forecasting with weighted ensembles of analogs constructed using local similarity kernels. The kernels used here employ a number of dynamics-dependent features designed to improve forecast skill, including Takens delay-coordinate maps (to recover information in the initial data lost through partial observations) and a directional dependence on the dynamical vector field generating the data. Mathematically, our approach is closely related to kernel methods for out-of-sample extension of functions, and we discuss alternative strategies based on the Nyström method and the multiscale Laplacian pyramids technique. We illustrate these techniques in applications to forecasting in a low-order deterministic model for atmospheric dynamics with chaotic metastability, and to interannual-scale forecasting in the North Pacific sector of a comprehensive climate model. We find that forecasts based on kernel-weighted ensembles have significantly higher skill than the conventional approach following a single analog.
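A stripped-down version of the kernel analog idea, using a Takens delay-coordinate embedding and a Gaussian similarity kernel, can be sketched as follows; the embedding dimension, bandwidth, and toy data are arbitrary choices, and the full method adds dynamics-dependent features (such as the vector-field dependence) that this sketch omits:

```python
import numpy as np

def delay_embed(x, q):
    """Takens delay-coordinate map: row t is (x_{t+q-1}, ..., x_t)."""
    return np.stack([x[q - 1 - k : len(x) - k] for k in range(q)], axis=1)

def kernel_analog_forecast(history, current, lead, q=8, eps=0.5):
    """Forecast `lead` steps ahead as a kernel-weighted ensemble of the
    futures of all historical analogs of the embedded state `current`."""
    E = delay_embed(history, q)
    E, futures = E[:-lead], history[q - 1 + lead:]  # pair states with futures
    d2 = np.sum((E - current) ** 2, axis=1)
    w = np.exp(-d2 / (eps * np.median(d2)))         # local similarity kernel
    return np.sum(w * futures) / np.sum(w)

# Toy usage: forecast a noisy oscillation 50 steps past the end of the record.
t = np.linspace(0, 200, 4000)
x = np.sin(t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
query = delay_embed(x, 8)[-1]                       # most recent embedded state
print(kernel_analog_forecast(x, query, lead=50))
```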
Crystal defects play a large role in how materials respond to their surroundings, yet there are many uncertainties in how extended defects form, move, and interact deep beneath a material's surface. A newly developed imaging diagnostic, dark-field X-ray microscopy (DFXM), can now visualize the behavior of line defects, known as dislocations, in materials under varying conditions. DFXM visualizes dislocations by imaging the very subtle long-range distortions in the material's crystal lattice, which produce a characteristic adjoined pair of bright and dark regions. Full analysis of how these dislocations evolve can be used to refine material models; however, it requires quantitative characterization of the statistics of their shape, position, and motion. In this paper, we present a semi-automated approach to effectively isolate, track, and quantify the behavior of dislocations as composite objects. This analysis drives the statistical characterization of the defects, including, for example, dislocation velocity and orientation in the crystal, and is demonstrated on DFXM images measuring the evolution of defects at 98% of the melting temperature of single-crystal aluminum, collected at the European Synchrotron Radiation Facility.
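A minimal sketch of the isolation step might look like the following, assuming a 2D DFXM frame stored in `dfxm_frame.npy` (a hypothetical file) and using scipy's connected-component labeling in place of the full semi-automated pipeline; the thresholds are illustrative:

```python
import numpy as np
from scipy import ndimage

frame = np.load("dfxm_frame.npy")                   # hypothetical 2D DFXM image
bg, sigma = frame.mean(), frame.std()

# Dislocations appear as adjoined bright/dark lobes, i.e., deviations of
# opposite sign from the background intensity.
mask = (frame > bg + 2 * sigma) | (frame < bg - 2 * sigma)
labels, n = ndimage.label(mask)

for i in range(1, n + 1):
    ys, xs = np.nonzero(labels == i)
    if ys.size < 20:                                # drop noise-sized components
        continue
    # Orientation from the principal axis of the component's pixel cloud.
    cov = np.cov(np.stack([xs, ys]).astype(float))
    evals, evecs = np.linalg.eigh(cov)
    angle = np.degrees(np.arctan2(evecs[1, -1], evecs[0, -1]))
    print(f"defect {i}: centroid=({xs.mean():.1f}, {ys.mean():.1f}), "
          f"orientation={angle:.1f} deg")
```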