Prediction of material properties from first principles is often a computationally expensive task. Recently, artificial neural networks and other machine learning approaches have been successfully employed to obtain accurate models at low computational cost by leveraging existing example data. Here we present Properties from Artificial Neural Network Architectures (PANNA), a software package that provides a comprehensive toolkit for creating neural network models for atomistic systems. Besides the core routines for neural network training, it includes a data parser, a descriptor builder, and a force-field generator suitable for integration within molecular dynamics packages. PANNA offers a variety of activation and cost functions and regularization methods, as well as the possibility of using fully connected networks of custom size for each atomic species. PANNA benefits from the optimization and hardware flexibility of the underlying TensorFlow engine, which allows it to be used on multiple CPU/GPU/TPU systems, making it possible to develop and optimize neural network models based on large datasets.
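To make the per-species architecture concrete, the sketch below wires one fully connected subnetwork per atomic species and sums the per-atom outputs into a total energy, in the Behler-Parrinello style that packages like PANNA build on. It is a generic TensorFlow illustration, not PANNA's actual API; the descriptor dimension, species list, and layer sizes are hypothetical.

    import tensorflow as tf

    # Hypothetical sizes: 128-dimensional descriptors, two species.
    DESCRIPTOR_DIM = 128

    def species_subnet(hidden_sizes):
        # One fully connected subnetwork per species, mapping an atom's
        # descriptor vector to a scalar per-atom energy contribution.
        layers = [tf.keras.layers.Dense(n, activation="tanh") for n in hidden_sizes]
        layers.append(tf.keras.layers.Dense(1))
        return tf.keras.Sequential(layers)

    # A custom network size for each species, as the abstract describes.
    subnets = {"H": species_subnet([32, 32]), "O": species_subnet([64, 32])}

    def total_energy(descriptors_by_species):
        # descriptors_by_species: dict species -> (n_atoms, DESCRIPTOR_DIM) tensor.
        return sum(tf.reduce_sum(subnets[s](g))
                   for s, g in descriptors_by_species.items())

    # Example call with random descriptors for a single water molecule.
    desc = {"H": tf.random.normal([2, DESCRIPTOR_DIM]),
            "O": tf.random.normal([1, DESCRIPTOR_DIM])}
    print(float(total_energy(desc)))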
Small metal clusters are of fundamental scientific interest and of tremendous significance in catalysis. These nanoscale clusters display diverse geometries and structural motifs depending on the cluster size; knowledge of this size-dependent structure …
We introduce a coarse-grained deep neural network model (CG-DNN) for liquid water that utilizes 50 rotationally and translationally invariant coordinates, and is trained exclusively against energies of ~30,000 bulk water configurations. Our CG-DNN potential …
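As an illustration of what energy-only training on invariant coordinates looks like in practice, here is a minimal TensorFlow sketch; the random features and labels are placeholders for the descriptors and reference energies, and the layer sizes and optimizer settings are assumptions, not the paper's setup.

    import numpy as np
    import tensorflow as tf

    # Placeholder data: 50 invariant coordinates per configuration,
    # one total energy label each (the real set is ~30,000 configurations).
    n_cfg, n_feat = 30_000, 50
    X = np.random.rand(n_cfg, n_feat).astype("float32")
    y = np.random.rand(n_cfg, 1).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_feat,)),
        tf.keras.layers.Dense(64, activation="tanh"),
        tf.keras.layers.Dense(64, activation="tanh"),
        tf.keras.layers.Dense(1),          # predicted configuration energy
    ])
    # Energy-only objective: mean squared error on energies, no force terms.
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, batch_size=256, epochs=10, verbose=0)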
Module for ab initio structure evolution (MAISE) is an open-source package for materials modeling and prediction. The code's main feature is the automated generation of neural network (NN) interatomic potentials for use in global structure searches. …
Structural, electronic, vibrational and dielectric properties of LaBGeO$_5$ with the stillwellite structure are determined based on \textit{ab initio} density functional theory. The theoretically relaxed structure is found to agree well with the existing …
In this paper we propose a Bayesian method for estimating architectural parameters of neural networks, namely layer size and network depth. We do this by learning concrete distributions over these parameters. Our results show that regular networks with …
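The underlying trick, placing a learnable relaxed distribution over a discrete architectural choice, can be sketched with a Concrete (Gumbel-Softmax) sample over candidate depths. This illustrates the general technique rather than the paper's exact construction; the depth range and temperature are invented for the example.

    import tensorflow as tf

    MAX_DEPTH = 5
    logits = tf.Variable(tf.zeros(MAX_DEPTH))  # learnable preference per depth

    def sample_depth_weights(temperature=0.5):
        # Draw a relaxed one-hot vector over candidate depths; as the
        # temperature approaches zero the sample approaches a hard choice.
        u = tf.random.uniform([MAX_DEPTH], minval=1e-9, maxval=1.0)
        gumbel = -tf.math.log(-tf.math.log(u))
        return tf.nn.softmax((logits + gumbel) / temperature)

    # The relaxed weights can gate each layer's output, so the task loss
    # back-propagates into `logits` and the depth distribution is learned
    # jointly with the network weights.
    print(sample_depth_weights().numpy())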