
Combining Machine Learning and Computational Chemistry for Predictive Insights Into Chemical Systems

Added by John Keith
Publication date: 2021
Field: Physics
Language: English





Machine learning models are poised to make a transformative impact on chemical sciences by dramatically accelerating computational algorithms and amplifying insights available from computational chemistry methods. However, achieving this requires a confluence and coaction of expertise in computer science and physical sciences. This review is written for new and experienced researchers working at the intersection of both fields. We first provide concise tutorials of computational chemistry and machine learning methods, showing how insights involving both can be achieved. We then follow with a critical review of noteworthy applications that demonstrate how computational chemistry and machine learning can be used together to provide insightful (and useful) predictions in molecular and materials modeling, retrosyntheses, catalysis, and drug design.




Related Research

Two types of approaches to modeling molecular systems have demonstrated high practical efficiency. Density functional theory (DFT), the most widely used quantum chemical method, is a physical approach predicting energies and electron densities of molecules. Recently, numerous papers on machine learning (ML) of molecular properties have also been published. ML models greatly outperform DFT in terms of computational cost, and may even reach comparable accuracy, but they lack physicality - a direct link to quantum physics - which limits their applicability. Here, we propose an approach that combines the strengths of DFT and ML, namely physicality and low computational cost. By generalizing the famous Hohenberg-Kohn theorems, we derive general equations for exact electron densities and energies that can naturally guide applications of ML in quantum chemistry. Based on these equations, we build a deep neural network that can compute electron densities and energies of a wide range of organic molecules not only much faster, but also closer to exact physical values than current approaches.
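The core idea above is that a model can be trained to map an electron density directly to an energy. As a minimal sketch of that idea (not the authors' network): the 1D grid, the Gaussian "densities", the von Weizsaecker-like "exact" functional, and the ridge-regression surrogate below are all invented for illustration.

```python
import numpy as np

# Toy illustration of learning an energy functional E[rho] from data.
# Everything here (grid, densities, "exact" functional, ridge model)
# is a hypothetical stand-in, not the deep network described above.
x = np.linspace(-5.0, 5.0, 101)
dx = x[1] - x[0]

def density(center, width):
    """Normalized Gaussian 'electron density' on the grid."""
    rho = np.exp(-((x - center) / width) ** 2)
    return rho / (rho.sum() * dx)          # normalize to one "electron"

def energy(rho):
    """Von Weizsaecker-like gradient functional, our toy 'exact' E[rho]."""
    grad = np.gradient(rho, dx)
    return 0.125 * np.sum(grad ** 2 / (rho + 1e-12)) * dx

# Training set: densities from a 2-parameter family, with exact energies.
params = [(c, w) for c in np.linspace(-2, 2, 9)
                 for w in np.linspace(0.5, 2.0, 9)]
X = np.array([density(c, w) for c, w in params])   # shape (81, 101)
y = np.array([energy(rho) for rho in X])

# A deliberately simple surrogate: ridge regression from the
# discretized density vector to the energy.
lam = 1e-6
wts = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Predict the energy of an unseen density from the same family.
rho_test = density(0.3, 1.1)
e_pred = rho_test @ wts
e_true = energy(rho_test)
```

The surrogate never evaluates the gradient functional at prediction time; it only reads the density, which is the cost advantage the abstract refers to, here in caricature.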
We investigated the electronic and structural properties of the infinite linear carbon chain (carbyne) using density functional theory (DFT) and the random phase approximation (RPA) to the correlation energy. The studies are performed in vacuo and for carbyne inside a carbon nanotube (CNT). In vacuum, semi-local DFT and the RPA predict bond length alternations of about 0.04 Å and 0.13 Å, respectively. The frequency of the highest optical mode at the $\Gamma$ point is 1219 cm$^{-1}$ for DFT and about 2000 cm$^{-1}$ for the RPA. Agreement of the RPA with previous high-level quantum chemistry and diffusion Monte Carlo results is excellent. For the RPA we calculate the phonon dispersion in the full Brillouin zone and find marked quantitative differences to DFT calculations not only at the $\Gamma$ point but also throughout the entire Brillouin zone. To model carbyne inside a carbon nanotube, we considered a (10,0) CNT. Here the DFT calculations are even qualitatively sensitive to the k-point sampling. In the limit of a very dense k-point sampling, semi-local DFT predicts no bond length alternation (BLA), whereas in the RPA a sizeable BLA of 0.09 Å prevails. The reduced BLA leads to a significant red shift of the vibrational frequencies of about 350 cm$^{-1}$, so that they are in good agreement with experimental estimates. Overall, the good agreement between the RPA and previously reported results from correlated wavefunction methods and experimental Raman data suggests that the RPA provides reliable results at moderate computational cost. It hence presents a useful addition to the repertoire of correlated wavefunction methods, and its accuracy clearly prevails for low-dimensional systems, where semi-local density functionals struggle to yield even qualitatively correct results.
The applications of machine learning techniques to chemistry and materials science become more numerous by the day. The main challenge is to devise representations of atomic systems that are at the same time complete and concise, so as to reduce the number of reference calculations needed to reliably predict the properties of different types of materials. This has led to a proliferation of alternative ways to convert an atomic structure into an input for a machine-learning model. We introduce an abstract definition of chemical environments that is based on a smoothed atomic density, using a bra-ket notation to emphasize basis-set independence and to highlight the connections with some popular choices of representations for describing atomic systems. The correlations between the spatial distribution of atoms and their chemical identities are computed as inner products between these feature kets, which can be given an explicit representation in terms of the expansion of the atom density on orthogonal basis functions, which is equivalent to the smooth overlap of atomic positions (SOAP) power spectrum, but also in real space, corresponding to $n$-body correlations of the atom density. This formalism lays the foundations for a more systematic tuning of the behavior of the representations, by introducing operators that represent the correlations between structure, composition, and the target properties. It provides a unifying picture of recent developments in the field and indicates a way forward towards more effective and computationally affordable machine-learning schemes for molecules and materials.
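The construction described above (smooth the atoms into a density, expand it in an orthogonal basis, and form invariant inner products of the expansion coefficients) can be caricatured in one periodic dimension. The sketch below is a toy analogue, not the real SOAP implementation: plane waves replace radial functions and spherical harmonics, and the power spectrum $|c_k|^2$ plays the role of the translation-invariant fingerprint. The box length, widths, and atom positions are all illustrative choices.

```python
import numpy as np

# 1D periodic toy analogue of a SOAP-style descriptor (not real SOAP):
# atoms are smoothed into Gaussians, the density is expanded in plane
# waves, and |c_k|^2 gives a translation-invariant fingerprint.
L = 10.0                                      # box length (arbitrary units)
grid = np.linspace(0.0, L, 256, endpoint=False)

def smoothed_density(positions, sigma=0.3):
    """Sum of periodic Gaussians centered on the atoms."""
    rho = np.zeros_like(grid)
    for p in positions:
        d = (grid - p + L / 2) % L - L / 2    # minimum-image distance
        rho += np.exp(-d ** 2 / (2 * sigma ** 2))
    return rho

def power_spectrum(rho, kmax=16):
    """Quadratic invariants built from the expansion coefficients <k|rho>."""
    c = np.fft.rfft(rho)[:kmax]               # basis-expansion coefficients
    return np.abs(c) ** 2                     # invariant under translation

atoms = np.array([1.0, 2.5, 4.0])
p1 = power_spectrum(smoothed_density(atoms))
p2 = power_spectrum(smoothed_density(atoms + 3.7))   # rigidly shifted copy
```

Shifting all atoms rigidly multiplies each coefficient by a phase, so `p1` and `p2` agree to numerical precision; in the full 3D SOAP construction the analogous invariance is to rotations of the neighbor environment.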
The sorption of radionuclides by graphene oxides synthesized by different methods was studied through a combination of batch experiments with characterization by microscopic and spectroscopic techniques such as X-ray photoelectron spectroscopy (XPS), attenuated total reflection Fourier-transform infrared spectroscopy (ATR-FTIR), high-energy-resolution fluorescence-detected X-ray absorption spectroscopy (HERFD-XANES), extended X-ray absorption fine structure (EXAFS), and high-resolution transmission electron microscopy (HRTEM).
We present a supervised learning method to learn the propagator map of a dynamical system from partial and noisy observations. In our computationally cheap and easy-to-implement framework, a neural network consisting of random feature maps is trained sequentially by incoming observations within a data assimilation procedure. By employing Takens' embedding theorem, the network is trained on delay coordinates. We show that the combination of random feature maps and data assimilation, called RAFDA, outperforms standard random feature maps, for which the dynamics is learned using batch data.
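A random feature map in this sense is a network whose hidden weights are drawn once and frozen, so that only a linear readout is learned. The sketch below shows that baseline on a toy propagator; the logistic map, the feature scales, and the batch ridge fit are all stand-ins chosen for illustration (the paper's RAFDA instead estimates the readout sequentially within a data assimilation scheme).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy system: the logistic map x_{n+1} = 4 x_n (1 - x_n) stands in for
# an unknown propagator (illustrative choice, not the paper's system).
def propagator(x):
    return 4.0 * x * (1.0 - x)

# Generate a trajectory of noiseless observations.
N = 500
traj = np.empty(N)
traj[0] = 0.2
for n in range(N - 1):
    traj[n + 1] = propagator(traj[n])

# Random feature map: phi(x) = tanh(w x + b), with w and b drawn once
# and never trained.
D = 200
w = rng.normal(0.0, 4.0, size=D)
b = rng.uniform(-np.pi, np.pi, size=D)

def features(x):
    return np.tanh(np.outer(np.atleast_1d(x), w) + b)

# Learn only the linear readout, here by batch ridge regression
# (RAFDA would update this readout sequentially via data assimilation).
Phi = features(traj[:-1])                 # shape (N-1, D)
y = traj[1:]
lam = 1e-6
W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(D), Phi.T @ y)

# One-step prediction error at a fresh point.
x0 = 0.37
pred = (features(x0) @ W)[0]
err = abs(pred - propagator(x0))
```

Because only `W` is fitted, training reduces to a single linear solve, which is what makes the framework "computationally cheap and easy to implement" as the abstract claims.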
