The concept of chemical bonding can ultimately be seen as a rationalization of the recurring structural patterns observed in molecules and solids. Chemical intuition is nothing but the ability to recognize and predict such patterns, and how they transform into one another. Here we discuss how to use a computer to identify atomic patterns automatically, so as to provide an algorithmic definition of a bond based solely on structural information. We concentrate in particular on hydrogen bonding -- a concept central to our understanding of the physical chemistry of water, biological systems and many technologically important materials. Since the hydrogen bond is a somewhat fuzzy entity that covers a broad range of energies and distances, many different criteria have been proposed and used over the years, based either on sophisticated electronic-structure calculations followed by an energy decomposition analysis, or on somewhat arbitrary choices of the range of structural parameters deemed to correspond to a hydrogen-bonded configuration. We introduce here a definition that is unambiguous, unbiased, and adaptive, based on our machine-learning analysis of an atomistic simulation. The strategy we propose could easily be adapted to similar scenarios in which one has to recognize or classify structural patterns in a material or chemical compound.
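As an illustration of how such an adaptive, data-driven bond definition might be realized, the sketch below clusters simple hydrogen-bond descriptors (donor-H, acceptor-H and donor-acceptor distances) with a Gaussian mixture model. The descriptor choice, the two-component mixture, and the use of scikit-learn are assumptions made for illustration, not the algorithm used in the paper.

```python
# Illustrative sketch only: clustering hydrogen-bond descriptors with a
# Gaussian mixture model; descriptors and component count are assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

def hbond_descriptors(d_pos, h_pos, a_pos):
    """Donor-H, acceptor-H and donor-acceptor distances for candidate triplets."""
    d_dh = np.linalg.norm(h_pos - d_pos, axis=1)
    d_ah = np.linalg.norm(h_pos - a_pos, axis=1)
    d_da = np.linalg.norm(a_pos - d_pos, axis=1)
    return np.column_stack([d_dh, d_ah, d_da])

# X: descriptors collected from an atomistic trajectory, shape (n_triplets, 3)
X = np.random.rand(1000, 3) * 3.0  # placeholder data
gmm = GaussianMixture(n_components=2, covariance_type="full").fit(X)

# The cluster whose mean has the shortest acceptor-H distance is taken as
# "hydrogen bonded"; membership probabilities give a smooth, adaptive definition.
hb_cluster = np.argmin(gmm.means_[:, 1])
p_hbond = gmm.predict_proba(X)[:, hb_cluster]
```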
Machine learning is used to approximate the kinetic energy of one-dimensional diatomics as a functional of the electron density. The functional can accurately dissociate a diatomic and can be systematically improved with training. Highly accurate self-consistent densities and molecular forces are found, indicating the possibility of ab initio molecular dynamics simulations.
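A minimal sketch of how such a density-to-energy map could be learned is shown below, assuming kernel ridge regression on densities represented on a uniform 1D grid. The data, kernel choice, and hyperparameters are placeholders, not those of the published model.

```python
# Minimal sketch: learn T[n] by kernel ridge regression on gridded 1D densities.
# All names and hyperparameters here are illustrative assumptions.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

n_grid, n_train = 200, 100
densities = np.random.rand(n_train, n_grid)   # n(x) for training diatomics (placeholder)
kinetic_energies = np.random.rand(n_train)    # reference T[n] values (placeholder)

model = KernelRidge(kernel="rbf", alpha=1e-10, gamma=1e-2)
model.fit(densities, kinetic_energies)

# Predict T[n] for a new density; a functional derivative dT/dn(x), needed for
# self-consistency, could be approximated by finite differences on the grid.
T_pred = model.predict(np.random.rand(1, n_grid))
```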
Many atomic liquids can form transient covalent bonds reminiscent of those in the corresponding solid states. These directional interactions dictate many important properties of the liquid state, necessitating a quantitative, atomic-scale understanding of bonding in these complex systems. A prototypical example is liquid silicon, wherein transient covalent bonds give rise to local tetrahedral order and consequent non-trivial effects on liquid state thermodynamics and dynamics. To further understand covalent bonding in liquid silicon, and similar liquids, we present an ab initio simulation-based approach for quantifying the structure and dynamics of covalent bonds in condensed phases. Through the examination of structural correlations among silicon nuclei and maximally localized Wannier function centers, we develop a geometric criterion for covalent bonds in liquid Si. We use this to monitor the dynamics of transient covalent bonding in the liquid state and estimate a covalent bond lifetime. We compare covalent bond dynamics to other processes in liquid Si and similar liquids and suggest experiments to measure the covalent bond lifetime.
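The sketch below illustrates one plausible form such a geometric criterion could take: a Si-Si pair is counted as covalently bonded when the two atoms lie within a cutoff distance and a maximally localized Wannier function center sits near the bond midpoint. The cutoff values are invented for illustration and periodic boundary conditions are ignored; the criterion developed in the paper may differ.

```python
# Sketch of a possible Wannier-center-based covalent-bond criterion.
# Cutoffs r_si and r_wc are illustrative assumptions; no periodic boundaries.
import numpy as np

def covalent_bonds(si_pos, wannier_pos, r_si=2.6, r_wc=0.7):
    """Return index pairs (i, j) of Si atoms counted as covalently bonded."""
    bonds = []
    for i in range(len(si_pos)):
        for j in range(i + 1, len(si_pos)):
            if np.linalg.norm(si_pos[i] - si_pos[j]) > r_si:
                continue
            midpoint = 0.5 * (si_pos[i] + si_pos[j])
            d_wc = np.linalg.norm(wannier_pos - midpoint, axis=1)
            if d_wc.min() < r_wc:
                bonds.append((i, j))
    return bonds
```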
Halogen bonding has emerged as an important noncovalent interaction in a myriad of applications, including drug design, supramolecular assembly, and catalysis. Current understanding of the halogen bond is informed by electronic structure calculations on isolated molecules and/or crystal structures that are not readily transferable to liquids and disordered phases. To address this issue, we present a first-principles simulation-based approach for quantifying halogen bonds in molecular systems rooted in an understanding of nuclei-nuclei and electron-nuclei spatial correlations. We then demonstrate how this approach can be used to quantify the structure and dynamics of halogen bonds in condensed phases, using solid and liquid molecular chlorine as prototypical examples with high concentrations of halogen bonds. We close with a discussion of how the knowledge generated by our first-principles approach may inform the development of classical empirical models, with a consistent representation of halogen bonding.
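For the dynamical part of such an analysis (here, as for the covalent bonds in liquid silicon above), a bond lifetime is commonly estimated from the autocorrelation of a bond indicator function. The sketch below shows this generic estimator; the specific estimator and any treatment of bond-breaking events used in the paper may differ.

```python
# Generic bond-lifetime estimator from the autocorrelation of a 0/1 bond
# indicator h_ij(t); details of the published estimator may differ.
import numpy as np

def bond_autocorrelation(h, max_lag):
    """h: (n_frames, n_pairs) array, 1 where the bond criterion holds."""
    h = h.astype(float)
    norm = np.mean(h * h)
    c = np.empty(max_lag)
    for lag in range(max_lag):
        c[lag] = np.mean(h[: h.shape[0] - lag] * h[lag:]) / norm
    return c

# A lifetime can then be read off as the 1/e decay time or the time integral of c.
```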
Machine learning is a powerful tool for designing accurate, highly non-local, exchange-correlation functionals for density functional theory. So far, most of these machine-learned functionals are trained for systems with an integer number of particles. As such, they are unable to reproduce some crucial and fundamental features, such as the explicit dependence of the functionals on the particle number or the infamous derivative discontinuity at integer particle numbers. Here we propose a solution to these problems by training a neural network as the universal functional of density-functional theory that (i) depends explicitly on the number of particles, with piecewise linearity between the integers, and (ii) reproduces the derivative discontinuity of the exchange-correlation energy. This is achieved by using an ensemble formalism, a training set containing fractional densities, and an explicitly discontinuous formulation.
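For reference, the piecewise-linearity and derivative-discontinuity conditions being targeted are presumably the standard ensemble (Perdew-Parr-Levy-Balduz) relations, written below; this is textbook background, not an equation quoted from the paper.

```latex
% Ensemble piecewise linearity of energy and density for N = N_0 + \omega
% electrons, 0 <= \omega <= 1:
E(N_0+\omega) = (1-\omega)\,E(N_0) + \omega\,E(N_0+1),
\qquad
n_{N_0+\omega}(\mathbf{r}) = (1-\omega)\,n_{N_0}(\mathbf{r}) + \omega\,n_{N_0+1}(\mathbf{r}).

% Derivative discontinuity of the exchange-correlation potential at integer N_0:
\Delta_{\mathrm{xc}} = \lim_{\delta\to 0^{+}}\!\left[
\left.\frac{\delta E_{\mathrm{xc}}[n]}{\delta n(\mathbf{r})}\right|_{N_0+\delta}
-\left.\frac{\delta E_{\mathrm{xc}}[n]}{\delta n(\mathbf{r})}\right|_{N_0-\delta}
\right].
```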
We present a variational Monte Carlo (VMC) and lattice-regularized diffusion Monte Carlo (LRDMC) study of the binding energy and dispersion curve of the water dimer. As variational ansatz we use the JAGP wave function, an implementation of the resonating valence bond (RVB) idea. One aim of the present work is to investigate how the bonding of two water molecules, as a prototype of hydrogen-bonded complexes, can be described within the JAGP approach. Using a pseudopotential for the inert core of oxygen, with a full optimization of the variational parameters, we obtain at the VMC level a binding energy of -4.5(0.1) kcal/mol, while LRDMC calculations give -4.9(0.1) kcal/mol (experiment: 5 kcal/mol). The calculated dispersion curve reproduces, at both the VMC and LRDMC levels, the position of the minimum and the curvature. The quality of the wave function allows us to dissect the binding energy into different contributions by appropriately switching off determinantal and Jastrow terms in the JAGP: we estimate the dynamical contribution to the binding energy to be of the order of 1.4(0.2) kcal/mol, whereas the covalent contribution is about 1.0(0.2) kcal/mol. The JAGP thus proves to be a promising wave function for describing systems where both dispersive and covalent forces play an important role.