
Fermionic Neural Network with Effective Core Potential

Added by Xiang Li
Publication date: 2021
Field: Physics
Language: English





Deep learning techniques have opened a new avenue for electronic structure theory in recent years. In contrast to traditional methods, deep neural networks provide a much more expressive and flexible wave function ansatz, resulting in better accuracy and time-scaling behavior. To study larger systems while retaining sufficient accuracy, we integrate a powerful neural-network-based model (FermiNet) with the effective core potential method, which reduces the complexity of the problem by replacing the inner core electrons with additional semi-local potential terms in the Hamiltonian. In this work, we calculate the ground-state energies of 3d transition metal atoms and their monoxides, which are quite challenging for the original FermiNet, and the results are in good agreement with both experimental data and other state-of-the-art computational methods. Our development is an important step toward broader application of deep learning in the electronic structure calculation of molecules and materials.
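For context, the standard semi-local ECP takes the following form (the specific parametrization used in the paper is not reproduced here): a local radial part plus angular-momentum projectors around each core,

```latex
V_{\mathrm{ECP}}(r) \;=\; V_{\mathrm{loc}}(r)
  \;+\; \sum_{l=0}^{l_{\max}} V_l(r) \sum_{m=-l}^{l} \lvert lm \rangle\langle lm \rvert
```

The projectors act only on the angular components of the valence wave function around each core, which is what makes the potential semi-local; its expectation value over a flexible ansatz such as FermiNet is typically evaluated by numerical quadrature over rotations.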



Related research

We extend the range-separated double-hybrid RSH+MP2 method [J. G. Angyan et al., Phys. Rev. A 72, 012510 (2005)], combining long-range HF exchange and MP2 correlation with a short-range density functional, to a fully self-consistent version using the optimized-effective-potential technique, in which the orbitals are obtained from a local potential including the long-range HF and MP2 contributions. We test this approach, which we name RS-OEP2, on a set of small closed-shell atoms and molecules. For the commonly used value of the range-separation parameter $\mu = 0.5$ bohr$^{-1}$, we find that self-consistency does not seem to bring any improvement for total energies, ionization potentials, and electron affinities. However, contrary to the non-self-consistent RSH+MP2 method, the present RS-OEP2 method gives a LUMO energy that physically corresponds to a neutral excitation energy, and gives local exchange-correlation potentials that are reasonably good approximations to the corresponding Kohn-Sham quantities. At a finer scale, we find that RS-OEP2 gives largely inaccurate correlation potentials and correlated densities, which points to the need for further improvement of this type of range-separated double hybrid.
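For reference, range-separated hybrids of this type split the Coulomb interaction with the error function (the standard choice, with $\mu$ the range-separation parameter quoted above):

```latex
\frac{1}{r_{12}} \;=\;
  \underbrace{\frac{\operatorname{erf}(\mu\, r_{12})}{r_{12}}}_{\text{long range: HF + MP2}}
  \;+\;
  \underbrace{\frac{\operatorname{erfc}(\mu\, r_{12})}{r_{12}}}_{\text{short range: density functional}}
```

Larger $\mu$ shifts more of the interaction into the wave-function (HF+MP2) part; $\mu \to 0$ recovers pure DFT and $\mu \to \infty$ recovers full-range HF+MP2.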
Yaolong Zhang, Ce Hu, Bin Jiang (2019)
We propose a simple but efficient and accurate machine learning (ML) model for developing high-dimensional potential energy surfaces. This so-called embedded atom neural network (EANN) approach is inspired by the well-known empirical embedded atom method (EAM) used in the condensed phase. It simply replaces the scalar embedded atom density in EAM with a density vector built from Gaussian-type orbitals, and represents the complex relationship between the embedded density vector and the atomic energy by neural networks. We demonstrate that the EANN approach is as accurate as several established ML models in representing both large molecular and extended periodic systems, yet with far fewer parameters and configurations. It is highly efficient because it implicitly contains three-body information without an explicit sum over the conventional, costly angular descriptors. With high accuracy and efficiency, EANN potentials can vastly accelerate molecular dynamics and spectroscopic simulations of complex systems at the ab initio level.
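The embedded-density idea can be sketched in a few lines. Everything below is illustrative rather than the authors' implementation: the Gaussian widths, network shape, and function names are assumptions, and the real EANN density uses Gaussian-type orbitals with angular components, whereas this sketch keeps only purely radial Gaussians.

```python
import numpy as np

def embedded_density(positions, i, widths):
    # Density vector for atom i: one Gaussian-smeared neighbor density
    # per width. Built from interatomic distances only, so it is
    # invariant to rigid translations and rotations.
    rij = np.linalg.norm(positions - positions[i], axis=1)
    rij = rij[rij > 1e-10]  # exclude atom i itself
    return np.array([np.sum(np.exp(-a * rij**2)) for a in widths])

def atomic_energy(rho, W1, b1, W2, b2):
    # Toy feed-forward NN mapping the density vector to an atomic energy;
    # the total energy is the sum of atomic contributions, as in EAM.
    h = np.tanh(rho @ W1 + b1)
    return float(h @ W2 + b2)
```

Because the descriptors are squared distances smeared by Gaussians, three-body (angular) information enters implicitly through the overlap of neighbor densities, which is the efficiency argument made above.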
Machine learning has revolutionized high-dimensional representations of molecular properties such as the potential energy. However, machine learning models targeting tensorial properties, which are rotationally covariant, remain scarce. Here, we propose tensorial neural network (NN) models to learn both tensorial response and transition properties, in which atomic coordinate vectors are multiplied by scalar NN outputs or their derivatives to preserve rotational covariance. This strategy keeps the structural descriptors rotationally invariant, so the resulting tensorial NN models are as efficient as their scalar counterparts. We validate the performance and universality of this approach by learning the response properties of water oligomers and liquid water, and the transition dipole moment of a model structural unit of proteins. The machine-learned tensorial models have enabled efficient simulations of vibrational spectra of liquid water and ultraviolet spectra of realistic proteins, promising feasible and accurate spectroscopic simulations for biomolecules and materials.
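The covariance trick described above can be demonstrated with a minimal sketch (the descriptor choice, network size, and names are illustrative assumptions): rotationally invariant scalars per atom are multiplied by the atomic coordinate vectors, so the summed vector output rotates together with the molecule.

```python
import numpy as np

def invariant_descriptors(positions):
    # Sorted interatomic distances per atom: invariant under rotations.
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    return np.sort(d, axis=1)[:, 1:]  # drop the zero self-distance

def scalar_nn(desc, W1, b1, W2, b2):
    # Tiny feed-forward net producing one invariant scalar per atom.
    h = np.tanh(desc @ W1 + b1)
    return h @ W2 + b2

def tensorial_output(positions, params):
    # Multiply invariant scalars by coordinate vectors and sum:
    # the result transforms covariantly with any rotation of the input.
    q = scalar_nn(invariant_descriptors(positions), *params)
    return (q[:, None] * positions).sum(axis=0)
```

Because only the scalar network is trained, this vector-valued model costs essentially the same as a scalar one, which is the efficiency claim in the abstract.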
One hidden yet important issue in developing neural network potentials (NNPs) is the choice of training algorithm. Here we compare the performance of two popular training algorithms, the adaptive moment estimation algorithm (Adam) and the extended Kalman filter algorithm (EKF), using the Behler-Parrinello neural network (BPNN) and two publicly accessible datasets of liquid water. We find that NNPs trained with EKF are more transferable and less sensitive to the value of the learning rate than those trained with Adam. In both cases, error metrics on the test set do not always serve as a good indicator of the actual performance of NNPs. Instead, we show that their performance correlates well with a Fisher-information-based similarity measure.
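For reference, the Adam update that this comparison refers to fits in a few lines (this is standard Adam, not the study's BPNN training loop, which is not reproduced here):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Standard Adam: exponential moving averages of the gradient and its
    # square, with bias correction for the zero-initialized moments.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

The learning-rate sensitivity discussed above enters through `lr`; the EKF alternative instead maintains a covariance estimate over the weights, trading memory for robustness to that choice.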
The study of chemical reactions in aqueous media is very important for its implications in several fields of science, from biology to industrial processes. Modelling these reactions is, however, difficult when water participates directly in the reaction. Because a fully quantum mechanical description of the system is then required, ab initio molecular dynamics is the ideal candidate to shed light on these processes, but its scope is limited by a high computational cost. A popular alternative is to perform molecular dynamics simulations powered by machine learning potentials trained on an extensive set of quantum mechanical calculations. Doing so reliably for reactive processes is difficult because it requires including a great many intermediate and transition-state configurations. In this study, we used an active learning procedure accelerated by enhanced sampling to harvest such structures and to build a neural-network potential for studying the urea decomposition process in water. This allowed us to obtain the free energy profiles of this important reaction over a wide range of temperatures, to discover a number of novel metastable states, and to improve the accuracy of the kinetic rate calculations. Furthermore, we found that the formation of the zwitterionic intermediate is equally likely to occur via an acidic or a basic pathway, which could explain the insensitivity of the reaction rates to the solution pH.