
Symplectic Gaussian Process Regression of Hamiltonian Flow Maps

Added by Katharina Rath
Publication date: 2020
Language: English





We present an approach to construct appropriate and efficient emulators for Hamiltonian flow maps. Intended future applications are long-term tracing of fast charged particles in accelerators and magnetic plasma confinement configurations. The method is based on multi-output Gaussian process regression on scattered training data. To obtain long-term stability, the symplectic property is enforced via the choice of the matrix-valued covariance function. Building on earlier work on spline interpolation, we observe derivatives of the generating function of a canonical transformation. Within this approach, a product kernel produces an accurate implicit method, whereas a sum kernel results in a fast explicit method; both correspond to a symplectic Euler method in terms of numerical integration. These methods are applied to the pendulum and the Hénon-Heiles system, and the results are compared to a spectral regression with orthogonal polynomials. In the limit of small mapping times, the Hamiltonian function can be identified with a part of the generating function and thereby learned from observed time-series data of the system's evolution. Besides comparable performance of implicit kernel and spectral regression for symplectic maps, we demonstrate a substantial increase in performance for learning the Hamiltonian function compared to existing approaches.
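The symplectic Euler correspondence mentioned in the abstract can be illustrated directly. The following is a minimal numerical sketch (not the paper's GP emulator, and all step sizes and initial conditions are illustrative choices) for the pendulum with H(q, p) = p²/2 − cos q:

```python
import numpy as np

def symplectic_euler(q, p, h, n_steps):
    """Explicit symplectic Euler for the pendulum H(q, p) = p**2/2 - cos(q)."""
    traj = np.empty((n_steps + 1, 2))
    traj[0] = q, p
    for k in range(n_steps):
        p = p - h * np.sin(q)   # p_{n+1} = p_n - h * dH/dq(q_n)
        q = q + h * p           # q_{n+1} = q_n + h * dH/dp(p_{n+1})
        traj[k + 1] = q, p
    return traj

traj = symplectic_euler(q=1.0, p=0.0, h=0.1, n_steps=10_000)
energy = traj[:, 1] ** 2 / 2 - np.cos(traj[:, 0])
# The energy error of a symplectic method stays bounded over long times.
print(float(np.ptp(energy)))
```

Because the map is symplectic, the energy oscillates within a band of width O(h) instead of drifting, which is the long-term stability property that the matrix-valued covariance function is constructed to preserve.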



Related research

Optical scatterometry is a method to measure the size and shape of periodic micro- or nanostructures on surfaces. For this purpose the geometry parameters of the structures are obtained by reproducing experimental measurement results through numerical simulations. We compare the performance of Bayesian optimization to different local minimization algorithms for this numerical optimization problem. Bayesian optimization uses Gaussian-process regression to find promising parameter values. We examine how pre-computed simulation results can be used to train the Gaussian process and to accelerate the optimization.
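A minimal sketch of the surrogate loop described here, assuming a toy 1-D objective, an RBF kernel with hand-picked hyperparameters, and a lower-confidence-bound acquisition (the abstract does not specify the actual kernel or acquisition function):

```python
import numpy as np

def rbf(a, b, ell=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-8):
    """Posterior mean and variance of a zero-mean GP with an RBF kernel."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = 1.0 - np.sum(Ks * v, axis=0)
    return mean, np.maximum(var, 0.0)

objective = lambda x: (x - 0.7) ** 2          # stand-in for the simulation
x_train = np.array([0.0, 0.25, 0.5, 1.0])     # "pre-computed" evaluations
y_train = objective(x_train)

x_query = np.linspace(0.0, 1.0, 201)
mean, var = gp_posterior(x_train, y_train, x_query)
lcb = mean - 2.0 * np.sqrt(var)               # explore-exploit acquisition
x_next = x_query[np.argmin(lcb)]              # next simulation to run
print(x_next)
```

Pre-computed simulation results enter simply as additional rows of `x_train`/`y_train`, so the surrogate starts out informed rather than flat.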
Vibrational properties of molecular crystals are commonly used as structural fingerprints, in order to identify both the chemical nature and the structural arrangement of molecules. The simulation of these properties is typically very costly, especially when dealing with response properties of materials to e.g. electric fields, which require a good description of the perturbed electronic density. In this work, we use Gaussian process regression (GPR) to predict the static polarizability and dielectric susceptibility of molecules and molecular crystals. We combine this framework with ab initio molecular dynamics to predict their anharmonic vibrational Raman spectra. We stress the importance of data representation, symmetry, and locality, by comparing the performance of different flavors of GPR. In particular, we show the advantages of using a recently developed symmetry-adapted version of GPR. As an exemplary application, we choose Paracetamol as an isolated molecule and in different crystal forms. We obtain accurate vibrational Raman spectra in all cases with less than 1000 training points, and obtain improvements when using a GPR trained on the molecular monomer as a baseline for the crystal GPR models. Finally, we show that our methodology is transferable across polymorphic forms: we can train the model on data for one structure, and still be able to accurately predict the spectrum for a second polymorph. This procedure provides an independent route to access electronic structure properties when performing force-evaluations on empirical force-fields or machine-learned potential energy surfaces.
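The monomer-as-baseline strategy mentioned above follows a general delta-learning pattern, sketched here on toy 1-D data (the functions, kernel settings, and data sizes are invented for illustration): fit a GP on plentiful cheap data, then a second GP on the residual of the scarce expensive data.

```python
import numpy as np

def rbf(a, b, ell=0.2):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_fit(x, y, noise=1e-6):
    K = rbf(x, x) + noise * np.eye(len(x))
    return x, np.linalg.solve(K, y)

def gp_predict(model, xq):
    xt, alpha = model
    return rbf(xq, xt) @ alpha

f_base = lambda x: np.sin(4 * x)                # cheap "monomer" property
f_true = lambda x: np.sin(4 * x) + 0.3 * x      # expensive "crystal" property

x_cheap = np.linspace(0.0, 1.0, 30)             # plentiful baseline data
x_exp = np.linspace(0.0, 1.0, 6)                # scarce target data

base = gp_fit(x_cheap, f_base(x_cheap))
# Second GP learns only the (smoother, smaller) baseline-to-target residual.
resid = gp_fit(x_exp, f_true(x_exp) - gp_predict(base, x_exp))

xq = np.linspace(0.0, 1.0, 50)
y_hat = gp_predict(base, xq) + gp_predict(resid, xq)
print(float(np.max(np.abs(y_hat - f_true(xq)))))
```

The residual model needs far fewer points than a direct fit of the target would, because the baseline already captures most of the variation.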
We introduce Latent Gaussian Process Regression which is a latent variable extension allowing modelling of non-stationary multi-modal processes using GPs. The approach is built on extending the input space of a regression problem with a latent variable that is used to modulate the covariance function over the training data. We show how our approach can be used to model multi-modal and non-stationary processes. We exemplify the approach on a set of synthetic data and provide results on real data from motion capture and geostatistics.
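The input-space extension can be sketched as follows (the data and kernel settings are assumptions for illustration): each training point receives a latent coordinate w, and a single stationary RBF kernel on the augmented input (x, w) then separates two modes that overlap in x.

```python
import numpy as np

def rbf(A, B, ell=0.3):
    # stationary RBF kernel on the augmented inputs (x, w)
    d2 = ((A[:, None, :] - B[None, :, :]) / ell) ** 2
    return np.exp(-0.5 * d2.sum(-1))

x = np.tile(np.linspace(0.0, 1.0, 10), 2)
w = np.repeat([0.0, 3.0], 10)          # latent coordinate per mode
y = np.where(w == 0.0, np.sin(6 * x), -np.sin(6 * x))

X = np.column_stack([x, w])            # augmented input space
K = rbf(X, X) + 1e-6 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

# The same x = 0.3 queried in each mode gives opposite predictions:
f0 = float(rbf(np.array([[0.3, 0.0]]), X) @ alpha)
f1 = float(rbf(np.array([[0.3, 3.0]]), X) @ alpha)
print(f0, f1)
```

With a latent separation large relative to the lengthscale, the cross-mode kernel entries vanish and each mode is effectively regressed on its own, even though the kernel itself is stationary.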
In this paper we introduce a novel model for Gaussian process (GP) regression in the fully Bayesian setting. Motivated by the ideas of sparsification, localization and Bayesian additive modeling, our model is built around a recursive partitioning (RP) scheme. Within each RP partition, a sparse GP (SGP) regression model is fitted. A Bayesian additive framework then combines multiple layers of partitioned SGPs, capturing both global trends and local refinements with efficient computations. The model addresses both the problem of efficiency in fitting a full Gaussian process regression model and the problem of prediction performance associated with a single SGP. Our approach mitigates the issue of pseudo-input selection and avoids the need for complex inter-block correlations in existing methods. The crucial trade-off becomes choosing between many simpler local model components or fewer complex global model components, which the practitioner can sensibly tune. Implementation is via a Metropolis-Hastings Markov chain Monte Carlo algorithm with Bayesian back-fitting. We compare our model against popular alternatives on simulated and real datasets, and find the performance is competitive, while the fully Bayesian procedure enables the quantification of model uncertainties.
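A rough sketch of the partitioning idea, without the sparse-GP and Bayesian additive layers (all details here are illustrative assumptions): split the input range into blocks and fit an independent exact GP per block, replacing one large linear solve with several small ones.

```python
import numpy as np

def rbf(a, b, ell=0.1):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def fit_local_gps(x, y, edges, noise=1e-6):
    """Fit one independent GP per partition cell [lo, hi)."""
    models = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x < hi)
        K = rbf(x[m], x[m]) + noise * np.eye(int(m.sum()))
        models.append((lo, hi, x[m], np.linalg.solve(K, y[m])))
    return models

def predict(models, xq):
    for lo, hi, xt, alpha in models:
        if lo <= xq < hi:          # use the local model that owns xq
            return float(rbf(np.array([xq]), xt) @ alpha)

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 60))
y = np.sin(2 * np.pi * x)
models = fit_local_gps(x, y, edges=[0.0, 0.5, 1.0])
print(predict(models, 0.25))
```

Each block solve costs O(n_block³) rather than O(n³) for the full dataset; the paper's additive layers then smooth over the block boundaries that this naive version leaves visible.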
Gecheng Chen, Rui Tuo (2020)
A primary goal of computer experiments is to reconstruct the function given by the computer code via scattered evaluations. Traditional isotropic Gaussian process models suffer from the curse of dimensionality when the input dimension is high. Gaussian process models with additive correlation functions scale well with dimension, but they are very restrictive as they only work for additive functions. In this work, we consider a projection pursuit model, in which the nonparametric part is driven by an additive Gaussian process regression. The dimension of the additive function is chosen to be higher than the original input dimension. We show that this dimension expansion can help approximate more complex functions. A gradient descent algorithm is proposed to maximize the likelihood function. Simulation studies show that the proposed method outperforms the traditional Gaussian process models.
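The additive-kernel structure with dimension expansion can be sketched as follows (a toy illustration with a fixed random projection matrix, whereas the paper optimizes the projections by gradient descent): the kernel is a sum of one-dimensional RBF kernels over projected coordinates z = Wᵀx, with more projections than input dimensions.

```python
import numpy as np

def additive_rbf(A, B, ell=1.0):
    # sum of one-dimensional RBF kernels, one per projected coordinate
    return sum(
        np.exp(-0.5 * ((A[:, j][:, None] - B[:, j][None, :]) / ell) ** 2)
        for j in range(A.shape[1])
    )

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 2))   # 2-D inputs
W = rng.standard_normal((2, 4))            # expand to 4 projected directions
Z = X @ W
y = np.sin(Z[:, 0]) + Z[:, 2] ** 2         # additive in the projections

K = additive_rbf(Z, Z) + 1e-6 * np.eye(len(Z))
alpha = np.linalg.solve(K, y)
fit = additive_rbf(Z, Z) @ alpha           # in-sample reconstruction
print(float(np.max(np.abs(fit - y))))
```

A function additive in the projected coordinates lies in the model class even though it is not additive in the original inputs, which is the point of expanding the dimension.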
