
Ensemble Kalman Sampler: mean-field limit and convergence analysis

 Added by Zhiyan Ding
Publication date: 2019
Language: English





The Ensemble Kalman Sampler (EKS) is a method for drawing approximately i.i.d. samples from a target distribution. To date, why the algorithm works and how fast it converges have remained largely open questions. The continuous-time version of the algorithm is a system of coupled stochastic differential equations (SDEs). In this paper, we prove the well-posedness of the SDE system and justify that its mean-field limit is a Fokker-Planck equation whose long-time equilibrium is the target distribution. We further show that the convergence to the mean-field limit is near-optimal, at rate $J^{-1/2}$, where $J$ is the number of particles. These results, combined with the in-time convergence of the Fokker-Planck equation to its equilibrium, justify the validity of EKS and quantify its convergence rate as a sampling method.
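To make the objects in the abstract concrete, here is a schematic of the dynamics in notation of our own choosing (a common presentation for a target density $\pi \propto e^{-\Phi}$; the paper's equations may carry additional finite-$J$ correction terms). The coupled SDE system is a covariance-preconditioned Langevin dynamics for particles $X_t^1,\dots,X_t^J$,
$$ \mathrm{d}X_t^j = -\,C(\mathbf{X}_t)\,\nabla\Phi(X_t^j)\,\mathrm{d}t + \sqrt{2\,C(\mathbf{X}_t)}\;\mathrm{d}W_t^j, \qquad j=1,\dots,J, $$
where $C(\mathbf{X}_t)$ is the empirical covariance of the ensemble and the $W_t^j$ are independent Brownian motions. As $J \to \infty$, the particle density $\rho(t,x)$ formally solves the nonlinear, nonlocal Fokker-Planck equation
$$ \partial_t \rho = \nabla_x \cdot \Big( C(\rho)\,\big( \rho\,\nabla_x \Phi + \nabla_x \rho \big) \Big), \qquad C(\rho) = \operatorname{Cov}_{\rho}(x), $$
whose stationary state satisfies $\rho\,\nabla\Phi + \nabla\rho = 0$, i.e. $\rho_\infty \propto e^{-\Phi}$, the target distribution.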



Related research

In this work we marry multi-index Monte Carlo with ensemble Kalman filtering (EnKF) to produce the multi-index EnKF method (MIEnKF). The MIEnKF method is based on independent samples of four-coupled EnKF estimators on a multi-index hierarchy of resolution levels, and it may be viewed as an extension of the multilevel EnKF (MLEnKF) method developed by the same authors in 2020. Multi-index here refers to a two-index method, consisting of a hierarchy of EnKF estimators that are coupled in two degrees of freedom: time discretization and ensemble size. Under certain assumptions, the MIEnKF method is proven to be more tractable than EnKF and MLEnKF, and this is also verified in numerical examples.
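As a rough sketch of the multi-index idea (our notation, not the authors' exact construction): write $\mu_{\ell_1,\ell_2}$ for the EnKF quantity of interest computed with time step proportional to $2^{-\ell_1}$ and ensemble size proportional to $2^{\ell_2}$. The finest-level quantity telescopes into mixed first differences,
$$ \mu_{L_1,L_2} = \sum_{\ell_1=0}^{L_1} \sum_{\ell_2=0}^{L_2} \Delta_{\ell_1,\ell_2}, \qquad \Delta_{\ell_1,\ell_2} = \mu_{\ell_1,\ell_2} - \mu_{\ell_1-1,\ell_2} - \mu_{\ell_1,\ell_2-1} + \mu_{\ell_1-1,\ell_2-1}, $$
with the convention that terms carrying a negative index are zero. Estimating each mixed difference by averaging independent samples of four EnKF runs that share their driving randomness is what gives the four-coupled estimators mentioned above; the gains come from the mixed differences decaying in both indices simultaneously.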
We present a novel algorithm based on the ensemble Kalman filter to solve inverse problems involving multiscale elliptic partial differential equations. Our method is based on numerical homogenization and finite element discretization and allows us to recover a highly oscillatory tensor from measurements of the multiscale solution in a computationally inexpensive manner. The properties of the approximate solution are analysed with respect to the multiscale and discretization parameters, and a convergence result is shown to hold. A reinterpretation of the solution from a Bayesian perspective is provided, and convergence of the approximate conditional posterior distribution is proved with respect to the Wasserstein distance. A numerical experiment validates our methodology, with a particular emphasis on modelling error and computational cost.
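For orientation (generic notation, not necessarily the paper's), the forward model in this setting is a multiscale elliptic problem of the type
$$ -\nabla\cdot\big( A^{\varepsilon}(x)\,\nabla u^{\varepsilon} \big) = f \ \text{ in } D, \qquad u^{\varepsilon} = 0 \ \text{ on } \partial D, $$
where the tensor $A^{\varepsilon}$ oscillates on a fine scale $\varepsilon \ll 1$. Numerical homogenization replaces $A^{\varepsilon}$ by an effective coarse-scale tensor, so that each ensemble Kalman update only requires inexpensive coarse finite element solves rather than fully resolved multiscale ones.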
Zhiyan Ding, Qin Li, Jianfeng Lu (2020)
Ensemble Kalman Inversion (EnKI) and the Ensemble Square Root Filter (EnSRF) are popular sampling methods for approximating a target posterior distribution. They can be seen as one step (the analysis step) of the Ensemble Kalman Filter data assimilation method. Despite their popularity, they are not unbiased when the forward map is nonlinear. Importance sampling (IS), on the other hand, is unbiased but at the expense of a large variance in the weights, which leads to slow convergence of higher moments. We propose WEnKI and WEnSRF, weighted versions of EnKI and EnSRF.
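Schematically, the weighting can be understood through the generic importance-sampling identity (the general mechanism, not the specific weight formulas of WEnKI or WEnSRF): if the particles $x^1,\dots,x^J$ produced by the analysis step are distributed according to a proposal density $q$ rather than the posterior $\pi$, then attaching normalized weights
$$ w^j \;\propto\; \frac{\pi(x^j)}{q(x^j)}, \qquad \sum_{j=1}^{J} w^j = 1, $$
makes the weighted averages $\sum_j w^j\,\varphi(x^j)$ consistent estimators of posterior expectations $\mathbb{E}_{\pi}[\varphi]$ even when the forward map is nonlinear; the design goal is then to keep the variance of these weights small.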
This work develops a new multifidelity ensemble Kalman filter (MFEnKF) algorithm based on the linear control variate framework. The approach allows for rigorous multifidelity extensions of the EnKF, in which the uncertainty in coarser fidelities in the hierarchy of models serves as a control variate for the uncertainty in finer fidelities. Small ensembles of high fidelity model runs are complemented by larger ensembles of cheaper, lower fidelity runs to obtain much improved analyses at only small additional computational cost. We investigate the use of reduced order models as coarse fidelity control variates in the MFEnKF, and provide analyses to quantify the improvements over traditional ensemble Kalman filters. We apply these ideas to perform data assimilation on a quasi-geostrophic test problem, using direct numerical simulation and a corresponding POD-Galerkin reduced order model. Numerical results show that the two-fidelity MFEnKF provides better analyses than existing EnKF algorithms at comparable or reduced computational cost.
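The linear control variate mechanism underlying this (stated in generic notation, not the authors' exact operators): if $X$ is a high-fidelity random quantity and $U$ is a correlated low-fidelity surrogate whose mean $\mathbb{E}[U]$ can be estimated cheaply, then for any gain matrix $S$ the corrected variable
$$ Z = X - S\,\big( U - \mathbb{E}[U] \big) $$
has the same mean as $X$, and the choice $S \approx \operatorname{Cov}(X,U)\,\operatorname{Var}(U)^{-1}$ minimizes its variance. In the MFEnKF setting, a small ensemble of expensive high-fidelity runs plays the role of $X$, while a large ensemble of cheap reduced order model runs pins down the statistics of $U$.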
Ensemble filters implement sequential Bayesian estimation by representing the probability distribution by an ensemble mean and covariance. Unbiased square root ensemble filters use deterministic algorithms to produce an analysis (posterior) ensemble with prescribed mean and covariance, consistent with the Kalman update. This class includes several filters used in practice, such as the Ensemble Transform Kalman Filter (ETKF), the Ensemble Adjustment Kalman Filter (EAKF), and a filter by Whitaker and Hamill. We show that, at every time index, as the number of ensemble members increases to infinity, the mean and covariance of an unbiased ensemble square root filter converge to those of the Kalman filter, in the case of a linear model and an initial distribution all of whose moments exist. The convergence is in $L^{p}$ and the convergence rate does not depend on the model dimension. The result also holds in an infinite-dimensional Hilbert space.
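For reference, the Kalman update to which these square root filters converge, written for forecast mean $m^{f}$, forecast covariance $P^{f}$, linear observation operator $H$, observation $y$, and observation noise covariance $R$:
$$ K = P^{f} H^{\top}\big( H P^{f} H^{\top} + R \big)^{-1}, \qquad m^{a} = m^{f} + K\,\big( y - H m^{f} \big), \qquad P^{a} = (I - K H)\,P^{f}. $$
The result above states that, at each assimilation step, the ensemble mean and covariance of an unbiased square root filter converge in $L^{p}$ to $(m^{a}, P^{a})$ as the ensemble size grows, at a rate independent of the model dimension.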
