Various methods have been proposed for the nonlinear filtering problem, including the extended Kalman filter (EKF), iterated extended Kalman filter (IEKF), unscented Kalman filter (UKF) and iterated unscented Kalman filter (IUKF). In this paper two new nonlinear Kalman filters are proposed and investigated, namely the observation-centered extended Kalman filter (OCEKF) and observation-centered unscented Kalman filter (OCUKF). Although the UKF and EKF are common default choices for nonlinear filtering, there are situations in which they are poor choices. Examples are given where the EKF and UKF perform very poorly, and the IEKF and OCEKF perform well. In addition, the IUKF and OCUKF are generally similar to the IEKF and OCEKF, and also perform well, though care is needed in the choice of tuning parameters when the observation error is small. The reasons for this behaviour are explored in detail.
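To make the contrast between the standard and iterated updates concrete, the following minimal numpy sketch shows a single EKF measurement update alongside an IEKF update that relinearises the observation function at each iterate. The function names (ekf_update, iekf_update, h, H_jac) are illustrative, and the observation-centered variants proposed in the paper are not reproduced here.

```python
import numpy as np

def ekf_update(x, P, y, h, H_jac, R):
    """One EKF measurement update: linearise h at the prior mean x."""
    H = H_jac(x)
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_post = x + K @ (y - h(x))
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

def iekf_update(x, P, y, h, H_jac, R, n_iter=10):
    """Iterated EKF: relinearise h at the current posterior iterate,
    which for Gaussian noise is a Gauss-Newton step on the MAP cost."""
    xi = x.copy()
    for _ in range(n_iter):
        H = H_jac(xi)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        # the extra term H @ (x - xi) accounts for linearising at xi, not x
        xi = x + K @ (y - h(xi) - H @ (x - xi))
    P_post = (np.eye(len(x)) - K @ H) @ P
    return xi, P_post
```

The moving linearisation point in the iterated update is what distinguishes it from the plain EKF, which linearises only once at the prior mean.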
A new type of ensemble Kalman filter is developed, which is based on replacing the sample covariance in the analysis step by its diagonal in a spectral basis. It is proved that this technique improves the approximation of the covariance when the covariance itself is diagonal in the spectral basis, as is the case, e.g., for a second-order stationary random field and the Fourier basis. The method is extended by wavelets to the case in which the state variables are random fields that are not spatially homogeneous. Efficient implementations by the fast Fourier transform (FFT) and discrete wavelet transform (DWT) are presented for several types of observations, including high-dimensional data given on a part of the domain, such as radar and satellite images. Computational experiments confirm that the method performs well on the Lorenz 96 problem and the shallow water equations with very small ensembles and over multiple analysis cycles.
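A minimal sketch of the core idea, assuming a one-dimensional state on a periodic domain and numpy's FFT: the ensemble perturbations are transformed into the Fourier basis, only the per-mode variances are kept, and the resulting diagonal approximation of the covariance is applied to a vector. The function name spectral_diag_cov_action is hypothetical, and the paper's full analysis step and wavelet (DWT) extension are not shown.

```python
import numpy as np

def spectral_diag_cov_action(X, v):
    """Approximate the sample covariance of ensemble X (n x N) by its
    diagonal in the Fourier basis and return (approximate covariance) @ v."""
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)              # ensemble perturbations
    A_hat = np.fft.fft(A, axis=0) / np.sqrt(n)         # unitary FFT of columns
    d = np.sum(np.abs(A_hat) ** 2, axis=1) / (N - 1)   # per-mode sample variances
    v_hat = np.fft.fft(v) / np.sqrt(n)
    Cv_hat = d * v_hat                                  # diagonal covariance action
    return np.real(np.fft.ifft(Cv_hat) * np.sqrt(n))
```

Keeping only the diagonal suppresses the spurious cross-mode terms that a small ensemble cannot estimate reliably, which is the mechanism behind the improved approximation claimed above.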
We consider the robust filtering problem for a nonlinear state-space model with outliers in the measurements. To improve the robustness of the traditional Kalman filtering algorithm, we propose in this work two robust filters based on mixture correntropy, namely the double-Gaussian mixture correntropy and the Laplace-Gaussian mixture correntropy. We formulate the robust filtering problem by replacing the quadratic cost on measurement fitting errors in the conventional Kalman filter with the mixture-correntropy-induced cost. In addition, a tradeoff weight coefficient is introduced to ensure that the proposed approaches provide reasonable state estimates in scenarios where measurement fitting errors are small. The formulated robust filtering problems are iteratively solved by utilizing the cubature Kalman filtering framework with a reweighted measurement covariance. Numerical results show that the proposed methods achieve a performance improvement over existing robust solutions.
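As a rough illustration of the reweighting idea, the sketch below evaluates a double-Gaussian mixture correntropy kernel at a measurement residual: the weight is near one for small residuals and decays towards zero for outliers, and dividing the measurement covariance by this weight effectively down-weights an outlying observation in the next iteration. The function name, parameter values, and the exact form of the weight are assumptions for illustration, not the paper's precise formulation.

```python
import numpy as np

def mixture_correntropy_weight(e, sigma1=1.0, sigma2=5.0, alpha=0.5):
    """Double-Gaussian mixture correntropy kernel evaluated at residual e.
    Small residuals give a weight near one; large residuals (outliers)
    give a weight near zero."""
    e2 = np.sum(np.asarray(e) ** 2)
    return (alpha * np.exp(-e2 / (2 * sigma1 ** 2))
            + (1 - alpha) * np.exp(-e2 / (2 * sigma2 ** 2)))

# In an iterative update one would use R_eff = R / w, where
# w = mixture_correntropy_weight(residual), so that outlying
# measurements contribute less to the state estimate.
```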
Kalman filters are one of the most influential models of time-varying phenomena. They admit an intuitive probabilistic interpretation, have a simple functional form, and enjoy widespread adoption in a variety of disciplines. Motivated by recent variational methods for learning deep generative models, we introduce a unified algorithm to efficiently learn a broad spectrum of Kalman filters. Of particular interest is the use of temporal generative models for counterfactual inference. To investigate the efficacy of such models, we introduce the Healing MNIST dataset, in which long-term structure, noise and actions are applied to sequences of digits, and we show the efficacy of our method for modeling this dataset. We further show how our model can be used for counterfactual inference for patients, based on electronic health record data of 8,000 patients over 4.5 years.
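The variational learning idea can be sketched for a single timestep: draw a reparameterised sample from the approximate posterior over the latent state, score the observation under a decoder, and subtract the KL divergence to the transition prior. The sketch below (plain numpy, with elbo_step and decode_mean as hypothetical stand-ins for the paper's neural networks) only shows the shape of the objective, not the actual model.

```python
import numpy as np

def kl_diag_gauss(mu_q, var_q, mu_p, var_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians."""
    return 0.5 * np.sum(np.log(var_p / var_q)
                        + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def elbo_step(x_t, mu_q, var_q, mu_prior, var_prior, decode_mean, obs_var):
    """One timestep of a variational bound for a latent state-space model:
    Monte Carlo reconstruction term minus KL to the transition prior.
    decode_mean stands in for a neural decoder mapping latent -> observation mean."""
    z = mu_q + np.sqrt(var_q) * np.random.randn(*mu_q.shape)  # reparameterised sample
    recon = -0.5 * np.sum((x_t - decode_mean(z)) ** 2 / obs_var
                          + np.log(2 * np.pi * obs_var))
    return recon - kl_diag_gauss(mu_q, var_q, mu_prior, var_prior)
```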
Data assimilation is concerned with sequentially estimating a temporally evolving state. This task, which arises in a wide range of scientific and engineering applications, is particularly challenging when the state is high-dimensional and the state-space dynamics are unknown. This paper introduces a machine learning framework for learning dynamical systems in data assimilation. Our auto-differentiable ensemble Kalman filters (AD-EnKFs) blend ensemble Kalman filters for state recovery with machine learning tools for learning the dynamics. In doing so, AD-EnKFs leverage the ability of ensemble Kalman filters to scale to high-dimensional states and the power of automatic differentiation to train high-dimensional surrogate models for the dynamics. Numerical results using the Lorenz-96 model show that AD-EnKFs outperform existing methods that use expectation-maximization or particle filters to merge data assimilation and machine learning. In addition, AD-EnKFs are easy to implement and require minimal tuning.
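A minimal numpy sketch of the quantity AD-EnKFs differentiate: a stochastic EnKF is run with a parameterised surrogate for the dynamics, and the Gaussian innovation log-likelihood is accumulated over analysis cycles. In the actual method this scalar is backpropagated through with automatic differentiation; here it is only computed, and the function and argument names (enkf_loglik, surrogate_step, theta) are illustrative assumptions.

```python
import numpy as np

def enkf_loglik(theta, surrogate_step, y_seq, X0, H, R, rng):
    """Run a stochastic EnKF with a parameterised surrogate model and return
    the accumulated innovation log-likelihood. AD-EnKFs differentiate this
    scalar with respect to theta; a crude stand-in would be finite differences."""
    X = X0.copy()                                 # ensemble, shape (n, N)
    n, N = X.shape
    m = H.shape[0]
    ll = 0.0
    for y in y_seq:
        X = surrogate_step(theta, X)              # forecast with the surrogate
        mu = X.mean(axis=1, keepdims=True)
        A = X - mu
        P = A @ A.T / (N - 1)                     # explicit P is fine for a sketch only
        S = H @ P @ H.T + R
        innov = y - (H @ mu).ravel()
        ll += -0.5 * (innov @ np.linalg.solve(S, innov)
                      + np.log(np.linalg.det(S)) + m * np.log(2 * np.pi))
        # stochastic EnKF analysis with perturbed observations
        K = P @ H.T @ np.linalg.inv(S)
        Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
        X = X + K @ (Y - H @ X)
    return ll
```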
A data-driven method for improving the correlation estimation in serial ensemble Kalman filters is introduced. The method finds a linear map that transforms, at each assimilation cycle, the poorly estimated sample correlation into an improved correlation. This map is obtained from an offline training procedure, without any tuning, as the solution of a linear regression problem that uses appropriate sample correlation statistics obtained from historical data assimilation products. In an idealized observing system simulation experiment (OSSE) with the Lorenz-96 model and for a range of linear and nonlinear observation models, the proposed scheme improves the filter estimates, especially when the ensemble size is small relative to the dimension of the state space.
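A sketch of one plausible instantiation, assuming a scalar linear regression from historical sample correlations to reference correlations (the paper's regression design may differ): an offline fit of the map, followed by its online application to a newly computed sample correlation, clipped back into [-1, 1].

```python
import numpy as np

def fit_correlation_map(r_sample_hist, r_ref_hist):
    """Offline training: least-squares fit of r_ref ~ a * r_sample + b from
    historical pairs of small-ensemble sample correlations and reference
    correlations (e.g. taken from archived assimilation products)."""
    A = np.column_stack([r_sample_hist, np.ones_like(r_sample_hist)])
    coef, *_ = np.linalg.lstsq(A, r_ref_hist, rcond=None)
    return coef                                   # (a, b)

def improve_correlation(r_sample, coef):
    """Online: map the poorly estimated sample correlation to an improved
    one, clipped to remain a valid correlation value."""
    a, b = coef
    return np.clip(a * r_sample + b, -1.0, 1.0)
```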