We present a novel approach aimed at high-performance uncertainty quantification for time-dependent problems governed by partial differential equations. In particular, we consider input uncertainties described by a Karhunen-Loève expansion and compute statistics of high-dimensional quantities of interest, such as the cardiac activation potential. Our methodology relies on a close integration of multilevel Monte Carlo methods, parallel iterative solvers, and a space-time discretization. This combination allows for space-time adaptivity, time-changing domains, and the reuse of past samples to initialize the space-time solution. The resulting sequence of problems is distributed using a multilevel parallelization strategy, allocating batches of samples of different sizes to different numbers of processors. We assess the performance of the proposed framework by showing in detail its application to the solution of nonlinear equations arising from cardiac electrophysiology. Specifically, we study the effect of spatially correlated perturbations of the heart fiber conductivities on the mean and variance of the resulting activation map. As shown by the experiments, the theoretical rates of convergence of multilevel Monte Carlo are achieved. Moreover, the total computational work for a prescribed accuracy is reduced by an order of magnitude with respect to standard Monte Carlo methods.
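For orientation, a truncated Karhunen-Loève expansion of a random input field is typically written as (generic notation, not taken verbatim from the paper)
$$ a(x,\omega) \,=\, \bar{a}(x) + \sum_{k=1}^{M} \sqrt{\lambda_k}\,\phi_k(x)\,\xi_k(\omega), $$
where $(\lambda_k,\phi_k)$ are the leading eigenpairs of the covariance operator of the field and the $\xi_k$ are uncorrelated random variables; in the setting above, such an expansion models the spatially correlated perturbations of the fiber conductivities.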
We present a novel approach that aims at high-performance uncertainty quantification for cardiac electrophysiology simulations. Employing the monodomain equation to model the transmembrane potential inside the cardiac cells, we evaluate the effect of spatially correlated perturbations of the heart fibers on the statistics of the resulting quantities of interest. Our methodology relies on a close integration of multilevel quadrature methods, parallel iterative solvers, and space-time finite element discretizations, allowing for a fully parallelized framework in space, time, and stochastics. Extensive numerical studies are presented to evaluate convergence rates and to compare the performance of classical Monte Carlo methods, such as standard Monte Carlo (MC) and quasi-Monte Carlo (QMC), with multilevel strategies, i.e., multilevel Monte Carlo (MLMC) and multilevel quasi-Monte Carlo (MLQMC), on hierarchies of nested meshes. Finally, we employ a recently suggested variant of the multilevel approach for non-nested meshes to deal with a realistic heart geometry.
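For background, the monodomain model referenced above is commonly stated as (standard notation, not quoted from the paper)
$$ \chi\left( C_m\,\partial_t v + I_{\mathrm{ion}}(v,w) \right) \,=\, \nabla\cdot\left( \sigma\,\nabla v \right) + I_{\mathrm{app}}, $$
where $v$ is the transmembrane potential, $w$ collects the state variables of the ionic model, $\sigma$ is the conductivity tensor determined by the fiber orientation, $\chi$ is the membrane surface-to-volume ratio, $C_m$ the membrane capacitance, and $I_{\mathrm{app}}$ an applied stimulus current; the uncertain fiber field enters through $\sigma$.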
This article reviews the application of advanced Monte Carlo techniques in the context of multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations that can be biased in some sense, for instance, through the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations that become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method reduces the computational cost for a given level of error relative to i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from coupled pairs in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.
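Concretely, the telescoping representation underlying MLMC writes the expectation on the finest level $L$ as
$$ \mathbb{E}[P_L] \,=\, \mathbb{E}[P_0] + \sum_{\ell=1}^{L} \mathbb{E}[P_\ell - P_{\ell-1}], $$
and estimates each term with an independent Monte Carlo average; the savings come from the small variance of the coupled differences $P_\ell - P_{\ell-1}$, which is exactly what becomes delicate when exact sampling from such coupled pairs is unavailable, as discussed in this article. (This is the standard formulation rather than a statement specific to the article.)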
In this work we develop a new hierarchical multilevel approach to generate Gaussian random field realizations in an algorithmically scalable manner that is well-suited for incorporation into multilevel Markov chain Monte Carlo (MCMC) algorithms. This approach builds on other partial differential equation (PDE) approaches for generating Gaussian random field realizations; in particular, a single field realization may be formed by solving a reaction-diffusion PDE with a spatial white noise source function as the right-hand side. While these approaches have been explored to accelerate forward uncertainty quantification tasks, e.g. multilevel Monte Carlo, the previous constructions are not directly applicable to multilevel MCMC frameworks, which build fine-scale random fields in a hierarchical fashion from coarse-scale random fields. Our new hierarchical multilevel method relies on a hierarchical decomposition of the white noise source function in $L^2$, which allows us to form Gaussian random field realizations across multiple levels of discretization in a way that fits into multilevel MCMC algorithmic frameworks. After presenting our main theoretical results and numerical scaling results that showcase the utility of this new hierarchical PDE method for generating Gaussian random field realizations, we test the method on a four-level MCMC algorithm to explore its feasibility.
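The PDE-based construction referred to above is often written, in its simplest form, as a Whittle-Matérn-type equation
$$ (\kappa^2 - \Delta)^{\alpha/2}\, u(x,\omega) \,=\, \mathcal{W}(x,\omega), \qquad x \in D, $$
with spatial white noise $\mathcal{W}$ on the right-hand side, whose solution $u$ is a Gaussian random field with Matérn-type covariance; each PDE solve yields one field realization on the given mesh. (This is a commonly used form of the construction; the exact operator and boundary treatment used in the paper may differ.)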
We propose a novel $hp$-multilevel Monte Carlo method for the quantification of uncertainties in the compressible Navier-Stokes equations, using the Discontinuous Galerkin method as the deterministic solver. The multilevel approach exploits hierarchies of uniformly refined meshes while simultaneously increasing the polynomial degree of the ansatz space. It allows for a very large range of resolutions in the physical space and thus an efficient decrease of the statistical error. We prove that the overall complexity of the $hp$-multilevel Monte Carlo method to compute the mean field with prescribed accuracy is, in the best case, of quadratic order with respect to the accuracy. We also propose a novel and simple approach to estimate a lower confidence bound for the optimal number of samples per level, which helps to prevent overestimating these quantities. The method is in particular designed for application on queue-based computing systems, where it is desirable to compute a large number of samples during one iteration without overestimating the optimal number of samples. Our theoretical results are verified by numerical experiments for the two-dimensional compressible Navier-Stokes equations. In particular, we consider a cavity flow problem from computational acoustics, demonstrating that the method is suitable to handle complex engineering problems.
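As context for the sample-number estimation mentioned above, the classical MLMC allocation chooses the number of samples on level $\ell$ as
$$ N_\ell \,=\, \left\lceil \frac{2}{\varepsilon^2}\,\sqrt{\frac{V_\ell}{C_\ell}}\;\sum_{k=0}^{L} \sqrt{V_k\,C_k} \right\rceil, $$
where $V_\ell$ and $C_\ell$ are the variance and cost of a single sample of the level-$\ell$ difference and $\varepsilon$ is the target accuracy; since $V_\ell$ must itself be estimated from samples, the lower confidence bound proposed in the paper guards against requesting too many samples in a single batch. (The formula is the standard one, not the paper's specific variant.)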
Monte Carlo algorithms have a growing impact on nuclear medicine reconstruction processes. One of the main limitations of myocardial perfusion imaging (MPI) is the effective mitigation of the scattering component, which is particularly challenging in Single Photon Emission Computed Tomography (SPECT). In SPECT, no timing information can be retrieved to locate the primary source photons. Monte Carlo methods allow an event-by-event simulation of the scattering kinematics, which can be incorporated into a model of the imaging system response. This approach has been adopted by several authors since the late 1990s and has recently taken advantage of the increased computational power made available by high-performance CPUs and GPUs. These recent developments enable fast image reconstruction with improved image quality compared to deterministic approaches, which are based on energy windowing of the detector response and on the cumulative estimation and subtraction of the scattering component. In this paper, we review the main strategies and algorithms to correct for the scattering effect in SPECT and focus on Monte Carlo developments, which nowadays allow the three-dimensional reconstruction of SPECT cardiac images in a few seconds.