We propose a general --- i.e., independent of the underlying equation --- registration method for parameterized Model Order Reduction. Given the spatial domain $\Omega \subset \mathbb{R}^d$ and a set of snapshots $\{ u^k \}_{k=1}^{n_{\rm train}}$ over $\Omega$ associated with $n_{\rm train}$ values of the model parameters $\mu^1,\ldots, \mu^{n_{\rm train}} \in \mathcal{P}$, the algorithm returns a parameter-dependent bijective mapping $\boldsymbol{\Phi}: \Omega \times \mathcal{P} \to \mathbb{R}^d$: the mapping is designed to make the mapped manifold $\{ u_{\mu} \circ \boldsymbol{\Phi}_{\mu} : \, \mu \in \mathcal{P} \}$ better suited to linear compression methods. We apply the registration procedure, in combination with a linear compression method, to devise low-dimensional representations of solution manifolds with slowly decaying Kolmogorov $N$-widths; we also consider the application to problems in parameterized geometries. We present a theoretical result that establishes the mathematical rigor of the registration procedure, and we present numerical results for several two-dimensional problems to empirically demonstrate the effectiveness of our proposal.
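As a purely illustrative sketch (not the authors' algorithm), the following Python snippet shows, for a one-dimensional family of profiles with a moving front, how composing the snapshots with a simple parameter-dependent mapping $\Phi_{\mu}(x) = x + \mu$ (here assumed known rather than computed by the registration procedure) restores fast singular-value decay, which is the effect registration seeks for general problems.

```python
# Illustrative only: effect of a (known) registration mapping on linear compressibility.
# The registration procedure learns Phi_mu from snapshots; here the shift is assumed given.
import numpy as np

x = np.linspace(0.0, 1.0, 400)                 # spatial grid on Omega = (0, 1)
mus = np.linspace(0.0, 0.5, 50)                # training parameters mu^1, ..., mu^{n_train}

profile = lambda s: np.tanh(50.0 * s)          # steep front located at s = 0

# Unregistered snapshots: the front moves with mu -> slow singular-value decay.
U = np.stack([profile(x - 0.25 - mu) for mu in mus], axis=1)

# Registered snapshots u_mu(Phi_mu(x)): the mapping pulls the front back to a fixed location.
U_reg = np.stack([profile(x - 0.25) for _ in mus], axis=1)

s = np.linalg.svd(U, compute_uv=False)
s_reg = np.linalg.svd(U_reg, compute_uv=False)
energy = lambda sv: np.cumsum(sv**2) / np.sum(sv**2)
print("modes for 1e-6 energy (unregistered):", np.searchsorted(energy(s), 1 - 1e-6) + 1)
print("modes for 1e-6 energy (registered):  ", np.searchsorted(energy(s_reg), 1 - 1e-6) + 1)
```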
The Kolmogorov $n$-width of the solution manifolds of transport-dominated problems can decay slowly. As a result, it can be challenging to design efficient and accurate reduced order models (ROMs) for such problems. To address this issue, we propose a new learning-based projection method to construct nonlinear adaptive ROMs for transport problems. The construction follows the offline-online decomposition. In the offline stage, we train a neural network to construct an adaptive reduced basis that depends on time and on the model parameters. In the online stage, we project the solution onto the learned reduced manifold. Inheriting the merits of both deep learning and the projection method, the proposed method is more efficient than conventional linear projection-based methods and may reduce the generalization error of a purely learning-based ROM. Unlike some learning-based projection methods, the proposed method does not require derivatives of the neural network in the online stage.
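A minimal sketch of the online stage is given below; `basis_net` is a hypothetical stand-in for the trained network (untrained random weights, used only to fix shapes), since the offline training itself is problem-dependent. The point illustrated is that the online stage only evaluates the network and solves a small projection problem, with no differentiation through the network.

```python
# Sketch of the online stage: evaluate a learned, (t, mu)-dependent basis and
# project the full-order state onto it.  'basis_net' is a hypothetical stand-in
# for the trained network; no derivatives of the network are taken online.
import numpy as np

N, n = 500, 8                                   # full-order and reduced dimensions
rng = np.random.default_rng(0)
W1, W2 = rng.standard_normal((32, 2)), rng.standard_normal((N * n, 32))

def basis_net(t, mu):
    """Hypothetical map (t, mu) -> orthonormal adaptive basis V in R^{N x n}."""
    h = np.tanh(W1 @ np.array([t, mu]))
    V = (W2 @ h).reshape(N, n)
    Q, _ = np.linalg.qr(V)                      # orthonormalize the predicted basis
    return Q

def online_project(u_full, t, mu):
    """Project a full-order state onto the learned reduced manifold at (t, mu)."""
    V = basis_net(t, mu)
    a = V.T @ u_full                            # reduced coordinates (V orthonormal)
    return V @ a                                # reconstructed state

u = rng.standard_normal(N)                      # placeholder full-order state
u_rom = online_project(u, t=0.3, mu=1.2)
print("relative projection error:", np.linalg.norm(u - u_rom) / np.linalg.norm(u))
```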
We present a general -- i.e., independent of the underlying equation -- registration procedure for parameterized model order reduction. Given the spatial domain $\Omega \subset \mathbb{R}^2$ and the manifold $\mathcal{M} = \{ u_{\mu} : \mu \in \mathcal{P} \}$ associated with the parameter domain $\mathcal{P} \subset \mathbb{R}^P$ and the parametric field $\mu \mapsto u_{\mu} \in L^2(\Omega)$, our approach takes as input a set of snapshots $\{ u^k \}_{k=1}^{n_{\rm train}} \subset \mathcal{M}$ and returns a parameter-dependent bijective mapping $\Phi: \Omega \times \mathcal{P} \to \mathbb{R}^2$: the mapping is designed to make the mapped manifold $\{ u_{\mu} \circ \Phi_{\mu} : \, \mu \in \mathcal{P} \}$ more amenable to linear compression methods. In this work, we extend and further analyze the registration approach proposed in [Taddei, SISC, 2020]. The contributions of the present work are twofold. First, we extend the approach to deal with annular domains by introducing a suitable transformation of the coordinate system. Second, we discuss the extension to general two-dimensional geometries: towards this end, we introduce a spectral element approximation, which relies on a partition $\{ \Omega_{q} \}_{q=1}^{N_{\rm dd}}$ of the domain $\Omega$ such that $\Omega_1,\ldots,\Omega_{N_{\rm dd}}$ are isomorphic to the unit square. We further show that our spectral element approximation can cope with parameterized geometries. We present rigorous mathematical analysis to justify our proposal; furthermore, we present numerical results for a heat-transfer problem in an annular domain, a potential flow past a rotating symmetric airfoil, and an inviscid transonic compressible flow past a non-symmetric airfoil, to demonstrate the effectiveness of our method.
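As a hedged illustration of the coordinate-system idea for annular domains (the transformation used in the paper may differ in its details), the annulus $\{ x : r_0 \le |x| \le r_1 \}$ can be identified with the unit square through radial and angular coordinates; a mapping defined on the reference square is then pushed forward to the physical domain.

```python
# Illustrative coordinate change for an annular domain: the annulus is identified
# with the unit square through (radial, angular) coordinates, where a registration
# mapping can be defined and then pushed forward to the physical domain.
# This sketches the idea only; it is not the paper's exact construction.
import numpy as np

r0, r1 = 0.5, 1.0                               # inner and outer radii of the annulus

def to_reference(x, y):
    """Annulus point -> (s, t) in the unit square (radial, angular coordinates)."""
    r = np.hypot(x, y)
    s = (r - r0) / (r1 - r0)
    t = (np.arctan2(y, x) % (2.0 * np.pi)) / (2.0 * np.pi)
    return s, t

def from_reference(s, t):
    """Unit-square point -> annulus."""
    r = r0 + s * (r1 - r0)
    theta = 2.0 * np.pi * t
    return r * np.cos(theta), r * np.sin(theta)

def push_forward(phi_ref, x, y):
    """Apply a reference-square mapping phi_ref as a mapping of the annulus."""
    s, t = to_reference(x, y)
    s_new, t_new = phi_ref(s, t)
    return from_reference(s_new, t_new % 1.0)   # angular coordinate is periodic

# Hypothetical registration map on the reference square: a radius-dependent rotation.
phi_ref = lambda s, t: (s, t + 0.05 * s)
print(push_forward(phi_ref, 0.7, 0.0))
```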
This work presents the windowed space-time least-squares Petrov-Galerkin method (WST-LSPG) for model reduction of nonlinear parameterized dynamical systems. WST-LSPG is a generalization of the space-time least-squares Petrov-Galerkin method (ST-LSPG). The main drawback of ST-LSPG is that it requires solving a dense space-time system with a space-time basis that is calculated over the entire global time domain, which can be infeasible for large-scale applications. Instead of using a temporally-global space-time trial subspace and minimizing the discrete-in-time full-order model (FOM) residual over the entire time domain, the proposed WST-LSPG approach addresses this weakness by (1) dividing the time simulation into time windows, (2) devising a unique low-dimensional space-time trial subspace for each window, and (3) minimizing the discrete-in-time space-time residual of the dynamical system over each window. This formulation yields a problem whose coupling is confined within each window but sequential across the windows. To enable high-fidelity trial subspaces with a relatively small number of basis vectors, this work proposes constructing space-time bases using tensor decompositions for each window. WST-LSPG is equipped with hyper-reduction techniques to further reduce the computational cost. Numerical experiments for the one-dimensional Burgers equation and the two-dimensional compressible Navier-Stokes equations for flow over a NACA 0012 airfoil demonstrate that WST-LSPG is superior to ST-LSPG in terms of accuracy and computational gain.
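A minimal sketch of the windowing idea follows, assuming the snapshots are stored as a space-by-time matrix and using a plain per-window SVD as a stand-in for the tensor decompositions proposed in the paper; the online solve is shown for a linear placeholder residual only.

```python
# Sketch of the windowing step of WST-LSPG: split the time axis into windows and
# build a separate low-dimensional basis per window.  A per-window SVD is used
# here as a simple stand-in for the paper's tensor decompositions.
import numpy as np

rng = np.random.default_rng(1)
N, Nt = 200, 120                                 # spatial dofs, time steps
snapshots = rng.standard_normal((N, Nt))         # placeholder FOM trajectory

n_windows, n_modes = 4, 10
windows = np.array_split(np.arange(Nt), n_windows)

bases = []
for idx in windows:
    Uw, _, _ = np.linalg.svd(snapshots[:, idx], full_matrices=False)
    bases.append(Uw[:, :n_modes])                # modes restricted to this window

# Online (per window): minimize the windowed residual over the local coefficients.
# For a linear residual r(a) = A @ (V @ a) - b this is a small least-squares problem.
def window_solve(A, b, V):
    coeffs, *_ = np.linalg.lstsq(A @ V, b, rcond=None)
    return V @ coeffs

A = rng.standard_normal((N, N)); b = rng.standard_normal(N)   # placeholder operator/rhs
u_w = window_solve(A, b, bases[0])
print("window 0 residual norm:", np.linalg.norm(A @ u_w - b))
```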
We propose a nonlinear registration-based model reduction procedure for rapid and reliable solution of parameterized two-dimensional steady conservation laws. This class of problems is challenging for model reduction techniques due to the presence of nonlinear terms in the equations and of parameter-dependent discontinuities that cannot be adequately represented through linear approximation spaces. Our approach builds on a general (i.e., independent of the underlying equation) registration procedure for the computation of a mapping $\Phi$ that tracks moving features of the solution field, and on a hyper-reduced least-squares Petrov-Galerkin reduced-order model for the rapid and reliable computation of the solution coefficients. The contributions of this work are twofold. First, we investigate the application of registration-based methods to two-dimensional hyperbolic systems. Second, we propose a multi-fidelity approach to reduce the offline costs associated with the construction of the parameterized mapping and the reduced-order model. We discuss the application to an inviscid supersonic flow past a parameterized bump, to illustrate the many features of our method and to demonstrate its effectiveness.
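The online hyper-reduced least-squares Petrov-Galerkin solve can be sketched as below; the residual, its Jacobian, the sampled row indices, and the reduced basis are placeholders and do not correspond to the conservation-law discretization used in the paper.

```python
# Sketch of a hyper-reduced LSPG solve: minimize the norm of a sampled residual
# over the reduced coefficients by Gauss-Newton.  Residual and sampling below are
# placeholders standing in for the paper's conservation-law discretization.
import numpy as np

rng = np.random.default_rng(2)
N, n, m = 300, 6, 60                              # FOM size, ROM size, sampled rows
V = np.linalg.qr(rng.standard_normal((N, n)))[0]  # reduced basis (e.g., POD of mapped snapshots)
P = rng.choice(N, size=m, replace=False)          # hyper-reduction sample indices

A = rng.standard_normal((N, N)) / np.sqrt(N); b = rng.standard_normal(N)
def residual(u):                                  # placeholder nonlinear residual r(u)
    return A @ u + 0.1 * u**3 - b
def jacobian(u):                                  # its Jacobian dr/du
    return A + np.diag(0.3 * u**2)

a = np.zeros(n)                                   # reduced coefficients
for _ in range(20):                               # Gauss-Newton on the sampled residual
    u = V @ a
    r, J = residual(u)[P], (jacobian(u) @ V)[P, :]
    da, *_ = np.linalg.lstsq(J, -r, rcond=None)
    a += da
    if np.linalg.norm(da) < 1e-10:
        break
print("sampled residual norm:", np.linalg.norm(residual(V @ a)[P]))
```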
Estimating parameters of Partial Differential Equations (PDEs) is of interest in a number of applications such as geophysical and medical imaging. Parameter estimation is commonly phrased as a PDE-constrained optimization problem that can be solved iteratively using gradient-based optimization. A computational bottleneck in such approaches is that the underlying PDEs need to be solved numerous times before the model is reconstructed with sufficient accuracy. One way to reduce this computational burden is to use Model Order Reduction (MOR) techniques such as the Multiscale Finite Volume Method (MSFV). In this paper, we apply the MSFV method to high-dimensional parameter estimation problems. Given a finite volume discretization of the PDE on a fine mesh, the MSFV method reduces the problem size by computing a parameter-dependent projection onto a nested coarse mesh. A novelty of our work is the integration of MSFV into a PDE-constrained optimization framework, which updates the reduced space in each iteration. We also present a computationally tractable way of differentiating the MOR solution that accounts for the change of basis. As we demonstrate in our numerical experiments, our method leads to computational savings, particularly for large-scale parameter estimation problems, and can benefit from parallelization.
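The structure of the reduced solve can be sketched as follows, with a piecewise-constant prolongation standing in for the parameter-dependent MSFV basis functions (the actual MSFV basis is computed from local fine-scale problems); the fact that the basis itself depends on the parameter field is what makes differentiating the reduced solution inside the optimization loop nontrivial.

```python
# Sketch of a coarse-scale reduced solve in the spirit of MSFV: a fine-grid
# operator is projected onto a coarse space through a prolongation operator P.
# A piecewise-constant P is used as a stand-in for the parameter-dependent MSFV
# basis functions, which are computed from local fine-scale problems.
import numpy as np

nf, nc = 64, 8                                    # fine and coarse 1D grid sizes
m = np.ones(nf)                                   # parameter field (e.g., conductivity)

def fine_operator(m):
    """Placeholder 1D finite-volume-like diffusion operator with parameter m."""
    A = np.zeros((nf, nf))
    for i in range(nf):
        A[i, i] = 2 * m[i]
        if i > 0:      A[i, i - 1] = -m[i]
        if i < nf - 1: A[i, i + 1] = -m[i]
    return A

# Prolongation: each coarse cell covers nf // nc fine cells (placeholder basis).
P = np.kron(np.eye(nc), np.ones((nf // nc, 1)))

A = fine_operator(m)
q = np.ones(nf)                                   # source term
Ac = P.T @ A @ P                                  # coarse (reduced) operator
uc = np.linalg.solve(Ac, P.T @ q)                 # coarse solve
u = P @ uc                                        # prolongated fine-scale approximation
print("fine-grid residual norm:", np.linalg.norm(A @ u - q))
```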