In this paper, we present a systematic stability analysis of the quadrature-based moment method (QBMM) for the one-dimensional Boltzmann equation with BGK or Shakhov models. As reported in the recent literature, the method has shown its potential for modeling non-equilibrium flows, but a thorough theoretical analysis is still largely missing. We show that the method can yield non-hyperbolic moment systems if the distribution function is approximated by a linear combination of $\delta$-functions. On the other hand, if the $\delta$-functions are replaced by their Gaussian approximations with a common variance, we prove that the moment systems are strictly hyperbolic and preserve the dissipation property (or $H$-theorem) of the kinetic equation. In the proof we also determine the equilibrium manifold, which lies on the boundary of the state space. The proofs are quite technical and involve detailed analyses of the characteristic polynomials of the coefficient matrices.
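To make the distinction between the two closures concrete, the ansätze can be sketched as follows (the notation $w_i$, $v_i$, $\sigma$ is introduced here only for illustration). The $\delta$-function closure approximates the distribution function as
$$ f(t,x,v) \approx \sum_{i=1}^{N} w_i(t,x)\, \delta\bigl(v - v_i(t,x)\bigr), $$
whereas the Gaussian variant replaces each $\delta$-function by a Gaussian with a common variance $\sigma^2(t,x)$,
$$ f(t,x,v) \approx \sum_{i=1}^{N} \frac{w_i(t,x)}{\sqrt{2\pi\sigma^2(t,x)}}\, \exp\!\left(-\frac{(v - v_i(t,x))^2}{2\sigma^2(t,x)}\right). $$
In the first case the resulting moment system can lose hyperbolicity; in the second it is strictly hyperbolic, as stated above.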
We present a data-driven approach to constructing entropy-based closures for moment systems arising from kinetic equations. The proposed closure learns the entropy function by fitting the map between the moments and the entropy of the moment system, and thus does not depend on the space-time discretization of the moment system or on specific problem configurations such as initial and boundary conditions. With convex and $C^2$ approximations, this data-driven closure inherits several structural properties of entropy-based closures, such as entropy dissipation, hyperbolicity, and the $H$-theorem. We construct convex approximations to the Maxwell-Boltzmann entropy using convex splines and neural networks, test them on the plane-source benchmark problem for linear transport in slab geometry, and compare the results to the standard, optimization-based $M_N$ closures. Numerical results indicate that these data-driven closures provide accurate solutions in much less computation time than the $M_N$ closures.
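As a rough sketch of the structure behind such closures (in generic notation, not tied to this paper's specific setup): the entropy-based closure reconstructs the distribution from the moments by a constrained entropy minimization, and the quantity to be learned is the resulting entropy as a function of the moments,
$$ f_u = \operatorname*{arg\,min}_{g \ge 0} \Bigl\{ \int \eta(g)\, dv \;:\; \int m(v)\, g(v)\, dv = u \Bigr\}, \qquad h(u) = \int \eta(f_u)\, dv, $$
so that a convex, $C^2$ surrogate $\hat h \approx h$ fitted offline can replace the repeated solution of this optimization problem during the simulation.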
Using deep neural networks to solve PDEs has attracted a lot of attention recently. However, the theoretical understanding of why the deep learning method works falls far behind its empirical success. In this paper, we provide a rigorous numerical analysis of the deep Ritz method (DRM) \cite{Weinan2017The} for second-order elliptic equations with Dirichlet, Neumann, and Robin boundary conditions, respectively. We establish the first nonasymptotic convergence rate in the $H^1$ norm for the DRM using deep networks with smooth activation functions, including the logistic and hyperbolic tangent functions. Our results show how to set the hyper-parameters of depth and width to achieve the desired convergence rate in terms of the number of training samples.
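For orientation, the objects involved can be sketched in the simplest model case (a generic Poisson problem, not the paper's full setting): for $-\Delta u = f$ on $\Omega$ with $u = 0$ on $\partial\Omega$, the deep Ritz method minimizes the Ritz energy over a neural network class $\mathcal{F}$, with the integrals approximated by Monte Carlo sampling,
$$ u_\theta \in \operatorname*{arg\,min}_{u \in \mathcal{F}} \int_\Omega \Bigl( \tfrac12 |\nabla u|^2 - f u \Bigr)\, dx \;+\; \lambda \int_{\partial\Omega} u^2 \, ds, $$
where the boundary penalty with weight $\lambda$ is one common way to impose the Dirichlet condition on network functions.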
Often in applications ranging from medical imaging and sensor networks to error correction and data science (and beyond), one needs to solve large-scale linear systems in which a fraction of the measurements have been corrupted. We consider solving such large-scale systems of linear equations $\mathbf{A}\mathbf{x}=\mathbf{b}$ that are inconsistent due to corruptions in the measurement vector $\mathbf{b}$. We develop several variants of iterative methods that converge to the solution of the uncorrupted system of equations, even in the presence of large corruptions. These methods use a quantile of the absolute values of the residual vector to determine the iterate update. We present both theoretical and empirical results that demonstrate the promise of these iterative approaches.
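As one concrete illustration of the idea, here is a minimal sketch of a quantile-based randomized Kaczmarz variant; it is not necessarily identical to the paper's methods, and all names and parameter choices are ours.

import numpy as np

def quantile_kaczmarz(A, b, q=0.7, n_iters=5000, seed=0):
    """Randomized Kaczmarz that skips updates on rows whose residual
    exceeds the q-quantile of the absolute residuals (suspected corruptions)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        r = np.abs(A @ x - b)            # absolute residuals of all rows
        threshold = np.quantile(r, q)    # q-quantile of |residuals|
        i = rng.integers(m)              # sample a row uniformly at random
        if r[i] <= threshold:            # only trust rows with "typical" residuals
            a_i = A[i]
            x = x - (a_i @ x - b[i]) / (a_i @ a_i) * a_i
    return x

In practice one would subsample the residuals rather than recomputing all of them at every step; the full recomputation above is only for clarity.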
In this paper we analyze the stability of equilibrium manifolds of hyperbolic shallow water moment equations. Shallow water moment equations describe shallow flows with complex velocity profiles that vary in the vertical direction; the models can be seen as extensions of the standard shallow water equations. Equilibrium stability is an important property of balance laws that determines the linear stability of solutions in the vicinity of equilibrium manifolds, and it is seen as a necessary condition for stable numerical solutions. After an analysis of the hyperbolic structure of the models, we identify three different equilibrium manifolds based on three different limits of the right-hand-side friction term, which physically correspond to water-at-rest, constant-velocity, and bottom-at-rest velocity profiles. The stability analysis then shows that the structural stability conditions are fulfilled for the water-at-rest equilibrium and the constant-velocity equilibrium. However, the bottom-at-rest equilibrium can lead to unstable modes, depending on the velocity profile. Relaxation towards the respective equilibrium manifolds is investigated numerically for different models.
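For reference, the standard shallow water equations that these moment models extend read, in one spatial dimension and with a generic friction term $S$ on the right-hand side (our notation),
$$ \partial_t h + \partial_x (h u_m) = 0, \qquad \partial_t (h u_m) + \partial_x \Bigl( h u_m^2 + \tfrac{g}{2} h^2 \Bigr) = -S, $$
where $h$ is the water height, $u_m$ the depth-averaged velocity, and $g$ the gravitational constant; the moment equations add evolution equations for coefficients describing the vertical variation of the velocity profile.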
This paper presents a convergence analysis of kernel-based quadrature rules in misspecified settings, focusing on deterministic quadrature in Sobolev spaces. In particular, we deal with misspecified settings in which a test integrand is less smooth than the Sobolev RKHS on which a quadrature rule is based. We provide convergence guarantees based on two different assumptions on a quadrature rule: one on the quadrature weights, and the other on the design points. More precisely, we show that convergence rates can be derived (i) if the sum of the absolute weights remains constant (or does not increase quickly), or (ii) if the minimum distance between design points does not decrease very quickly. As a consequence of the latter result, we derive a rate of convergence for Bayesian quadrature in misspecified settings. We reveal a condition on the design points that makes Bayesian quadrature robust to misspecification and show that, under this condition, it may adaptively achieve the optimal rate of convergence in the Sobolev space of a lesser order (i.e., of the unknown smoothness of the test integrand), under a slightly stronger regularity condition on the integrand.
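Schematically (in generic notation, introduced only for illustration), a kernel quadrature rule and the misspecified setting considered here take the form
$$ Q(f) = \sum_{i=1}^{n} w_i f(x_i) \approx \int f \, d\mu, \qquad f \in W^s, \quad s < r, $$
where the weights $w_i$ and design points $x_i$ are chosen using a kernel whose RKHS is (norm-equivalent to) the Sobolev space $W^r$, while the test integrand only lies in the rougher space $W^s$.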