We prove support recovery for a general class of linear and nonlinear evolutionary partial differential equation (PDE) identification from a single noisy trajectory using the $\ell_1$-regularized Pseudo-Least Squares model ($\ell_1$-PsLS). In any associative $\mathbb{R}$-algebra generated by finitely many differentiation operators that contains the unknown PDE operator, applying $\ell_1$-PsLS to a given data set yields a family of candidate models with coefficients $\mathbf{c}(\lambda)$ parameterized by the regularization weight $\lambda \geq 0$. The trace of $\{\mathbf{c}(\lambda)\}_{\lambda \geq 0}$ suffers from high variance due to data noise and finite-difference approximation errors. We provide a set of sufficient conditions which guarantee that, from a single trajectory denoised by a Local-Polynomial filter, the support of $\mathbf{c}(\lambda)$ asymptotically converges to the true signed support associated with the underlying PDE for sufficiently many data points and a certain range of $\lambda$. We also present various numerical experiments that validate our theory.
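The core computational step of an $\ell_1$-regularized least-squares model of this kind can be illustrated with a minimal sketch: solve $\min_{\mathbf{c}} \tfrac{1}{2}\|A\mathbf{c} - b\|_2^2 + \lambda\|\mathbf{c}\|_1$ for a dictionary matrix $A$ of candidate terms and check whether the recovered support matches a sparse ground truth. The sketch below is not the paper's method; it uses a generic ISTA (proximal gradient) solver on a hypothetical random dictionary standing in for the feature matrix of candidate PDE terms, with a fixed $\lambda$ rather than a full regularization path.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(A, b, lam, n_iters=5000):
    # Solve min_c 0.5*||A c - b||^2 + lam*||c||_1 by proximal gradient (ISTA).
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    c = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ c - b)
        c = soft_threshold(c - grad / L, lam / L)
    return c

# Hypothetical stand-in for a dictionary of candidate terms evaluated at data points.
rng = np.random.default_rng(0)
n, p = 200, 10
A = rng.standard_normal((n, p))
c_true = np.zeros(p)
c_true[[1, 4]] = [2.0, -1.5]          # sparse signed ground truth
b = A @ c_true + 0.01 * rng.standard_normal(n)  # noisy observations

c_hat = lasso_ista(A, b, lam=0.5)
support = np.flatnonzero(np.abs(c_hat) > 1e-3)
print(support)
```

With a small noise level and a well-conditioned random design, the estimated support coincides with the true signed support $\{1, 4\}$; sweeping `lam` over a grid would trace out the path $\{\mathbf{c}(\lambda)\}_{\lambda \geq 0}$ discussed above.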
A recent result of Freeman, Odell, Sari, and Zheng states that whenever a separable Banach space not containing $\ell_1$ has the property that all asymptotic models generated by weakly null sequences are equivalent to the unit vector basis of $c_0$ th
The simulation of long, nonlinear dispersive waves in bounded domains usually requires the use of slip-wall boundary conditions. Boussinesq systems appearing in the literature are generally not well-posed when such boundary conditions are imposed, or
The increasing availability of data presents an opportunity to calibrate unknown parameters which appear in complex models of phenomena in the biomedical, physical and social sciences. However, model complexity often leads to parameter-to-data maps w
Thin surfaces, such as the leaves of a plant, pose a significant challenge for implicit surface reconstruction techniques, which typically assume a closed, orientable surface. We show that by approximately interpolating a point cloud of the surface (
Sparsity-inducing regularization problems are ubiquitous in machine learning applications, ranging from feature selection to model compression. In this paper, we present a novel stochastic method -- Orthant Based Proximal Stochastic Gradient Method (