We introduce a new approximate multiresolution analysis (MRA) using a single Gaussian as the scaling function, which we call Gaussian MRA (GMRA). As an initial application, we employ this new tool to accurately and efficiently compute the probability density function (PDF) of the product of independent random variables. In contrast with Monte Carlo (MC) type methods (the only other universal approach known to address this problem), our method not only achieves accuracies beyond the reach of MC but also produces a PDF expressed as a Gaussian mixture, thus allowing for further efficient computations. We also show that an exact MRA corresponding to our GMRA can be constructed for a matching user-selected accuracy.
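As a point of reference for the Monte Carlo baseline mentioned above, here is a minimal illustrative sketch (not the GMRA method itself) of estimating the PDF of a product of independent random variables by simulation; lognormal factors are chosen because their product has a known closed-form law, so the estimate can be checked:

```python
import numpy as np

# Monte Carlo baseline (illustrative only): estimate the PDF of Z = X*Y for
# independent X, Y ~ LogNormal(0, 1). The product is again lognormal,
# Z ~ LogNormal(0, sqrt(2)), so the estimate can be verified exactly.
rng = np.random.default_rng(0)
n = 200_000
z = rng.lognormal(0.0, 1.0, n) * rng.lognormal(0.0, 1.0, n)

# Histogram estimate of the density on (0, 5].
hist, edges = np.histogram(z, bins=50, range=(0.0, 5.0))
density = hist / (n * (edges[1] - edges[0]))

# Sanity checks against the exact LogNormal(0, sqrt(2)) law:
# the median is exp(0) = 1, so P(Z < 1) = 1/2, and the mean is E[X]E[Y] = e.
p_below_median = np.mean(z < 1.0)
sample_mean = z.mean()
```

Note the slow $O(n^{-1/2})$ sampling error of such an estimate, which is the accuracy limitation the abstract contrasts against.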
We introduce a new functional representation of probability density functions (PDFs) of non-negative random variables via a product of a monomial factor and linear combinations of decaying exponentials with complex exponents. This approximate representation of PDFs is obtained for any finite, user-selected accuracy. Using a fast algorithm involving Hankel matrices, we develop a general numerical method for computing the PDF of the sums, products, or quotients of any number of non-negative random variables yielding the result in the same type of functional representation. We present several examples to demonstrate the accuracy of the approach.
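To make the closure property concrete, here is a small assumed example (not the paper's Hankel-matrix algorithm): the density of the sum of two independent exponential variables is again a linear combination of decaying exponentials, obtained by partial fractions, and a quick simulation confirms the formula:

```python
import numpy as np

# For independent X ~ Exp(l1) and Y ~ Exp(l2) with l1 != l2, convolution gives
#   f_{X+Y}(x) = l1*l2/(l2 - l1) * (exp(-l1*x) - exp(-l2*x)),
# i.e. the sum stays within the class of exponential-sum densities.
l1, l2 = 1.0, 3.0

def pdf_sum(x):
    return l1 * l2 / (l2 - l1) * (np.exp(-l1 * x) - np.exp(-l2 * x))

def cdf_sum(x):
    # Antiderivative of pdf_sum: again a constant plus an exponential sum.
    return 1.0 - (l2 * np.exp(-l1 * x) - l1 * np.exp(-l2 * x)) / (l2 - l1)

# Monte Carlo check of the closed form (numpy's exponential takes scale = 1/rate).
rng = np.random.default_rng(1)
z = rng.exponential(1 / l1, 100_000) + rng.exponential(1 / l2, 100_000)
emp = np.mean(z < 1.0)   # empirical CDF at x = 1
exact = cdf_sum(1.0)
```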
We study fractional smoothness of measures on $\mathbb{R}^k$ that are images of a Gaussian measure under mappings from Gaussian Sobolev classes. As a consequence, we obtain Nikolskii--Besov fractional regularity of these distributions under a weak nondegeneracy assumption.
We study the regularity of densities of distributions that are polynomial images of the standard Gaussian measure on $\mathbb{R}^n$. We assume that the degree of the polynomial is fixed and that each variable enters with a power bounded by another fixed number.
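As a concrete one-dimensional instance of a polynomial image of a Gaussian measure (an illustration, not taken from the abstract above): the image of the standard Gaussian under $P(x) = x^2$ is the chi-squared law with one degree of freedom, whose density is smooth away from the origin but singular at $t = 0$:

```python
import numpy as np

# Image of the standard Gaussian under the polynomial P(x) = x^2:
# the chi-squared distribution with one degree of freedom.
rng = np.random.default_rng(2)
t = rng.standard_normal(100_000) ** 2

def chi2_1_pdf(t):
    # Density exp(-t/2) / sqrt(2*pi*t): smooth for t > 0, blows up as t -> 0.
    return np.exp(-t / 2.0) / np.sqrt(2.0 * np.pi * t)

# Check: P(X^2 < 1) = P(-1 < X < 1) = 2*Phi(1) - 1 ~ 0.6827.
p = np.mean(t < 1.0)
```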
Hierarchical and k-medoids clustering are deterministic clustering algorithms based on pairwise distances. Using these same pairwise distances, we propose a novel stochastic clustering method based on random partition distributions. We call our method CaviarPD, for cluster analysis via random partition distributions. CaviarPD first samples clusterings from a random partition distribution and then finds the best cluster estimate from these samples using algorithms that minimize an expected loss. We compare CaviarPD with hierarchical and k-medoids clustering through eight case studies. Cluster estimates based on our method are competitive with those of hierarchical and k-medoids clustering, and, unlike hierarchical clustering, our method does not require the subjective choice of a linkage method. Furthermore, our distribution-based procedure provides an intuitive graphical representation for assessing clustering uncertainty.
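The sample-then-minimize step can be sketched in a few lines. The following is a schematic illustration under assumed inputs (it is not the CaviarPD implementation): sampled partitions are summarized by pairwise co-clustering probabilities, and the sampled partition minimizing the Binder loss is returned as the point estimate:

```python
import numpy as np

def coclustering(samples):
    """Pairwise co-clustering probabilities from an (S, n) array of labels."""
    S, n = samples.shape
    P = np.zeros((n, n))
    for labels in samples:
        P += labels[:, None] == labels[None, :]
    return P / S

def binder_loss(labels, P):
    """Binder loss of one partition against co-clustering probabilities P."""
    same = (labels[:, None] == labels[None, :]).astype(float)
    return np.abs(same - P).sum()

def best_partition(samples):
    """Sampled partition minimizing the Binder loss over the samples."""
    P = coclustering(samples)
    losses = [binder_loss(s, P) for s in samples]
    return samples[int(np.argmin(losses))]
```

For three sampled partitions of four items in which two samples agree, the majority partition wins: `best_partition(np.array([[0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]]))` returns `[0, 0, 1, 1]`.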
We give a convergence proof for the approximation by sparse collocation of Hilbert-space-valued functions depending on countably many Gaussian random variables. Such functions appear as solutions of elliptic PDEs with lognormal diffusion coefficients. We outline a general $L^2$-convergence theory based on previous work by Bachmayr et al. (2016) and Chen (2016) and establish an algebraic convergence rate for sufficiently smooth functions assuming a mild growth bound for the univariate hierarchical surpluses of the interpolation scheme applied to Hermite polynomials. We verify specifically for Gauss-Hermite nodes that this assumption holds and also show algebraic convergence w.r.t. the resulting number of sparse grid points for this case. Numerical experiments illustrate the dimension-independent convergence rate.
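For the Gauss-Hermite nodes mentioned above, NumPy provides the probabilists' variant directly; a small sketch (assuming the standard normal weight $e^{-x^2/2}$) shows that an $m$-point rule integrates polynomials of degree up to $2m-1$ exactly:

```python
import numpy as np

# Probabilists' Gauss-Hermite rule: nodes/weights for the weight exp(-x^2/2).
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0, 1) density

# A 5-point rule is exact for polynomials up to degree 9, so the fourth
# Gaussian moment E[X^4] = 3 is reproduced to machine precision.
m4 = float(np.sum(weights * nodes ** 4))
```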