This paper deals with the estimation of a probability measure on the real line from data observed with additive noise. We are interested in rates of convergence for the Wasserstein metric of order $p \geq 1$. The distribution of the errors is assumed to be known and to belong to a class of supersmooth or ordinary smooth distributions. In the univariate situation, we obtain an improved upper bound in the ordinary smooth case and less restrictive conditions for the existing bound in the supersmooth one. In the ordinary smooth case, a lower bound is also provided, and numerical experiments illustrating the rates of convergence are presented.
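As a point of reference for the metric and the noise model, the sketch below computes the empirical Wasserstein-$p$ distance between two equal-size samples on the real line, where the optimal coupling simply matches order statistics. It then contrasts a latent sample with its contaminated version under Laplace (ordinary smooth) errors. This is only an illustration of the quantities involved, not the paper's deconvolution estimator; the sample size, noise scale, and function names are assumptions.

```python
import numpy as np

def wasserstein_p(x, y, p=1):
    """Empirical Wasserstein-p distance between two equal-size samples on R.

    On the real line the optimal transport plan pairs order statistics,
    so W_p^p reduces to the mean p-th power gap between sorted samples.
    """
    xs, ys = np.sort(x), np.sort(y)
    return np.mean(np.abs(xs - ys) ** p) ** (1.0 / p)

rng = np.random.default_rng(0)
n = 10_000
x = rng.normal(0.0, 1.0, size=n)          # latent sample from the target measure
noise = rng.laplace(0.0, 0.3, size=n)     # ordinary smooth error distribution
y = x + noise                             # contaminated observations

# The empirical measure of the noisy data stays at a positive Wasserstein
# distance from the target; deconvolution estimators aim to reduce this gap.
print(wasserstein_p(x, y, p=1))
```

Note that this identity is specific to the univariate case studied in the paper; in higher dimensions the distance requires solving a genuine transport problem.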