
Wasserstein Statistics in One-dimensional Location-Scale Model

Added by Takeru Matsuda
Publication date: 2020
Research language: English





Wasserstein geometry and information geometry are two important structures that can be introduced on a manifold of probability distributions. Wasserstein geometry is defined through the transportation cost between two distributions, so it reflects the metric of the base manifold on which the distributions are defined. Information geometry is defined to be invariant under reversible transformations of the base space. Both have their own merits for applications. In particular, statistical inference is based on information geometry, where the Fisher metric plays a fundamental role, whereas Wasserstein geometry is useful in computer vision and AI applications. In this study, we analyze statistical inference based on Wasserstein geometry in the case where the base space is one-dimensional. Using the location-scale model, we further derive the W-estimator, which explicitly minimizes the transportation cost from the empirical distribution to the statistical model, and study its asymptotic behavior. We show that the W-estimator is consistent and give its asymptotic distribution explicitly by means of the functional delta method. The W-estimator is Fisher efficient in the Gaussian case.
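As a rough illustration of the estimator described above (a sketch, not code from the paper): in one dimension the squared 2-Wasserstein distance between the empirical distribution and a location-scale family reduces to an integral of squared quantile differences, so minimizing it over (mu, sigma) amounts to a least-squares fit of the sorted sample on the standard quantiles of the base distribution. The function name and the midpoint discretization of the integral are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def w_estimator_location_scale(x, base_ppf=stats.norm.ppf):
    """Approximate W-estimator for a 1D location-scale model.

    Minimizes a discretized squared 2-Wasserstein distance between the
    empirical distribution of `x` and the family mu + sigma * F0, where
    F0 has quantile function `base_ppf`.  In one dimension this reduces
    to least squares of the sorted sample on the standard quantiles.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    u = (np.arange(1, n + 1) - 0.5) / n   # midpoint grid on (0, 1)
    q0 = base_ppf(u)                      # standard quantiles of the base family
    # Ordinary least squares: x_(i) ~ mu + sigma * q0_i
    sigma_hat = np.cov(x, q0, bias=True)[0, 1] / np.var(q0)
    mu_hat = x.mean() - sigma_hat * q0.mean()
    return mu_hat, sigma_hat

# Example: Gaussian data, the case in which the abstract notes Fisher efficiency
rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=1.5, size=1000)
print(w_estimator_location_scale(sample))
```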



Related research

Jérôme Dedecker, 2014
This paper deals with the estimation of a probability measure on the real line from data observed with an additive noise. We are interested in rates of convergence for the Wasserstein metric of order $p \geq 1$. The distribution of the errors is assumed to be known and to belong to a class of supersmooth or ordinary smooth distributions. We obtain in the univariate situation an improved upper bound in the ordinary smooth case and less restrictive conditions for the existing bound in the supersmooth one. In the ordinary smooth case, a lower bound is also provided, and numerical experiments illustrating the rates of convergence are presented.
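For readers unfamiliar with the metric used in this abstract, here is a minimal sketch (not from the paper) of the p-Wasserstein distance between two equal-size samples on the real line: in one dimension $W_p$ is the $L^p$ distance between quantile functions, so it only requires sorting. The helper name and the equal-sample-size restriction are simplifying assumptions.

```python
import numpy as np

def wasserstein_p_1d(x, y, p=1):
    """Empirical p-Wasserstein distance between two equal-size samples on the real line.

    For one-dimensional measures, W_p is the L^p distance between quantile
    functions, so for equal-size samples it reduces to sorted values.
    """
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    if x.size != y.size:
        raise ValueError("this simple helper assumes equal sample sizes")
    return (np.mean(np.abs(x - y) ** p)) ** (1.0 / p)

# Example: clean draws vs. the same draws with additive noise, as in the deconvolution setting
rng = np.random.default_rng(1)
clean = rng.normal(size=500)
noisy = clean + rng.normal(scale=0.3, size=500)
print(wasserstein_p_1d(clean, noisy, p=2))
```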
Let $\{P_{\theta} : \theta \in \mathbb{R}^d\}$ be a log-concave location family with $P_{\theta}(dx) = e^{-V(x-\theta)}\,dx$, where $V : \mathbb{R}^d \mapsto \mathbb{R}$ is a known convex function, and let $X_1, \dots, X_n$ be i.i.d. r.v. sampled from distribution $P_{\theta}$ with an unknown location parameter $\theta$. The goal is to estimate the value $f(\theta)$ of a smooth functional $f : \mathbb{R}^d \mapsto \mathbb{R}$ based on observations $X_1, \dots, X_n$. In the case when $V$ is sufficiently smooth and $f$ is a functional from a ball in a Hölder space $C^s$, we develop estimators of $f(\theta)$ with minimax optimal error rates measured by the $L_2(\mathbb{P}_{\theta})$-distance as well as by more general Orlicz norm distances. Moreover, we show that if $d \leq n^{\alpha}$ and $s > \frac{1}{1-\alpha}$, then the resulting estimators are asymptotically efficient in the Hajek-LeCam sense with the convergence rate $\sqrt{n}$. This generalizes earlier results on estimation of smooth functionals in Gaussian shift models. The estimators have the form $f_k(\hat{\theta})$, where $\hat{\theta}$ is the maximum likelihood estimator and $f_k : \mathbb{R}^d \mapsto \mathbb{R}$ (with $k$ depending on $s$) are functionals defined in terms of $f$ and designed to provide a higher order bias reduction in the functional estimation problem. The method of bias reduction is based on iterative parametric bootstrap, and it has been successfully used before in the case of Gaussian models.
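The bias-reduction idea mentioned at the end of this abstract can be illustrated with a generic one-step parametric bootstrap correction (a simplification of the iterated construction $f_k$, not the paper's exact estimator). The sketch below uses the simplest log-concave case, a Gaussian shift model, and a hypothetical functional $f(\theta) = \|\theta\|^2$, whose plug-in estimate has bias $d/n$.

```python
import numpy as np

def mle_location(sample):
    """MLE of the location parameter in a Gaussian shift model (the sample mean)."""
    return sample.mean(axis=0)

def bootstrap_bias_corrected(f, theta_hat, n, n_boot=2000, rng=None):
    """One-step parametric bootstrap bias correction: 2*f(theta_hat) - E[f(theta*)].

    theta* is the MLE recomputed on a sample of size n drawn from the fitted
    model N(theta_hat, I); iterating this construction yields higher-order
    corrections of the kind the abstract refers to.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = theta_hat.size
    boot_values = []
    for _ in range(n_boot):
        boot_sample = rng.normal(loc=theta_hat, scale=1.0, size=(n, d))
        boot_values.append(f(mle_location(boot_sample)))
    return 2.0 * f(theta_hat) - np.mean(boot_values)

# Example: the plug-in estimate of f(theta) = ||theta||^2 has bias d/n, which the correction removes
rng = np.random.default_rng(2)
theta, n, d = np.ones(5), 50, 5
data = rng.normal(loc=theta, scale=1.0, size=(n, d))
theta_hat = mle_location(data)
f = lambda t: float(np.sum(t ** 2))
print(f(theta_hat), bootstrap_bias_corrected(f, theta_hat, n, rng=rng))
```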
In this paper, we use the class of Wasserstein metrics to study asymptotic properties of posterior distributions. Our first goal is to provide sufficient conditions for posterior consistency. In addition to the well-known Schwartz's Kullback-Leibler condition on the prior, the true distribution and most probability measures in the support of the prior are required to possess moments up to an order which is determined by the order of the Wasserstein metric. We further investigate convergence rates of the posterior distributions, for which we need stronger moment conditions. The required tail conditions are sharp in the sense that the posterior distribution may be inconsistent or contract slowly to the true distribution without these conditions. Our study involves techniques that build on recent advances on Wasserstein convergence of empirical measures. We apply the results to density estimation with a Dirichlet process mixture prior and conduct a simulation study for further illustration.
Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. In this paper, we focus both on computer codes whose output is a cumulative distribution function and on stochastic computer codes. We propose a way to perform a global sensitivity analysis for these kinds of computer codes. In the first setting, we define two indices: the first one is based on Wasserstein Fréchet means, while the second one is based on the Hoeffding decomposition of the indicators of Wasserstein balls. Further, when dealing with stochastic computer codes, we define an ideal version of the stochastic computer code that fits into the frame of the first setting. Finally, we deduce a procedure to realize a second-level global sensitivity analysis, namely when one is interested in the sensitivity related to the input distributions rather than in the sensitivity related to the inputs themselves. Several numerical studies are proposed as illustrations in the different settings.
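A minimal sketch of the Wasserstein Fréchet mean that underlies the first index mentioned above (assuming one-dimensional outputs with equal weights; this is not the paper's sensitivity index itself): on the real line the 2-Wasserstein barycenter of several distributions is obtained by averaging their quantile functions pointwise.

```python
import numpy as np

def frechet_mean_quantiles(samples, grid_size=200):
    """2-Wasserstein Frechet mean (barycenter) of one-dimensional distributions.

    On the real line the barycenter's quantile function is the pointwise
    average of the input quantile functions, approximated here by averaging
    empirical quantiles on a common grid.
    """
    u = (np.arange(1, grid_size + 1) - 0.5) / grid_size
    quantiles = np.stack([np.quantile(np.asarray(s, float), u) for s in samples])
    return u, quantiles.mean(axis=0)   # grid and barycenter quantile function

# Example: barycenter of several output distributions of a stochastic code
rng = np.random.default_rng(3)
outputs = [rng.normal(loc=m, scale=1.0, size=1000) for m in (0.0, 1.0, 2.0)]
u, bary_q = frechet_mean_quantiles(outputs)
print(bary_q[[49, 99, 149]])   # barycenter quantiles near u = 0.25, 0.5, 0.75, centered around 1
```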
We investigate predictive density estimation under the $L^2$ Wasserstein loss for location families and location-scale families. We show that plug-in densities form a complete class and that the Bayesian predictive density is given by the plug-in density with the posterior mean of the location and scale parameters. We provide Bayesian predictive densities that dominate the best equivariant one in normal models.
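A minimal sketch of the plug-in construction described here, assuming the simplest case of a normal location model with known scale and a conjugate normal prior; the function name and prior parameters are illustrative, not from the paper. The predictive density is the model density evaluated at the posterior mean of the location.

```python
import numpy as np
from scipy import stats

def plugin_predictive_normal(x, prior_mean=0.0, prior_var=100.0, sigma=1.0):
    """Plug-in predictive density for a normal location model with known scale.

    With a N(prior_mean, prior_var) prior on the location, the posterior mean
    is the usual conjugate shrinkage estimate, and the predictive density is
    the model density plugged in at that value.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    post_var = 1.0 / (1.0 / prior_var + n / sigma ** 2)
    post_mean = post_var * (prior_mean / prior_var + x.sum() / sigma ** 2)
    return lambda t: stats.norm.pdf(t, loc=post_mean, scale=sigma)

# Example: evaluate the plug-in predictive density at two points
rng = np.random.default_rng(4)
obs = rng.normal(loc=3.0, scale=1.0, size=30)
pred = plugin_predictive_normal(obs)
print(pred(3.0), pred(5.0))
```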