
Local continuity of log-concave projection, with applications to estimation under model misspecification

Published by Rina Barber
Publication date: 2020
Research field: Mathematical Statistics
Paper language: English





The log-concave projection is an operator that maps a d-dimensional distribution P to an approximating log-concave density. Prior work by Dümbgen et al. (2011) establishes that, with suitable metrics on the underlying spaces, this projection is continuous, but not uniformly continuous. In this work we prove a local uniform continuity result for log-concave projection -- in particular, establishing that this map is locally Hölder-(1/4) continuous. A matching lower bound verifies that this exponent cannot be improved. We also examine the implications of this continuity result for the empirical setting -- given a sample drawn from a distribution P, we bound the squared Hellinger distance between the log-concave projection of the empirical distribution of the sample, and the log-concave projection of P. In particular, this yields interesting statistical results for the misspecified setting, where P is not itself log-concave.
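The squared Hellinger distance used in this bound is straightforward to compute numerically. The sketch below is not from the paper; the two Gaussian densities and the grid are illustrative assumptions. It approximates $H^2(p,q) = \frac{1}{2}\int(\sqrt{p}-\sqrt{q})^2\,dx$ by a Riemann sum and checks the result against the known closed form for two Gaussians.

```python
import numpy as np

def squared_hellinger(p, q, dx):
    """Squared Hellinger distance H^2 = (1/2) * integral of (sqrt(p)-sqrt(q))^2,
    approximated by a Riemann sum on a uniform grid with spacing dx."""
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2) * dx

# Illustrative example: two Gaussian densities on a fine grid.
x = np.linspace(-15.0, 15.0, 30001)
dx = x[1] - x[0]
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
p = np.exp(-(x - mu1) ** 2 / (2 * s1 ** 2)) / (s1 * np.sqrt(2 * np.pi))
q = np.exp(-(x - mu2) ** 2 / (2 * s2 ** 2)) / (s2 * np.sqrt(2 * np.pi))

h2 = squared_hellinger(p, q, dx)

# Closed form for two Gaussians:
# H^2 = 1 - sqrt(2*s1*s2/(s1^2+s2^2)) * exp(-(mu1-mu2)^2 / (4*(s1^2+s2^2)))
closed = 1 - np.sqrt(2 * s1 * s2 / (s1 ** 2 + s2 ** 2)) * \
    np.exp(-(mu1 - mu2) ** 2 / (4 * (s1 ** 2 + s2 ** 2)))
```

Since $0 \le H^2 \le 1$, this metric is well suited to stating uniform bounds of the kind given in the abstract.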




Read also

Let $\{P_\theta : \theta \in \mathbb{R}^d\}$ be a log-concave location family with $P_\theta(dx) = e^{-V(x-\theta)}\,dx$, where $V : \mathbb{R}^d \mapsto \mathbb{R}$ is a known convex function, and let $X_1, \dots, X_n$ be i.i.d. r.v. sampled from distribution $P_\theta$ with an unknown location parameter $\theta$. The goal is to estimate the value $f(\theta)$ of a smooth functional $f : \mathbb{R}^d \mapsto \mathbb{R}$ based on observations $X_1, \dots, X_n$. In the case when $V$ is sufficiently smooth and $f$ is a functional from a ball in a Hölder space $C^s$, we develop estimators of $f(\theta)$ with minimax optimal error rates measured by the $L_2(\mathbb{P}_\theta)$-distance as well as by more general Orlicz norm distances. Moreover, we show that if $d \leq n^{\alpha}$ and $s > \frac{1}{1-\alpha}$, then the resulting estimators are asymptotically efficient in the Hájek-Le Cam sense with the convergence rate $\sqrt{n}$. This generalizes earlier results on estimation of smooth functionals in Gaussian shift models. The estimators have the form $f_k(\hat\theta)$, where $\hat\theta$ is the maximum likelihood estimator and $f_k : \mathbb{R}^d \mapsto \mathbb{R}$ (with $k$ depending on $s$) are functionals defined in terms of $f$ and designed to provide a higher order bias reduction in the functional estimation problem. The method of bias reduction is based on iterative parametric bootstrap, and it has been used successfully before in the case of Gaussian models.
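The parametric-bootstrap bias-reduction idea can be illustrated in a toy one-dimensional case. This is a hedged sketch, not the paper's construction: we estimate $f(\theta) = \theta^2$ from the sample mean of Gaussian data, and apply a single bootstrap step to remove the leading $O(1/n)$ bias term (all parameter values below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n = 1.0, 1.0, 50
reps, B = 10000, 200  # Monte Carlo repetitions and bootstrap draws

naive, corrected = [], []
for _ in range(reps):
    xbar = rng.normal(theta, sigma / np.sqrt(n))  # MLE of the location theta
    # Parametric bootstrap: resample the MLE from the fitted model
    # N(xbar, sigma^2 / n).  (We draw bootstrap means directly rather
    # than full bootstrap samples, purely for speed.)
    xbar_boot = rng.normal(xbar, sigma / np.sqrt(n), size=B)
    bias_hat = np.mean(xbar_boot ** 2) - xbar ** 2  # bootstrap bias estimate
    naive.append(xbar ** 2)
    corrected.append(xbar ** 2 - bias_hat)          # one-step bias reduction

naive_bias = np.mean(naive) - theta ** 2      # close to sigma^2 / n = 0.02
corrected_bias = np.mean(corrected) - theta ** 2
```

For this quadratic functional, one bootstrap step already removes the bias exactly in expectation; the iterated construction $f_k$ in the abstract extends the same idea to higher-order bias terms of smoother functionals.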
We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback-Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by some hyperplane. Furthermore, we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response $Y = \mu(X) + \epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions, while $\epsilon$ is a random error with log-concave density and mean zero.
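The Mallows distance $D_1$ is the 1-Wasserstein distance, which SciPy can evaluate between one-dimensional samples. A minimal sanity check (the sample sizes and parameters are illustrative assumptions, not from the paper): for two Gaussians differing only by a location shift, $D_1$ equals the size of the shift.

```python
import numpy as np
from scipy.stats import wasserstein_distance  # 1-Wasserstein = Mallows D_1 in 1D

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 20000)  # sample from N(0, 1)
y = rng.normal(0.5, 1.0, 20000)  # sample from N(0.5, 1)

# Empirical D_1 between the two samples; the population value is the shift 0.5.
d1 = wasserstein_distance(x, y)
```

Continuity in $D_1$ is a natural choice here because $D_1$ metrizes weak convergence together with convergence of first moments, exactly the condition required for the projection to exist.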
We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, that is, a density of the form $f_0 = \exp\varphi_0$ where $\varphi_0$ is a concave function on $\mathbb{R}$. The pointwise limiting distributions depend on the second and third derivatives at 0 of $H_k$, the lower invelope of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of $\varphi_0 = \log f_0$ at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode $M(f_0)$, and we establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both the rate of convergence and the dependence of constants on population values.
We introduce and study a local linear nonparametric regression estimator for the censorship model. The main goal of this paper is to establish a uniform almost sure consistency result, with rate, over a compact set for the new estimate. To support our theoretical result, a simulation study has been carried out to compare the new estimator with the classical regression estimator.
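For intuition, the uncensored version of a local linear smoother takes only a few lines; the censored setting would additionally require survival weights (e.g., of Kaplan-Meier type), which this hedged sketch omits. A useful sanity check: a local linear fit reproduces an exactly linear regression function, since the weighted least squares problem then has a zero-residual solution.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear regression at x0: weighted least squares fit of an
    intercept and slope, with Gaussian kernel weights of bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    A = X.T @ (w[:, None] * X)
    b = X.T @ (w * y)
    return np.linalg.solve(A, b)[0]  # intercept = fitted value at x0

# Noiseless linear truth: the estimator should recover it exactly.
x = np.linspace(0.0, 1.0, 200)
y = 2.0 * x + 1.0
est = local_linear(0.5, x, y, h=0.1)  # exactly 2*0.5 + 1 = 2
```

This exact reproduction of linear functions is the standard argument for the local linear estimator's smaller boundary bias compared with the Nadaraya-Watson (local constant) estimator.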
In this work we study the estimation of the density of a totally positive random vector. Total positivity of the distribution of a random vector implies a strong form of positive dependence between its coordinates and, in particular, it implies positive association. Since estimating a totally positive density is a non-parametric problem, we take on a (modified) kernel density estimation approach. Our main result is that the sum of scaled standard Gaussian bumps centered at a min-max closed set provably yields a totally positive distribution. Hence, our strategy for producing a totally positive estimator is to form the min-max closure of the set of samples, and output a sum of Gaussian bumps centered at the points in this set. We can frame this sum as a convolution between the uniform distribution on a min-max closed set and a scaled standard Gaussian. We further conjecture that convolving any totally positive density with a standard Gaussian remains totally positive.
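The min-max closure step described above is easy to compute for a finite sample: repeatedly add coordinatewise minima and maxima of pairs until no new points appear. The sketch below is our own illustration, not the authors' code; the estimator would then place a scaled standard Gaussian bump at each point of the resulting closure.

```python
import numpy as np

def minmax_closure(points):
    """Smallest superset of `points` closed under coordinatewise min and max."""
    S = {tuple(p) for p in points}
    changed = True
    while changed:
        changed = False
        for a in list(S):
            for b in list(S):
                lo = tuple(np.minimum(a, b))  # coordinatewise minimum
                hi = tuple(np.maximum(a, b))  # coordinatewise maximum
                for c in (lo, hi):
                    if c not in S:
                        S.add(c)
                        changed = True
    return sorted(tuple(int(v) for v in t) for t in S)

# (0,1) and (1,0) generate (0,0) and (1,1) under min and max.
closure = minmax_closure([(0, 1), (1, 0)])
```

The naive pairwise loop is quadratic per pass; it suffices for illustration, though faster lattice-closure algorithms exist for larger samples.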