
Bi-$s^*$-Concave Distributions

Posted by Jon A. Wellner
Publication date: 2020
Research field: Mathematical Statistics
Paper language: English





We introduce new shape-constrained classes of distribution functions on $\mathbb{R}$, the bi-$s^*$-concave classes. In parallel to results of Dümbgen, Kolesnyk, and Wilke (2017) for what they called the class of bi-log-concave distribution functions, we show that every $s$-concave density $f$ has a bi-$s^*$-concave distribution function $F$ for $s^* \leq s/(s+1)$. Confidence bands building on existing nonparametric bands, but accounting for the shape constraint of bi-$s^*$-concavity, are also considered. The new bands extend those developed by Dümbgen et al. (2017) for the constraint of bi-log-concavity. We also make connections between bi-$s^*$-concavity and finiteness of the Csörgő–Révész constant of $F$, which plays an important role in the theory of quantile processes.
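As a concrete illustration of the map $s \mapsto s^* = s/(s+1)$ (a standard calculation added here for illustration, not quoted from the abstract): the Student-$t_\nu$ density is $s$-concave with $s = -1/(\nu+1)$, so the result above gives a bi-$s^*$-concave distribution function with $s^* = \frac{-1/(\nu+1)}{1 - 1/(\nu+1)} = -1/\nu$; letting $\nu \to \infty$ recovers $s = s^* = 0$, i.e. the bi-log-concave case of Dümbgen et al. (2017).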


Read also

We introduce a new shape-constrained class of distribution functions on $\mathbb{R}$, the bi-$s^*$-concave class. In parallel to results of Dümbgen, Kolesnyk, and Wilke (2017) for what they called the class of bi-log-concave distribution functions, we show that every $s$-concave density $f$ has a bi-$s^*$-concave distribution function $F$ and that every bi-$s^*$-concave distribution function satisfies $\gamma(F) \le 1/(1+s)$, where finiteness of $$ \gamma(F) \equiv \sup_{x} F(x)\,(1-F(x))\,\frac{|f'(x)|}{f^2(x)}, $$ the Csörgő–Révész constant of $F$, plays an important role in the theory of quantile processes on $\mathbb{R}$.
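A quick numerical sanity check of the bound $\gamma(F) \le 1/(1+s)$ can be done for the logistic distribution, which is log-concave ($s=0$), so the bound equals 1; for this law $f = F(1-F)$, hence $f' = f(1-2F)$ and $\gamma(F) = \sup_x |1-2F(x)| = 1$. The Python sketch below (an illustration, not code from the paper; the grid and endpoints are arbitrary choices) evaluates the supremum on a grid:

import numpy as np
from scipy.stats import logistic

x = np.linspace(-20.0, 20.0, 200001)          # wide grid; the supremum is approached as |x| grows
F = logistic.cdf(x)
f = logistic.pdf(x)
f_prime = f * (1.0 - 2.0 * F)                 # for the logistic law, f = F(1-F), so f' = f(1-2F)
gamma = np.max(F * (1.0 - F) * np.abs(f_prime) / f**2)
print(gamma)                                  # approximately 1.0, matching 1/(1+s) with s = 0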
Let $\{P_{\theta}: \theta \in \mathbb{R}^d\}$ be a log-concave location family with $P_{\theta}(dx)=e^{-V(x-\theta)}\,dx$, where $V:\mathbb{R}^d \mapsto \mathbb{R}$ is a known convex function, and let $X_1,\dots, X_n$ be i.i.d. r.v. sampled from distribution $P_{\theta}$ with an unknown location parameter $\theta$. The goal is to estimate the value $f(\theta)$ of a smooth functional $f:\mathbb{R}^d \mapsto \mathbb{R}$ based on observations $X_1,\dots, X_n$. In the case when $V$ is sufficiently smooth and $f$ is a functional from a ball in a Hölder space $C^s$, we develop estimators of $f(\theta)$ with minimax optimal error rates measured by the $L_2(\mathbb{P}_{\theta})$-distance as well as by more general Orlicz norm distances. Moreover, we show that if $d \leq n^{\alpha}$ and $s > \frac{1}{1-\alpha}$, then the resulting estimators are asymptotically efficient in the Hájek–LeCam sense with the convergence rate $\sqrt{n}$. This generalizes earlier results on estimation of smooth functionals in Gaussian shift models. The estimators have the form $f_k(\hat\theta)$, where $\hat\theta$ is the maximum likelihood estimator and $f_k: \mathbb{R}^d \mapsto \mathbb{R}$ (with $k$ depending on $s$) are functionals defined in terms of $f$ and designed to provide a higher order bias reduction in the functional estimation problem. The method of bias reduction is based on iterative parametric bootstrap and has been successfully used before in the case of Gaussian models.
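The bias-reduction idea can be sketched in a toy example. The snippet below performs a single step of parametric-bootstrap bias correction for a smooth functional of a location parameter in a Gaussian family; the family, the functional, and all constants are illustrative stand-ins, and the paper's estimators $f_k(\hat\theta)$ involve an iterated, higher-order version of this correction rather than the one-step form shown here.

import numpy as np

rng = np.random.default_rng(0)

def f(theta):                      # a smooth functional of the location parameter (toy choice)
    return np.sin(theta) ** 2

def mle(sample):                   # MLE of the location in the Gaussian location family
    return sample.mean()

n, B, theta0 = 50, 2000, 0.7
X = rng.normal(theta0, 1.0, size=n)
theta_hat = mle(X)

# Parametric bootstrap: resample from P_{theta_hat} and estimate the bias of f(theta_hat).
boot = np.array([f(mle(rng.normal(theta_hat, 1.0, size=n))) for _ in range(B)])
bias_hat = boot.mean() - f(theta_hat)

f1 = f(theta_hat) - bias_hat       # one-step bias-corrected estimator of f(theta0)
print(f(theta_hat), f1)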
We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback–Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by some hyperplane. Furthermore we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response $Y=\mu(X)+\epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions, while $\epsilon$ is a random error with log-concave density and mean zero.
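To make the Kullback–Leibler-type criterion concrete in a heavily simplified setting, the sketch below projects an empirical distribution onto the Gaussian subfamily (a small parametric subset of the log-concave class) by maximizing the average log-likelihood; the data-generating choice and all names are illustrative. In this restricted family the minimizer simply matches the first two moments, whereas the projection onto the full log-concave class studied in the paper is a much richer object.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)
data = rng.gamma(shape=3.0, scale=1.0, size=5000)    # a non-Gaussian P with finite first moment

def kl_criterion(params):                            # negative average log-likelihood under N(mu, sigma^2)
    mu, log_sigma = params
    return -np.mean(norm.logpdf(data, loc=mu, scale=np.exp(log_sigma)))

res = minimize(kl_criterion, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)                             # close to the sample mean and standard deviation
print(data.mean(), data.std())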
Tomohiro Nishiyama, 2019
Log-concave distributions include some important distributions such as the normal distribution and the exponential distribution. In this note, we show inequalities between two $L_p$-norms for log-concave distributions on Euclidean space. These inequalities are generalizations of the upper and lower bounds of the differential entropy and can also be interpreted as a kind of extension of the inequality between two $L_p$-norms on a measurable set of finite measure.
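As a small worked example of the quantities involved (the specific inequalities of the note are not reproduced here): for the standard exponential density $f(x)=e^{-x}$ on $[0,\infty)$, which is log-concave, $\|f\|_p = \big(\int_0^\infty e^{-px}\,dx\big)^{1/p} = p^{-1/p}$ for $p>0$, and the associated quantity $\frac{1}{1-p}\log\int f^p = \frac{\log p}{p-1}$ tends to $1$ as $p \to 1$, which is exactly the differential entropy of $f$; this limit is the sense in which bounds relating two $L_p$-norms of a log-concave density generalize bounds on its differential entropy.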
We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, that is, a density of the form $f_0=\exp\varphi_0$ where $\varphi_0$ is a concave function on $\mathbb{R}$. The pointwise limiting distributions depend on the second and third derivatives at 0 of $H_k$, the lower invelope of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of $\varphi_0=\log f_0$ at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode $M(f_0)$ and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.