
Inequalities between $L^p$-norms for log-concave distributions

Published by: Tomohiro Nishiyama
Publication date: 2019
Research field: Information engineering
Paper language: English


Log-concave distributions include several important distributions, such as the normal and exponential distributions. In this note, we show inequalities between two $L^p$-norms for log-concave distributions on Euclidean space. These inequalities generalize the upper and lower bounds of the differential entropy, and they can also be interpreted as an extension of the inequality between two $L^p$-norms on a measurable set of finite measure.
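As an illustrative sketch (not the note's specific inequalities), the $L^p$-norm of a log-concave density can be computed numerically and checked against a closed form. For the standard normal density $\varphi$, a direct Gaussian integral gives $\|\varphi\|_p = \big((2\pi)^{(1-p)/2}/\sqrt{p}\big)^{1/p}$; the helper names below are our own:

```python
import numpy as np

def lp_norm(f, p, lo=-10.0, hi=10.0, n=200_001):
    """Numerical L^p-norm (integral of |f|^p)^(1/p) on [lo, hi] by the trapezoid rule."""
    x = np.linspace(lo, hi, n)
    y = f(x) ** p
    h = x[1] - x[0]
    integral = h * (y.sum() - 0.5 * (y[0] + y[-1]))
    return integral ** (1.0 / p)

def phi(x):
    """Standard normal density: a canonical log-concave distribution."""
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

# Closed form for the standard normal: ||phi||_p = ((2*pi)^((1-p)/2) / sqrt(p))^(1/p).
for p in (1.0, 2.0, 4.0):
    closed = ((2 * np.pi) ** ((1 - p) / 2) / np.sqrt(p)) ** (1 / p)
    print(f"p={p}: numeric={lp_norm(phi, p):.6f}, closed-form={closed:.6f}")
```

The truncation at $\pm 10$ is harmless here because the Gaussian tails beyond that point are negligibly small.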


Read also

We establish concentration inequalities in the class of ultra log-concave distributions. In particular, we show that ultra log-concave distributions satisfy Poisson concentration bounds. As an application, we derive concentration bounds for the intrinsic volumes of a convex body, which generalizes and improves a result of Lotz, McCoy, Nourdin, Peccati, and Tropp (2019).
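One common formulation of ultra log-concavity (assumed here) says a nonnegative sequence $(a_k)$ is ultra log-concave when $k\,a_k^2 \ge (k+1)\,a_{k-1}a_{k+1}$ for all $k\ge 1$, and the Poisson pmf attains equality in every such ratio. A minimal numerical check, with `is_ultra_log_concave` a hypothetical helper of ours, not the paper's machinery:

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson probability mass function: lam^k * e^(-lam) / k!."""
    return lam ** k * exp(-lam) / factorial(k)

def is_ultra_log_concave(a):
    """Check k * a_k^2 >= (k+1) * a_{k-1} * a_{k+1} for all interior k
    (with a small relative tolerance for floating-point round-off)."""
    return all(
        k * a[k] ** 2 >= (k + 1) * a[k - 1] * a[k + 1] * (1 - 1e-9)
        for k in range(1, len(a) - 1)
    )

pmf = [poisson_pmf(3.0, k) for k in range(20)]
print(is_ultra_log_concave(pmf))  # True: Poisson attains equality in every ratio
```

A constant sequence, by contrast, fails the check, since $k < k+1$.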
We study the approximation of arbitrary distributions $P$ on $d$-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback--Leibler-type functional. We show that such an approximation exists if and only if $P$ has finite first moments and is not supported by some hyperplane. Furthermore, we show that this approximation depends continuously on $P$ with respect to the Mallows distance $D_1(\cdot,\cdot)$. This result implies consistency of the maximum likelihood estimator of a log-concave density under fairly general conditions. It also allows us to prove existence and consistency of estimators in regression models with a response $Y=\mu(X)+\epsilon$, where $X$ and $\epsilon$ are independent, $\mu(\cdot)$ belongs to a certain class of regression functions, while $\epsilon$ is a random error with log-concave density and mean zero.
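The Mallows distance $D_1$ used for the continuity result is the Wasserstein-1 distance, which on the real line has a simple closed form for two equal-size empirical distributions: the average gap between matched order statistics. A small sketch of that fact (our own illustration, not the paper's estimator):

```python
import random

def mallows_d1(xs, ys):
    """Mallows / Wasserstein-1 distance between two equal-size empirical
    distributions on the real line: mean absolute gap between order statistics."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# Shifting every sample point by c moves the empirical distribution by exactly c.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(1000)]
ys = [x + 0.5 for x in xs]
print(round(mallows_d1(xs, ys), 6))  # 0.5
```

The shift example makes the metric's behavior concrete: translation by $c$ costs exactly $c$ in $D_1$.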
We present a new approach for inference about a log-concave distribution: Instead of using the method of maximum likelihood, we propose to incorporate the log-concavity constraint in an appropriate nonparametric confidence set for the cdf $F$. This approach has the advantage that it automatically provides a measure of statistical uncertainty, and it thus overcomes a marked limitation of the maximum likelihood estimate. In particular, we show how to construct confidence bands for the density that have a finite sample guaranteed confidence level. The nonparametric confidence set for $F$ which we introduce here has attractive computational and statistical properties: It allows us to bring modern tools from optimization to bear on this problem via difference of convex programming, and it results in optimal statistical inference. We show that the width of the resulting confidence bands converges at nearly the parametric $n^{-\frac{1}{2}}$ rate when the log density is $k$-affine.
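For context, the classical baseline for a nonparametric confidence set for a cdf $F$ (not the paper's difference-of-convex construction) is the Dvoretzky-Kiefer-Wolfowitz band: with Massart's constant, $\sup_x |F_n(x)-F(x)| \le \sqrt{\ln(2/\alpha)/(2n)}$ holds with probability at least $1-\alpha$. A minimal sketch under that assumption:

```python
import math
import random

def dkw_band(sample, alpha=0.05):
    """Level-(1 - alpha) simultaneous confidence band for the cdf F via the
    Dvoretzky-Kiefer-Wolfowitz inequality with Massart's constant:
    half-width eps = sqrt(ln(2/alpha) / (2n))."""
    n = len(sample)
    eps = math.sqrt(math.log(2.0 / alpha) / (2.0 * n))
    xs = sorted(sample)
    # The empirical cdf at the i-th order statistic is (i+1)/n (0-indexed i).
    lower = [max(0.0, (i + 1) / n - eps) for i in range(n)]
    upper = [min(1.0, (i + 1) / n + eps) for i in range(n)]
    return xs, lower, upper, eps

random.seed(1)
xs, lower, upper, eps = dkw_band([random.gauss(0.0, 1.0) for _ in range(400)])
print(f"half-width eps = {eps:.4f} for n = 400")
```

Note the $n^{-1/2}$ scaling of the half-width, the same parametric rate the abstract's bands nearly attain under the $k$-affine condition.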
Let $C$ and $K$ be centrally symmetric convex bodies of volume $1$ in $\mathbb{R}^n$. We provide upper bounds for the multi-integral expression
\begin{equation*}
\|\mathbf{t}\|_{C^s,K}=\int_{C}\cdots\int_{C}\Big\|\sum_{j=1}^s t_j x_j\Big\|_K\,dx_1\cdots dx_s
\end{equation*}
in the case where $C$ is isotropic. Our approach provides an alternative proof of the sharp lower bound, due to Gluskin and V. Milman, for this quantity. We also present some applications to randomized vector balancing problems.
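The multi-integral above can be estimated by straightforward Monte Carlo. As an illustrative sketch (our choice of bodies, not the paper's setting), take $C = K = [-1/2,1/2]^n$, the unit-volume cube, whose Minkowski gauge is $\|x\|_K = 2\max_i |x_i|$:

```python
import random

def norm_cube(x):
    """Minkowski gauge of the unit-volume cube K = [-1/2, 1/2]^n."""
    return 2.0 * max(abs(c) for c in x)

def mc_multi_norm(t, n, trials=20_000, seed=0):
    """Monte Carlo estimate of ||t||_{C^s, K} with C = K = [-1/2, 1/2]^n:
    draw x_1, ..., x_s uniformly from C and average ||sum_j t_j x_j||_K."""
    rng = random.Random(seed)
    s = len(t)
    total = 0.0
    for _ in range(trials):
        # Each coordinate of each x_j is an independent Uniform(-1/2, 1/2) draw.
        y = [sum(t[j] * (rng.random() - 0.5) for j in range(s)) for _ in range(n)]
        total += norm_cube(y)
    return total / trials

print(mc_multi_norm(t=[1.0, 1.0], n=3))
```

A sanity check: for $s=1$ and $t=(1)$ the integral is $E[2\max_i|U_i|]$ with $|U_i|$ uniform on $[0,1/2]$, which equals $n/(n+1)$; the estimator recovers this within Monte Carlo error.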
We investigate various geometric and functional inequalities for the class of log-concave probability sequences. We prove dilation inequalities for log-concave probability measures on the integers. A functional analog of this geometric inequality is derived, giving large and small deviation inequalities from a median, in terms of a modulus of regularity. Our methods are of independent interest: we find that log-affine sequences are the extreme points of the set of log-concave sequences belonging to a half-space slice of the simplex. This amounts to a discrete analog of the localization lemma of Lovasz and Simonovits. Further applications of this lemma are used to produce a discrete version of the Prekopa-Leindler inequality, to derive large deviation inequalities for log-concave measures about their mean, and to provide insight into the stability of generalized log-concavity under convolution.
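Discrete log-concavity, $a_k^2 \ge a_{k-1}a_{k+1}$, and its stability under convolution can both be checked on a concrete example. A minimal sketch using binomial coefficients, the classic log-concave sequence (the helper name is our own):

```python
from math import comb

def is_log_concave(a):
    """A nonnegative sequence is log-concave when a_k^2 >= a_{k-1} * a_{k+1}."""
    return all(a[k] ** 2 >= a[k - 1] * a[k + 1] for k in range(1, len(a) - 1))

# Row 12 of Pascal's triangle is log-concave.
row = [comb(12, k) for k in range(13)]
print(is_log_concave(row))  # True

# Convolving two log-concave sequences stays log-concave; here the
# convolution of row 12 with itself is row 24 (Vandermonde's identity).
conv = [sum(row[i] * row[k - i] for i in range(max(0, k - 12), min(k, 12) + 1))
        for k in range(25)]
print(is_log_concave(conv))  # True
```

Because the arithmetic is exact over the integers, no floating-point tolerance is needed in this check.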