
Skewness correction in tail probability approximations for sums of local statistics

Posted by: Xiao Fang
Published: 2019
Language: English





Correcting for skewness can result in more accurate tail probability approximations in the central limit theorem for sums of independent random variables. In this paper, we extend the theory to sums of local statistics of independent random variables and apply the result to $k$-runs, U-statistics, and subgraph counts in the Erdős-Rényi random graph. To prove our main result, we develop exponential concentration inequalities and higher-order Cramér-type moderate deviations via Stein's method.
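To illustrate the kind of correction the abstract refers to, the sketch below applies the classical one-term Edgeworth (skewness) correction to the normal tail approximation for a standardized sum of i.i.d. random variables: $P(S_n > x) \approx 1 - \Phi(x) + \frac{\gamma}{6\sqrt{n}}(x^2 - 1)\varphi(x)$, where $\gamma$ is the skewness of a single summand. This is the textbook independent-sums case only, not the paper's extension to local statistics; the choice of exponential summands and the sample sizes are arbitrary for illustration.

```python
import math
import random

def normal_tail(x):
    # 1 - Phi(x), the standard normal upper tail, via erfc
    return 0.5 * math.erfc(x / math.sqrt(2))

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def corrected_tail(x, skew, n):
    # One-term Edgeworth correction:
    # P(S_n > x) ~ 1 - Phi(x) + skew/(6*sqrt(n)) * (x^2 - 1) * phi(x)
    return normal_tail(x) + skew / (6 * math.sqrt(n)) * (x * x - 1) * phi(x)

# Exponential(1) summands: mean 1, variance 1, skewness 2.
n, x, skew = 30, 2.0, 2.0

# Monte Carlo estimate of the true tail probability of the
# standardized sum (S_n - n*mean) / sqrt(n*var).
random.seed(0)
trials = 100_000
hits = sum(
    (sum(random.expovariate(1.0) for _ in range(n)) - n) / math.sqrt(n) > x
    for _ in range(trials)
)
empirical = hits / trials

print(f"empirical tail : {empirical:.4f}")
print(f"normal approx  : {normal_tail(x):.4f}")
print(f"with skew corr.: {corrected_tail(x, skew, n):.4f}")
```

With positively skewed summands the true upper tail is heavier than the normal approximation predicts, and the corrected value typically lands noticeably closer to the simulated probability.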




Read also

Aihua Xia, J. E. Yukich (2014)
This paper concerns the asymptotic behavior of a random variable $W_\lambda$ resulting from the summation of functionals of a Gibbsian spatial point process over windows $Q_\lambda \uparrow \mathbb{R}^d$. We establish conditions ensuring that $W_\lambda$ has volume-order fluctuations, that is, they coincide with the fluctuations of functionals of Poisson spatial point processes. We combine this result with Stein's method to deduce rates of normal approximation for $W_\lambda$ as $\lambda \to \infty$. Our general results establish variance asymptotics and central limit theorems for statistics of random geometric and related Euclidean graphs on Gibbsian input. We also establish similar limit theory for claim sizes of insurance models with Gibbsian input, the number of maximal points of a Gibbsian sample, and the size of spatial birth-growth models with Gibbsian input.
We employ stabilization methods and second-order Poincaré inequalities to establish rates of multivariate normal convergence for a large class of vectors $(H_s^{(1)},\dots,H_s^{(m)})$, $s \geq 1$, of statistics of marked Poisson processes on $\mathbb{R}^d$, $d \geq 2$, as the intensity parameter $s$ tends to infinity. Our results are applicable whenever the constituent functionals $H_s^{(i)}$, $i \in \{1,\dots,m\}$, are expressible as sums of exponentially stabilizing score functions satisfying a moment condition. The rates are for the $d_2$-, $d_3$-, and $d_{\mathrm{convex}}$-distances. When we compare with a centered Gaussian random vector whose covariance matrix is given by the asymptotic covariances, the rates are in general unimprovable and are governed by the rate of convergence of $s^{-1}\,\mathrm{Cov}(H_s^{(i)}, H_s^{(j)})$, $i,j \in \{1,\dots,m\}$, to the limiting covariance, shown to be of order $s^{-1/d}$. We use the general results to deduce rates of multivariate normal convergence for statistics arising in random graphs and topological data analysis, as well as for multivariate statistics used to test equality of distributions. Some of our results hold for stabilizing functionals of Poisson input on suitable metric spaces.
This paper concerns the approximation of probability measures on $\mathbf{R}^d$ with respect to the Kullback-Leibler divergence. Given an admissible target measure, we show the existence of the best approximation, with respect to this divergence, from certain sets of Gaussian measures and Gaussian mixtures. The asymptotic behavior of such best approximations is then studied in the small-parameter limit where the measure concentrates; this asymptotic behaviour is characterized using $\Gamma$-convergence. The theory developed is then applied to understanding the frequentist consistency of Bayesian inverse problems. For a fixed realization of noise, we show the asymptotic normality of the posterior measure in the small-noise limit. Taking into account the randomness of the noise, we prove a Bernstein-von Mises type result for the posterior measure.
Luc Devroye, Gabor Lugosi (2007)
It is shown that functions defined on $\{0,1,\dots,r-1\}^n$ satisfying certain bounded-differences conditions that guarantee sub-Gaussian tail behavior also satisfy a much stronger "local" sub-Gaussian property. For self-bounding and configuration functions we derive analogous locally subexponential behavior. The key tool is Talagrand's [Ann. Probab. 22 (1994) 1576-1587] variance inequality for functions defined on the binary hypercube, which we extend to functions of uniformly distributed random variables defined on $\{0,1,\dots,r-1\}^n$ for $r \geq 2$.
Based on a discrete version of the Pollaczek-Khinchine formula, a general method to calculate the ultimate ruin probability in the Gerber-Dickson risk model is provided when claims follow a negative binomial mixture distribution. The result is then extended to claims with a mixed Poisson distribution. The formula obtained allows for some approximation procedures. Several examples are provided, along with numerical evidence of the accuracy of the approximations.