We consider the problem of bounding large deviations for non-i.i.d. random variables that are allowed to have arbitrary dependencies. Previous works typically assumed a specific dependence structure, namely the existence of independent components. Bounds that depend on the degree of dependence between the observations have only been studied in the theory of mixing processes, where the variables are time-ordered. Here, we introduce a new way of measuring dependence within an unordered set of variables. We prove concentration inequalities that apply to any set of random variables but benefit from the presence of weak dependencies. We also discuss applications and extensions of our results to related problems of machine learning and large deviations.
In this paper, we obtain some results on precise large deviations for non-random and random sums of widely dependent random variables with common dominatedly varying tail distribution or consistently varying tail distribution on $(-\infty,\infty)$. The
Let $\{X_k\}$ be a martingale difference sequence, and let $\{Y_k\}$ be another sequence with dependence within the sequence. Assuming $\{X_k\}$ is independent of $\{Y_k\}$, we study the properties of the sum of products of the two sequences, $\sum_{k=1}^{n} X_k Y_k$.
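A minimal Monte Carlo sketch of this object (not from the paper): the choice of i.i.d. $N(0,1)$ variables for $\{X_k\}$ (a simple martingale difference sequence) and an AR(1) recursion for $\{Y_k\}$ (a dependent sequence) are illustrative assumptions, as is the function name sum_of_products. Since $\{X_k\}$ is centered and independent of $\{Y_k\}$, the simulated sum should have mean near zero.

    import numpy as np

    rng = np.random.default_rng(0)

    def sum_of_products(n, rho=0.7, trials=10_000):
        """Simulate S_n = sum_k X_k Y_k with X an i.i.d. N(0,1) sequence
        (a simple martingale difference sequence) independent of an AR(1)
        sequence Y with autocorrelation rho (a dependent sequence)."""
        X = rng.standard_normal((trials, n))   # martingale differences
        eps = rng.standard_normal((trials, n))
        Y = np.zeros((trials, n))
        Y[:, 0] = eps[:, 0]
        for k in range(1, n):                  # AR(1): Y_k = rho*Y_{k-1} + eps_k
            Y[:, k] = rho * Y[:, k - 1] + eps[:, k]
        return (X * Y).sum(axis=1)

    S = sum_of_products(n=200)
    print(f"mean of S_n ~ {S.mean():.3f} (should be near 0)")
    print(f"var  of S_n ~ {S.var():.1f}")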
We provide a sharp lower bound on the $p$-norm of a sum of independent uniform random variables in terms of its variance when $0 < p < 1$. We address an analogous question for the $p$-Rényi entropy for $p$ in the same range.
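The following sketch only illustrates the two quantities being compared; it does not reproduce the paper's sharp constant. All parameter choices (Uniform$(-1,1)$ summands, the values of $n$ and $p$, the helper name p_norm_vs_std) are assumptions made for the demonstration; for this distribution $\mathrm{Var}(S) = n/3$.

    import numpy as np

    rng = np.random.default_rng(1)

    def p_norm_vs_std(n=5, p=0.5, trials=200_000):
        """Monte Carlo comparison of the p-norm (E|S|^p)^(1/p), 0 < p < 1,
        of S = sum of n independent Uniform(-1,1) variables with its
        standard deviation sqrt(Var S) = sqrt(n/3)."""
        S = rng.uniform(-1.0, 1.0, size=(trials, n)).sum(axis=1)
        p_norm = np.mean(np.abs(S) ** p) ** (1.0 / p)
        return p_norm, np.sqrt(n / 3.0)

    for p in (0.25, 0.5, 0.75):
        pn, sd = p_norm_vs_std(p=p)
        print(f"p={p}: (E|S|^p)^(1/p) ~ {pn:.3f}, sqrt(Var S) ~ {sd:.3f}, "
              f"ratio ~ {pn / sd:.3f}")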
Let $\{(X_i,Y_i)\}_{i=1}^n$ be a sequence of independent bivariate random vectors. In this paper, we establish a refined Cramér-type moderate deviation theorem for the general self-normalized sum $\sum_{i=1}^n X_i/(\sum_{i=1}^n Y_i^2)^{1/2}$, which unifies a
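A small sketch of the normal tail approximation that moderate deviation theorems of this kind quantify; it is not the paper's result. Setting $Y_i = X_i$ with $X_i \sim N(0,1)$ is an illustrative assumption (the classical Student-type case of the general sum), and the function name self_normalized_tail is hypothetical.

    import numpy as np
    from math import erfc, sqrt

    rng = np.random.default_rng(2)

    def self_normalized_tail(n=100, x=2.0, trials=100_000):
        """Monte Carlo check of the normal approximation for the
        self-normalized sum T_n = sum X_i / (sum Y_i^2)^(1/2),
        here in the classical case Y_i = X_i ~ N(0,1)."""
        X = rng.standard_normal((trials, n))
        Y = X                                # illustrative special case of general (X_i, Y_i)
        T = X.sum(axis=1) / np.sqrt((Y ** 2).sum(axis=1))
        empirical = np.mean(T > x)
        gaussian = 0.5 * erfc(x / sqrt(2.0)) # 1 - Phi(x)
        return empirical, gaussian

    emp, gauss = self_normalized_tail()
    print(f"P(T_n > 2) ~ {emp:.4f} vs 1 - Phi(2) ~ {gauss:.4f}")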
We compute the best constant in the Khintchine inequality under the assumption that the sum of the Rademacher random variables is zero.
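To make the conditioned setting concrete, here is a sketch that enumerates, for small even $n$, all sign vectors $\varepsilon \in \{-1,1\}^n$ with $\sum_i \varepsilon_i = 0$ and evaluates the Khintchine ratio $(\mathbb{E}|\sum_i a_i \varepsilon_i|^p)^{1/p} / (\sum_i a_i^2)^{1/2}$ for a few coefficient vectors. This only probes the ratio numerically; the paper's best constant is its infimum over all $a$, which is not computed here. The coefficient vectors and the name khintchine_ratio are illustrative.

    import itertools
    import numpy as np

    def khintchine_ratio(a, p=1.0):
        """For coefficients a with even n = len(a), compute
        (E|sum a_i eps_i|^p)^(1/p) / (sum a_i^2)^(1/2), where eps is
        uniform over sign vectors with eps_1 + ... + eps_n = 0."""
        a = np.asarray(a, dtype=float)
        n = len(a)
        signs = [e for e in itertools.product((-1, 1), repeat=n) if sum(e) == 0]
        moment = np.mean([abs(np.dot(a, e)) ** p for e in signs])
        return moment ** (1.0 / p) / np.linalg.norm(a)

    # Probe the ratio for a few coefficient vectors (n = 4); the best
    # constant in the inequality is the infimum of this ratio over all a.
    for a in ([1, 1, 1, 1], [1, 0.5, 0.5, 0.1], [1, 1, 0.01, 0.01]):
        print(a, "->", round(khintchine_ratio(a, p=1.0), 4))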