
Dependency-dependent Bounds for Sums of Dependent Random Variables

Added by Alexander Zimin
Publication date: 2018
Language: English





We consider the problem of bounding large deviations for non-i.i.d. random variables that are allowed to have arbitrary dependencies. Previous works typically assumed a specific dependence structure, namely the existence of independent components. Bounds that depend on the degree of dependence between the observations have so far been studied only in the theory of mixing processes, where the variables are time-ordered. Here, we introduce a new way of measuring dependence within an unordered set of variables. We prove concentration inequalities that apply to any set of random variables but benefit from the presence of weak dependencies. We also discuss applications and extensions of our results to related problems of machine learning and large deviations.
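The phenomenon the abstract addresses can be seen in a small simulation. The sketch below (an illustration of why dependence-aware bounds matter, not the paper's construction) compares how fast the empirical mean of equicorrelated Gaussians concentrates for independent versus correlated variables; the function name and the equicorrelated model are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_prob(n, rho, t, trials=20000):
    # Equicorrelated Gaussians: X_i = sqrt(rho)*Z + sqrt(1-rho)*W_i,
    # so every pair has correlation rho (rho = 0 gives i.i.d. variables).
    z = rng.standard_normal((trials, 1))
    w = rng.standard_normal((trials, n))
    x = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * w
    s = x.mean(axis=1)                 # empirical mean of the n variables
    return np.mean(np.abs(s) > t)      # empirical large-deviation probability

p_indep = tail_prob(n=200, rho=0.0, t=0.2)
p_dep   = tail_prob(n=200, rho=0.5, t=0.2)
print(p_indep, p_dep)  # dependence dramatically slows concentration
```

With `rho = 0` the classical i.i.d. concentration bounds apply and the tail probability is tiny; with `rho = 0.5` the shared component keeps the mean from concentrating at all, which is exactly the regime where bounds that interpolate with the degree of dependence become useful.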




Read More

Zhaolei Cui, Yuebao Wang (2021)
In this paper, we obtain some results on precise large deviations for non-random and random sums of widely dependent random variables with common dominatedly varying tail distribution or consistently varying tail distribution on $(-\infty,\infty)$. Then we apply the results to reinsurance and insurance and give some asymptotic estimates on proportional reinsurance, random-time ruin probability and the finite-time ruin probability.
Let $\{X_k\}$ be a martingale difference sequence and let $\{Y_k\}$ be another sequence with dependence within the sequence. Assuming that $\{X_k\}$ is independent of $\{Y_k\}$, we study the properties of the sum of products of the two sequences, $\sum_{k=1}^{n} X_k Y_k$. We obtain a product-CLT, a modification of the classical central limit theorem, which can be useful in the study of random projections. We also obtain a rate of convergence analogous to the Berry-Esseen theorem in the classical CLT.
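The product-CLT described above can be checked numerically. The sketch below (a minimal illustration, not the authors' proof setup) takes $X_k$ to be i.i.d. Rademacher signs, which form a simple martingale difference sequence, and $Y_k$ to be an AR(1) chain as a deliberately dependent sequence independent of $X$; the AR(1) choice and the parameter `phi` are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(1)

def product_sum(n, trials=20000, phi=0.8):
    # X_k: i.i.d. Rademacher signs -- a martingale difference sequence.
    x = rng.choice([-1.0, 1.0], size=(trials, n))
    # Y_k: an AR(1) chain -- dependent within the sequence, independent of X.
    y = np.zeros((trials, n))
    eps = rng.standard_normal((trials, n))
    for k in range(1, n):
        y[:, k] = phi * y[:, k - 1] + eps[:, k]
    # Normalized sum of products; the product-CLT suggests this is
    # approximately Gaussian despite the dependence inside {Y_k}.
    return (x * y).sum(axis=1) / np.sqrt(n)

s = product_sum(n=500)
print(s.mean(), s.std())  # mean near 0; spread set by the scale of Y
```

The empirical mean sits near zero and the distribution looks Gaussian with standard deviation governed by the stationary second moment of $Y_k$, consistent with a CLT for the product sequence even though $\{Y_k\}$ itself is far from independent.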
We provide a sharp lower bound on the $p$-norm of a sum of independent uniform random variables in terms of its variance when $0 < p < 1$. We address an analogous question for the $p$-Rényi entropy for $p$ in the same range.
Let $\{(X_i,Y_i)\}_{i=1}^n$ be a sequence of independent bivariate random vectors. In this paper, we establish a refined Cramér-type moderate deviation theorem for the general self-normalized sum $\sum_{i=1}^n X_i/(\sum_{i=1}^n Y_i^2)^{1/2}$, which unifies and extends the classical Cramér (1938) theorem and the self-normalized Cramér-type moderate deviation theorems by Jing, Shao and Wang (2003), as well as the further refined version by Wang (2011). The advantage of our result is evidenced through successful applications to weakly dependent random variables and the self-normalized winsorized mean. Specifically, by applying our new framework to the general self-normalized sum, we significantly improve Cramér-type moderate deviation theorems for one-dependent random variables, geometrically beta-mixing random variables, and causal processes under geometric moment contraction. As an additional application, we also derive Cramér-type moderate deviation theorems for the self-normalized winsorized mean.
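As a quick numerical illustration of the self-normalized sum in the special case $Y_i = X_i$ (a choice made for this sketch, not the general setting of the paper), the statistic below has tails close to the standard normal, which is the comparison Cramér-type results make precise.

```python
import numpy as np

rng = np.random.default_rng(2)

# Self-normalized sum sum(X_i) / sqrt(sum(Y_i^2)) with Y_i = X_i.
n, trials = 50, 20000
x = rng.standard_normal((trials, n))
t_stat = x.sum(axis=1) / np.sqrt((x ** 2).sum(axis=1))

# Cramér-type moderate deviation theorems compare this empirical tail
# with the standard normal tail 1 - Phi(2) (about 0.023).
emp_tail = np.mean(t_stat > 2.0)
print(emp_tail)
```

Even at a moderate sample size the empirical tail probability is already of the same order as the Gaussian tail; the theorems cited above quantify how far into the tail this approximation remains valid.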
We compute the best constant in the Khintchine inequality under the assumption that the sum of the Rademacher random variables is zero.