We introduce a stochastic version of Taylor's expansion and of the Mean Value Theorem, originally proved by Aliprantis and Border (1999), and extend them to the multivariate case. In the univariate case, the theorem asserts the following: suppose a real-valued function $f$ has a continuous derivative $f'$ on a closed interval $I$, and $X$ is a random variable on a probability space $(\Omega, \mathcal{F}, P)$. Then, for any fixed $a \in I$, there exists a \textit{random variable} $\xi$ such that $\xi(\omega) \in I$ for every $\omega \in \Omega$ and $f(X(\omega)) = f(a) + f'(\xi(\omega))(X(\omega) - a)$. The proof is not trivial. Applying these results in statistics simplifies some details in the proofs of the Delta method and of the asymptotic properties of maximum likelihood estimators. In particular, when one asserts that there exists $\theta^*$ between $\hat{\theta}$ (a maximum likelihood estimator) and $\theta_0$ (the true value), the stochastic version of the Mean Value Theorem guarantees that $\theta^*$ is a random variable (or a random vector).
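As a sketch of how this measurability guarantee enters the Delta method (notation here is illustrative, not from the abstract): let $T_n$ be an estimator with $\sqrt{n}\,(T_n - \theta_0) \xrightarrow{d} N(0, \sigma^2)$ and let $g$ be continuously differentiable at $\theta_0$. The stochastic Mean Value Theorem gives

```latex
\sqrt{n}\,\bigl(g(T_n) - g(\theta_0)\bigr)
   = g'(\xi_n)\,\sqrt{n}\,(T_n - \theta_0),
\qquad \xi_n(\omega) \text{ between } T_n(\omega) \text{ and } \theta_0,
```

with $\xi_n$ guaranteed to be a random variable, so statements such as $\xi_n \xrightarrow{P} \theta_0$ are well posed; continuity of $g'$ and Slutsky's theorem then yield $\sqrt{n}\,(g(T_n) - g(\theta_0)) \xrightarrow{d} N\bigl(0, g'(\theta_0)^2 \sigma^2\bigr)$. Without the stochastic version, the intermediate point $\xi_n$ is only known to exist pointwise for each $\omega$, and its measurability is usually left implicit.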
Common statistical measures of uncertainty such as $p$-values and confidence intervals quantify the uncertainty due to sampling, that is, the uncertainty due to not observing the full population. However, sampling is not the only source of uncertainty.
This paper studies the related problems of prediction, covariance estimation, and principal component analysis for the spiked covariance model with heteroscedastic noise. We consider an estimator of the principal components based on whitening the noise.
In the context of stability of the extremes of a random variable $X$ with respect to a positive-integer-valued random variable $N$, we discuss the cases where (i) $X$ is exponential, (ii) $N$ follows a non-geometric law, and (iii) $N$ is to be identified for the stability of a given $X$.
The singular value decomposition (SVD) of large-scale matrices is a key tool in data analytics and scientific computing. The rapid growth in the size of matrices further increases the need for developing efficient large-scale SVD algorithms. Randomized algorithms offer one efficient approach to computing approximate SVDs at scale.
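The abstract above does not give its specific algorithm, but the basic randomized SVD it alludes to can be sketched as follows (a minimal NumPy illustration in the style of Halko, Martinsson, and Tropp; the function name and oversampling parameter `p` are our own choices):

```python
import numpy as np

def randomized_svd(A, k, p=10, seed=0):
    """Approximate rank-k SVD of A via random range sampling.

    Projects A onto a random (k+p)-dimensional subspace, orthonormalizes
    the sample, then computes a small exact SVD of the reduced matrix.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # random test matrix
    Y = A @ Omega                             # sample of the range of A
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis for range(Y)
    B = Q.T @ A                               # small (k+p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub                                # lift back to the original space
    return U[:, :k], s[:k], Vt[:k, :]

# Usage: a 200 x 100 matrix of exact rank 6 is recovered almost exactly,
# since k + p exceeds the true rank.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 6)) @ rng.standard_normal((6, 100))
U, s, Vt = randomized_svd(A, k=6)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A)
```

The key cost saving is that the only operations touching the full matrix are two matrix products with thin matrices; the exact SVD is performed on the much smaller matrix `B`.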
Let $\{(X_i, Y_i)\}_{i=1}^{n}$ be a sequence of independent bivariate random vectors. In this paper, we establish a refined Cramér-type moderate deviation theorem for the general self-normalized sum $\sum_{i=1}^{n} X_i / (\sum_{i=1}^{n} Y_i^2)^{1/2}$, which unifies and extends earlier results.