
A Note on Taylor's Expansion and the Mean Value Theorem With Respect to a Random Variable

Published by: Yifan Yang
Publication date: 2021
Research field: Mathematical Statistics
Paper language: English





We introduce a stochastic version of Taylor's expansion and the Mean Value Theorem, originally proved by Aliprantis and Border (1999), and extend them to the multivariate case. In the univariate case, the theorem asserts the following: suppose a real-valued function $f$ has a continuous derivative $f'$ on a closed interval $I$ and $X$ is a random variable on a probability space $(\Omega, \mathcal{F}, P)$. Fix $a \in I$; then there exists a \textit{random variable} $\xi$ such that $\xi(\omega) \in I$ for every $\omega \in \Omega$ and $f(X(\omega)) = f(a) + f'(\xi(\omega))(X(\omega) - a)$. The proof is not trivial. By applying these results in statistics, one may simplify some details in the proofs of the Delta method or of the asymptotic properties of a maximum likelihood estimator. In particular, when one states that there exists $\theta^*$ between $\hat{\theta}$ (a maximum likelihood estimator) and $\theta_0$ (the true value), the stochastic version of the Mean Value Theorem guarantees that $\theta^*$ is a random variable (or a random vector).
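As a small numerical illustration (not taken from the paper), choose $f(x) = x^2$ on $I = [0, 2]$ with $a = 1$. For this particular $f$ the mean-value identity can be solved in closed form, giving $\xi(\omega) = (X(\omega) + a)/2$, which is visibly a measurable function of $X$ and hence itself a random variable, as the theorem asserts. A minimal Python sketch under these assumed choices:

```python
import numpy as np

# Illustrative choices (not from the paper): f(x) = x^2 on I = [0, 2], a = 1.
f = lambda x: x ** 2
fprime = lambda x: 2 * x  # f'(x)
a = 1.0

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=10_000)  # a random variable taking values in I

# For f(x) = x^2, solving f(X) - f(a) = f'(xi)(X - a) pointwise gives
# xi = (X + a) / 2, a measurable function of X -- so xi is a random variable.
xi = (X + a) / 2.0

# The mean-value identity holds at every sample point omega ...
assert np.allclose(f(X), f(a) + fprime(xi) * (X - a))
# ... and xi(omega) stays inside I = [0, 2] for every omega.
assert np.all((xi >= 0.0) & (xi <= 2.0))
```

For a general $f$ no closed form for $\xi$ exists, and the substance of the theorem is precisely that a measurable selection $\xi$ can still be made; the quadratic case only makes the measurability concrete.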




Read also

Common statistical measures of uncertainty such as $p$-values and confidence intervals quantify the uncertainty due to sampling, that is, the uncertainty due to not observing the full population. However, sampling is not the only source of uncertainty. In practice, distributions change between locations and across time. This makes it difficult to gather knowledge that transfers across data sets. We propose a measure of uncertainty or instability that quantifies the distributional instability of a statistical parameter with respect to Kullback-Leibler divergence, that is, the sensitivity of the parameter under general distributional perturbations within a Kullback-Leibler divergence ball. In addition, we propose measures to elucidate the instability of parameters with respect to directional or variable-specific shifts. Measuring instability with respect to directional shifts can be used to detect the type of shifts a parameter is sensitive to. We discuss how such knowledge can inform data collection for improved estimation of statistical parameters under shifted distributions. We evaluate the performance of the proposed measure on real data and show that it can elucidate the distributional (in-)stability of a parameter with respect to certain shifts and can be used to improve the accuracy of estimation under shifted distributions.
This paper studies the related problems of prediction, covariance estimation, and principal component analysis for the spiked covariance model with heteroscedastic noise. We consider an estimator of the principal components based on whitening the noise, and we derive optimal singular value and eigenvalue shrinkers for use with these estimated principal components. Underlying these methods are new asymptotic results for the high-dimensional spiked model with heteroscedastic noise, and consistent estimators for the relevant population parameters. We extend previous analysis on out-of-sample prediction to the setting of predictors with whitening. We demonstrate certain advantages of noise whitening. Specifically, we show that in a certain asymptotic regime, optimal singular value shrinkage with whitening converges to the best linear predictor, whereas without whitening it converges to a suboptimal linear predictor. We prove that for generic signals, whitening improves estimation of the principal components, and increases a natural signal-to-noise ratio of the observations. We also show that for rank-one signals, our estimated principal components achieve the asymptotic minimax rate.
In the context of stability of the extremes of a random variable $X$ with respect to a positive-integer-valued random variable $N$, we discuss the cases: (i) $X$ is exponential, (ii) non-geometric laws for $N$, (iii) identifying $N$ for the stability of a given $X$, and (iv) extending the notion to a discrete random variable $X$.
The singular value decomposition (SVD) of large-scale matrices is a key tool in data analytics and scientific computing. The rapid growth in the size of matrices further increases the need for developing efficient large-scale SVD algorithms. Randomized SVD based on one-time sketching has been studied, and its potential has been demonstrated for computing a low-rank SVD. Instead of exploring different single random sketching techniques, we propose a Monte Carlo type integrated SVD algorithm based on multiple random sketches. The proposed integration algorithm takes multiple random sketches and then integrates the results obtained from the multiple sketched subspaces, so that the integrated SVD achieves higher accuracy and lower stochastic variation. The main component of the integration is an optimization problem with a matrix Stiefel manifold constraint. The optimization problem is solved using Kolmogorov-Nagumo-type averages. Our theoretical analyses show that the singular vectors can be induced by population averaging and ensure the consistency between the computed and true subspaces and singular vectors. Statistical analysis further proves a strong Law of Large Numbers and gives a rate of convergence by the Central Limit Theorem. Preliminary numerical results suggest that the proposed integrated SVD algorithm is promising.
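For intuition about the one-time-sketching baseline that the integrated algorithm builds on (the Stiefel-manifold integration step itself is more involved and not reproduced here), a minimal single-sketch randomized SVD in Python; the Gaussian sketch, oversampling amount, and test matrix are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    """One-time Gaussian sketching: project A onto a random subspace,
    orthonormalize the sketched range, then take an exact SVD of the
    small projected matrix."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    Omega = rng.standard_normal((n, rank + oversample))  # random sketch matrix
    Q, _ = np.linalg.qr(A @ Omega)        # orthonormal basis for sketched range
    B = Q.T @ A                           # small (rank + oversample) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :rank], s[:rank], Vt[:rank]

# Illustrative exactly-rank-5 matrix, so the sketch captures its range.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 100))
U, s, Vt = randomized_svd(A, rank=5)
# The rank-5 reconstruction recovers A to near machine precision here.
assert np.allclose(U @ np.diag(s) @ Vt, A)
```

The proposed method would run this kind of sketch several times with independent `Omega` draws and then average the resulting subspaces on the Stiefel manifold, which is what reduces the stochastic variation of a single sketch.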
Let $\{(X_i, Y_i)\}_{i=1}^n$ be a sequence of independent bivariate random vectors. In this paper, we establish a refined Cramér type moderate deviation theorem for the general self-normalized sum $\sum_{i=1}^n X_i / (\sum_{i=1}^n Y_i^2)^{1/2}$, which unifies and extends the classical Cramér (1938) theorem and the self-normalized Cramér type moderate deviation theorems by Jing, Shao and Wang (2003) as well as the further refined version by Wang (2011). The advantage of our result is evidenced through successful applications to weakly dependent random variables and the self-normalized winsorized mean. Specifically, by applying our new framework to the general self-normalized sum, we significantly improve Cramér type moderate deviation theorems for one-dependent random variables, geometrically beta-mixing random variables, and causal processes under geometric moment contraction. As an additional application, we also derive Cramér type moderate deviation theorems for the self-normalized winsorized mean.


