
Multi Anchor Point Shrinkage for the Sample Covariance Matrix (Extended Version)

Added by Alec Kercheval
Publication date: 2021
Field: Financial
Language: English





Portfolio managers faced with limited sample sizes must use factor models to estimate the covariance matrix of a high-dimensional returns vector. For the simplest one-factor market model, success rests on the quality of the estimated leading eigenvector, beta. When only the returns themselves are observed, the practitioner has available the PCA estimate: the leading eigenvector of the sample covariance matrix. This estimator performs poorly in several respects. To address this problem in the high-dimension, limited-sample-size asymptotic regime, and in the context of estimating the minimum variance portfolio, Goldberg, Papanicolaou, and Shkolnik developed a shrinkage method (the GPS estimator) that improves the PCA estimator of beta by shrinking it toward a constant target unit vector. In this paper we continue their work to develop a more general framework of shrinkage targets that allows the practitioner to make use of further information to improve the estimator. Examples include sector separation of stock betas and recent information from prior estimates. We prove some precise statements and illustrate the resulting improvements over the GPS estimator with numerical experiments.
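To make the shrinkage step concrete, here is a minimal numerical sketch of anchor-based eigenvector shrinkage. It is an illustration only, not the authors' estimator: the function names are hypothetical, the shrinkage intensity rho is left as a free parameter (the GPS construction derives a data-driven value), and the anchor q can be the constant GPS target or any other unit vector:

    import numpy as np

    def pca_beta(returns):
        # PCA estimate of beta: leading eigenvector of the sample covariance matrix.
        sample_cov = np.cov(returns, rowvar=True)        # p x p for p assets
        eigvals, eigvecs = np.linalg.eigh(sample_cov)
        h = eigvecs[:, -1]                               # eigenvector of the largest eigenvalue
        return h if h.sum() >= 0 else -h                 # fix the sign convention

    def shrink_toward_anchor(h, q, rho):
        # Convex combination of the PCA eigenvector h and a unit anchor q,
        # renormalized to the unit sphere; rho = 0 returns h, rho = 1 returns q.
        h_rho = (1.0 - rho) * h + rho * q
        return h_rho / np.linalg.norm(h_rho)

    # Toy one-factor data: p = 500 assets, n = 60 observations (p >> n).
    rng = np.random.default_rng(0)
    p, n = 500, 60
    beta_true = 1.0 + 0.3 * rng.standard_normal(p)
    returns = np.outer(beta_true, rng.standard_normal(n)) + 0.5 * rng.standard_normal((p, n))

    q_gps = np.ones(p) / np.sqrt(p)                      # the constant GPS anchor
    beta_hat = shrink_toward_anchor(pca_beta(returns), q_gps, rho=0.3)

A sector-aware target in the spirit of the multi-anchor framework would replace q_gps with a unit vector that is blockwise constant across sectors.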



Related research

The asymptotic normality for a large family of eigenvalue statistics of a general sample covariance matrix is derived under the ultra-high dimensional setting, that is, when the dimension to sample size ratio $p/n \to \infty$. Based on this CLT result, we first adapt the covariance matrix test problem to the new ultra-high dimensional context. Then as a second application, we develop a new test for the separable covariance structure of a matrix-valued white noise. Simulation experiments are conducted for the investigation of finite-sample properties of the general asymptotic normality of eigenvalue statistics, as well as the second test for separable covariance structure of matrix-valued white noise.
We establish a quantitative version of the Tracy--Widom law for the largest eigenvalue of high dimensional sample covariance matrices. To be precise, we show that the fluctuations of the largest eigenvalue of a sample covariance matrix $X^*X$ converge to its Tracy--Widom limit at a rate nearly $N^{-1/3}$, where $X$ is an $M \times N$ random matrix whose entries are independent real or complex random variables, assuming that both $M$ and $N$ tend to infinity at a constant rate. This result improves the previous estimate $N^{-2/9}$ obtained by Wang [73]. Our proof relies on a Green function comparison method [27] using iterative cumulant expansions, the local laws for the Green function and asymptotic properties of the correlation kernel of the white Wishart ensemble.
We seek to improve estimates of the power spectrum covariance matrix from a limited number of simulations by employing a novel statistical technique known as shrinkage estimation. The shrinkage technique optimally combines an empirical estimate of the covariance with a model (the target) to minimize the total mean squared error compared to the true underlying covariance. We test this technique on N-body simulations and evaluate its performance by estimating cosmological parameters. Using a simple diagonal target, we show that the shrinkage estimator significantly outperforms both the empirical covariance and the target individually when using a small number of simulations. We find that reducing noise in the covariance estimate is essential for properly estimating the values of cosmological parameters as well as their confidence intervals. We extend our method to the jackknife covariance estimator and again find significant improvement, though the simulation-based estimates remain better. Even for thousands of simulations we still find evidence that our method improves estimation of the covariance matrix. Because our method is simple, requires negligible additional numerical effort, and produces superior results, we always advocate shrinkage estimation for the covariance of the power spectrum and other large-scale structure measurements when purely theoretical modeling of the covariance is insufficient.
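As a rough sketch of the linear shrinkage described above (function names are hypothetical, and the shrinkage weight is left as a free parameter rather than the MSE-minimizing value the paper estimates):

    import numpy as np

    def shrink_covariance(samples, target, rho):
        # Convex combination of the empirical covariance and a model target;
        # rho = 0 keeps the noisy empirical estimate, rho = 1 keeps the target.
        emp = np.cov(samples, rowvar=False)
        return (1.0 - rho) * emp + rho * target

    # Toy usage: few "simulations" (rows) relative to the number of bins (columns),
    # with a simple diagonal target as in the abstract above.
    rng = np.random.default_rng(1)
    n_sims, n_bins = 30, 100
    draws = rng.standard_normal((n_sims, n_bins))
    diag_target = np.diag(np.var(draws, axis=0, ddof=1))
    cov_hat = shrink_covariance(draws, diag_target, rho=0.5)

With any positive weight on the positive-definite diagonal target, the combined estimate is full rank even when n_sims < n_bins, which is what stabilizes the downstream parameter fits.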
We gather several results on the eigenvalues of the spatial sign covariance matrix of an elliptical distribution. It is shown that the eigenvalues are a one-to-one function of the eigenvalues of the shape matrix and that they are closer together than the latter. We further provide a one-dimensional integral representation of the eigenvalues, which facilitates their numerical computation.
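A minimal sketch of the spatial sign covariance matrix itself (assuming a known center; in practice a robust location estimate such as the spatial median is common), illustrating that its eigenvalues are pulled closer together than those of the shape matrix:

    import numpy as np

    def spatial_sign_covariance(x, center):
        # Average outer product of observations projected onto the unit sphere.
        y = x - center
        norms = np.linalg.norm(y, axis=1, keepdims=True)
        s = y / np.where(norms > 0, norms, 1.0)          # spatial signs
        return s.T @ s / len(x)

    rng = np.random.default_rng(2)
    shape = np.diag([4.0, 1.0, 0.25])                    # shape matrix of the ellipse
    x = rng.multivariate_normal(np.zeros(3), shape, size=20000)
    print(np.linalg.eigvalsh(spatial_sign_covariance(x, np.zeros(3))))
    # The three eigenvalues sum to 1 and are less spread out than 4, 1, 0.25.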
We consider a $p$-dimensional time series where the dimension $p$ increases with the sample size $n$. The resulting data matrix $X$ follows a stochastic volatility model: each entry consists of a positive random volatility term multiplied by an independent noise term. The volatility multipliers introduce dependence in each row and across the rows. We study the asymptotic behavior of the eigenvalues and eigenvectors of the sample covariance matrix $XX^\top$ under a regular variation assumption on the noise. In particular, we prove Poisson convergence for the point process of the centered and normalized eigenvalues and derive limit theory for functionals acting on them, such as the trace. We prove related results for stochastic volatility models with additional linear dependence structure and for stochastic volatility models where the time-varying volatility terms are extinguished with high probability when $n$ increases. We provide explicit approximations of the eigenvectors which are of a strikingly simple structure. The main tools for proving these results are large deviation theorems for heavy-tailed time series, advocating a unified approach to the study of the eigenstructure of heavy-tailed random matrices.
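A small simulation sketch of the localization phenomenon behind those eigenvector approximations (the model here is a simple placeholder, not the paper's exact setup): with regularly varying noise, the top eigenvalues of $XX^\top$ track the largest squared row norms and the leading eigenvectors concentrate on single coordinates.

    import numpy as np

    rng = np.random.default_rng(3)
    p, n, alpha = 200, 1000, 1.5                         # tail index alpha < 2: infinite variance
    noise = rng.pareto(alpha, (p, n)) * rng.choice([-1.0, 1.0], (p, n))
    vol = np.exp(0.5 * rng.standard_normal((p, n)))      # a toy stochastic volatility term
    X = vol * noise

    eigvals, eigvecs = np.linalg.eigh(X @ X.T)
    top_rows = np.sort((X ** 2).sum(axis=1))[::-1]
    print(eigvals[::-1][:3])                             # largest eigenvalues ...
    print(top_rows[:3])                                  # ... approximate the top row norms
    print(np.abs(eigvecs[:, -1]).max())                  # near 1: a localized eigenvector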