
Asymptotic Analysis for Extreme Eigenvalues of Principal Minors of Random Matrices

Added by Xiaoou Li
Publication date: 2019
Language: English





Consider a standard white Wishart matrix with parameters $n$ and $p$. Motivated by applications in high-dimensional statistics and signal processing, we perform asymptotic analysis on the maxima and minima of the eigenvalues of all the $m \times m$ principal minors, under the asymptotic regime that $n, p, m$ go to infinity. Asymptotic results concerning extreme eigenvalues of principal minors of real Wigner matrices are also obtained. In addition, we discuss an application of the theoretical results to the construction of compressed sensing matrices, which provides insights into compressed sensing in signal processing and high-dimensional linear regression in statistics.
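As a rough numerical illustration (not taken from the paper), the objects studied in the abstract can be computed directly for small sizes: draw white Wishart data, enumerate all $m \times m$ principal minors of the sample covariance by brute force, and record the extreme eigenvalues. The sizes `n, p, m` below are illustrative assumptions; the paper's regime lets all three grow to infinity.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, p, m = 200, 8, 3          # small, so the C(p, m) minors can be enumerated

X = rng.standard_normal((n, p))   # white Wishart data
W = X.T @ X / n                   # normalized sample covariance

# Scan every m x m principal minor W[S, S] with |S| = m.
eig_max, eig_min = -np.inf, np.inf
for S in itertools.combinations(range(p), m):
    vals = np.linalg.eigvalsh(W[np.ix_(S, S)])
    eig_max = max(eig_max, vals[-1])
    eig_min = min(eig_min, vals[0])

print(eig_max, eig_min)
```

By eigenvalue interlacing, the extremes over minors stay inside the spectrum of the full matrix $W$, so for $n \gg p$ both concentrate near 1; the asymptotics in the paper quantify their fluctuations as $n, p, m$ grow jointly.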



Related research

We provide some asymptotic theory for the largest eigenvalues of a sample covariance matrix of a p-dimensional time series where the dimension p = p_n converges to infinity as the sample size n increases. We give a short overview of the literature on the topic in both the light- and heavy-tailed cases, when the data have finite (infinite) fourth moment, respectively. Our main focus is on the heavy-tailed case. In this case, one has a theory for the point process of the normalized eigenvalues of the sample covariance matrix in the iid case, but also when rows and columns of the data are linearly dependent. We provide limit results for the weak convergence of these point processes to Poisson or cluster Poisson processes. Based on this convergence we can also derive the limit laws of various functionals of the ordered eigenvalues, such as the joint convergence of a finite number of the largest order statistics, the joint limit law of the largest eigenvalue and the trace, limit laws for successive ratios of ordered eigenvalues, etc. We also develop some limit theory for the singular values of the sample autocovariance matrices and their sums of squares. The theory is illustrated for simulated data and for the components of the S&P 500 stock index.
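A quick sketch (my illustration, not the authors' code) of the heavy-tailed setting described above: with regularly varying entries of tail index below 4, a few huge entries dominate the spectrum, and the largest eigenvalue is deterministically squeezed between the largest diagonal entry of $X^\top X$ and the trace. The tail index `alpha` and the sizes are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 300, 100
alpha = 1.5                      # tail index < 4: infinite fourth moment

# Symmetrized Pareto entries (heavy-tailed iid field).
X = rng.pareto(alpha, size=(n, p)) * rng.choice([-1.0, 1.0], size=(n, p))
S = X.T @ X                      # unnormalized sample covariance

lam_max = np.linalg.eigvalsh(S)[-1]
trace = np.trace(S)

# max diagonal entry <= lam_max <= trace holds for any PSD matrix;
# under heavy tails the two ends of this sandwich are typically close,
# reflecting the joint limit law of the largest eigenvalue and the trace.
print(np.max(np.diag(S)), lam_max, trace)
```

Rerunning with a light-tailed choice (e.g. `rng.standard_normal`) shows the contrast: the trace then spreads over many comparable eigenvalues instead of being dominated by the largest one.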
Recovering low-rank structures via eigenvector perturbation analysis is a common problem in statistical machine learning, arising in factor analysis, community detection, ranking, and matrix completion, among others. While a large variety of bounds are available for average errors between empirical and population statistics of eigenvectors, few results are tight for entrywise analyses, which are critical for a number of problems such as community detection. This paper investigates entrywise behaviors of eigenvectors for a large class of random matrices whose expectations are low-rank, which helps settle the conjecture in Abbe et al. (2014b) that the spectral algorithm achieves exact recovery in the stochastic block model without any trimming or cleaning steps. The key is a first-order approximation of eigenvectors under the $\ell_\infty$ norm: $$u_k \approx \frac{A u_k^*}{\lambda_k^*},$$ where $\{u_k\}$ and $\{u_k^*\}$ are eigenvectors of a random matrix $A$ and its expectation $\mathbb{E} A$, respectively. The fact that the approximation is both tight and linear in $A$ facilitates sharp comparisons between $u_k$ and $u_k^*$. In particular, it allows for comparing the signs of $u_k$ and $u_k^*$ even if $\| u_k - u_k^* \|_{\infty}$ is large. The results are further extended to perturbations of eigenspaces, yielding new $\ell_\infty$-type bounds for synchronization ($\mathbb{Z}_2$-spiked Wigner model) and noisy matrix completion.
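The first-order approximation $u_k \approx A u_k^* / \lambda_k^*$ can be checked numerically in a toy spiked Wigner model (a sketch under my own assumptions: rank-one expectation, signal strength `lam_star` well above the noise level):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
lam_star = 20.0                          # planted eigenvalue (assumed strong signal)
u_star = np.ones(n) / np.sqrt(n)         # planted unit eigenvector

# A = E[A] + Wigner noise, with E[A] = lam_star * u_star u_star^T.
H = rng.standard_normal((n, n))
W = (H + H.T) / np.sqrt(2 * n)           # symmetric noise, operator norm ~ 2
A = lam_star * np.outer(u_star, u_star) + W

vals, vecs = np.linalg.eigh(A)
u = vecs[:, -1]                          # top empirical eigenvector
u = u if u @ u_star > 0 else -u          # fix the sign ambiguity

approx = A @ u_star / lam_star           # linear first-order approximation
err_linf = np.max(np.abs(u - approx))    # entrywise approximation error
print(err_linf)
```

The point of the approximation being linear in $A$ is visible here: `approx` is an explicit function of the data, yet its entrywise distance to the true eigenvector is an order of magnitude smaller than the naive comparison `np.max(np.abs(u - u_star))`.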
Tobias Boege (2021)
We present an encoding of a polynomial system into vanishing and non-vanishing constraints on almost-principal minors of a symmetric, principally regular matrix, such that the solvability of the system over some field is equivalent to the satisfiability of the constraints over that field. This implies two complexity results about Gaussian conditional independence structures. First, all real algebraic numbers are necessary to construct inhabitants of non-empty Gaussian statistical models defined by conditional independence and dependence constraints. This gives a negative answer to a question of Petr Šimeček. Second, we prove that the implication problem for Gaussian CI is polynomial-time equivalent to the existential theory of the reals.
In this paper, we study the asymptotic behavior of the extreme eigenvalues and eigenvectors of the spiked covariance matrices, in the supercritical regime. Specifically, we derive the joint distribution of the extreme eigenvalues and the generalized components of their associated eigenvectors in this regime.
Piero Barone (2010)
Pencils of Hankel matrices whose elements have a joint Gaussian distribution with nonzero mean and non-identical covariance are considered. An approximation to the distribution of the squared modulus of their determinant is computed, which yields a closed-form approximation of the condensed density of the generalized eigenvalues of the pencils. Implications of this result for solving several moment problems are discussed, and some numerical examples are provided.
