
Tracy-Widom law for the extreme eigenvalues of large signal-plus-noise matrices

 Added by Guangming Pan
 Publication date 2020
Language: English





Let $\mathbf{Y} = \mathbf{R} + \mathbf{X}$ be an $M\times N$ matrix, where $\mathbf{R}$ is a rectangular diagonal matrix and $\mathbf{X}$ consists of i.i.d. entries. This is a signal-plus-noise type model. Its signal matrix may have full rank, a case rarely studied in the literature compared with the low-rank one. This paper studies the extreme eigenvalues of $\mathbf{Y}\mathbf{Y}^*$. We show that under the high-dimensional setting ($M/N \rightarrow c \in (0,1]$) and some regularity conditions on $\mathbf{R}$, the rescaled extreme eigenvalue converges in distribution to the Tracy-Widom distribution ($TW_1$).
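The model in the abstract is easy to simulate. Below is a minimal sketch (not the paper's code): we draw $\mathbf{Y} = \mathbf{R} + \mathbf{X}$ with a full-rank rectangular diagonal signal and i.i.d. Gaussian noise, then look at the extreme eigenvalues of $\mathbf{Y}\mathbf{Y}^*/N$. The dimensions and signal strengths are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 200, 400                      # M/N -> c = 0.5, inside (0, 1]
d = rng.uniform(0.5, 1.5, size=M)    # hypothetical full-rank signal strengths
R = np.zeros((M, N))
R[:, :M] = np.diag(d)                # rectangular diagonal signal matrix
X = rng.standard_normal((M, N))      # i.i.d. noise entries
Y = R + X
eigvals = np.linalg.eigvalsh(Y @ Y.T / N)  # spectrum of Y Y* (scaled by N)
print(f"largest eigenvalue:  {eigvals[-1]:.4f}")
print(f"smallest eigenvalue: {eigvals[0]:.4f}")
```

The Tracy-Widom fluctuation itself concerns the rescaled largest eigenvalue over many independent draws; this snippet only produces one realization of the spectrum.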





Qinwen Wang, Jianfeng Yao (2015)
Consider two $p$-variate populations, not necessarily Gaussian, with covariance matrices $\Sigma_1$ and $\Sigma_2$, respectively, and let $S_1$ and $S_2$ be the sample covariance matrices from samples of the populations with degrees of freedom $T$ and $n$, respectively. When the difference $\Delta$ between $\Sigma_1$ and $\Sigma_2$ is of small rank compared to $p$, $T$ and $n$, the Fisher matrix $F=S_2^{-1}S_1$ is called a {\em spiked Fisher matrix}. When $p$, $T$ and $n$ grow to infinity proportionally, we establish a phase transition for the extreme eigenvalues of $F$: when the eigenvalues of $\Delta$ ({\em spikes}) are above (or under) a critical value, the associated extreme eigenvalues of the Fisher matrix converge to some point outside the support of the global limit (LSD) of the other eigenvalues; otherwise, they converge to the edge points of the LSD. Furthermore, we derive central limit theorems for these extreme eigenvalues of the spiked Fisher matrix. The limiting distributions are found to be Gaussian if and only if the corresponding population spike eigenvalues in $\Delta$ are {\em simple}. Numerical examples are provided to demonstrate the finite-sample performance of the results. In addition to classical applications of a Fisher matrix in high-dimensional data analysis, we propose a new method for the detection of signals allowing an arbitrary covariance structure of the noise. Simulation experiments are conducted to illustrate the performance of this detector.
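The spiked Fisher matrix can be illustrated with a tiny simulation (a sketch under assumed parameters, not the authors' code): take $\Sigma_2 = I$ and $\Sigma_1 = I + \Delta$ with a rank-one spike, form $F = S_2^{-1} S_1$, and watch the top eigenvalue detach from the bulk when the spike is large. The spike strength and dimensions below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
p, T, n = 100, 400, 400
spike = 5.0                                   # hypothetical spike eigenvalue of Delta
Sigma1_sqrt = np.eye(p)
Sigma1_sqrt[0, 0] = np.sqrt(1.0 + spike)      # Sigma_1 = I + spike * e1 e1^T
Z1 = Sigma1_sqrt @ rng.standard_normal((p, T))
Z2 = rng.standard_normal((p, n))
S1 = Z1 @ Z1.T / T                            # sample covariance under Sigma_1
S2 = Z2 @ Z2.T / n                            # sample covariance under Sigma_2 = I
F = np.linalg.solve(S2, S1)                   # Fisher matrix F = S2^{-1} S1
eig = np.sort(np.linalg.eigvals(F).real)
print("top eigenvalue of F:", eig[-1])
```

Rerunning with `spike` below the critical value should pull the top eigenvalue back to the edge of the bulk spectrum.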
Zhigang Bao (2017)
In this paper, we study a high-dimensional random matrix model from nonparametric statistics called the Kendall rank correlation matrix, which is a natural multivariate extension of the Kendall rank correlation coefficient. We establish the Tracy-Widom law for its largest eigenvalue. It is the first Tracy-Widom law for a nonparametric random matrix model, and also the first Tracy-Widom law for a high-dimensional U-statistic.
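The Kendall rank correlation matrix from this abstract can be built directly. The sketch below is a numpy-only illustration (not the paper's code): `kendall_tau` implements the classical pairwise-sign definition of Kendall's tau, without tie correction (ties occur with probability 0 for continuous data), and the matrix collects all pairwise values. Sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 20
X = rng.standard_normal((n, p))

def kendall_tau(x, y):
    """Kendall's tau: average of sign(x_i - x_j) * sign(y_i - y_j) over pairs i < j."""
    sx = np.sign(x[:, None] - x[None, :])
    sy = np.sign(y[:, None] - y[None, :])
    return np.sum(np.triu(sx * sy, k=1)) / (n * (n - 1) / 2)

# Kendall rank correlation matrix: pairwise tau of the p columns
K = np.array([[kendall_tau(X[:, a], X[:, b]) for b in range(p)] for a in range(p)])
eig = np.linalg.eigvalsh(K)
print("largest eigenvalue of the Kendall matrix:", eig[-1])
```

The Tracy-Widom statement concerns the suitably rescaled largest eigenvalue in the proportional regime $p/n \to c$; this snippet only computes one realization of the matrix and its top eigenvalue.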
We consider general high-dimensional spiked sample covariance models and show that their leading sample spiked eigenvalues and their linear spectral statistics are asymptotically independent when the sample size and dimension are proportional to each other. As a byproduct, we also establish the central limit theorem of the leading sample spiked eigenvalues by removing the block diagonal assumption on the population covariance matrix, which is commonly needed in the literature. Moreover, we propose consistent estimators of the $L_4$ norm of the spiked population eigenvectors. Based on these results, we develop a new statistic to test the equality of two spiked population covariance matrices. Numerical studies show that the new test procedure is more powerful than some existing methods.
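A minimal spiked sample covariance simulation helps fix ideas (a sketch with assumed parameters, not the authors' procedure): with population covariance $I$ plus a single rank-one spike, the leading sample eigenvalue separates from the Marchenko-Pastur bulk edge $(1+\sqrt{p/n})^2$ once the spike exceeds the threshold $\sqrt{p/n}$. The spike strength and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 100, 400
spike = 4.0                               # hypothetical spike above the threshold sqrt(p/n)
c = p / n
Sigma_sqrt = np.eye(p)
Sigma_sqrt[0, 0] = np.sqrt(1.0 + spike)   # population covariance I + spike * e1 e1^T
X = Sigma_sqrt @ rng.standard_normal((p, n))
S = X @ X.T / n                           # sample covariance matrix
eig = np.linalg.eigvalsh(S)
bulk_edge = (1 + np.sqrt(c)) ** 2         # Marchenko-Pastur right edge
print(f"leading eigenvalue: {eig[-1]:.3f}, MP bulk edge: {bulk_edge:.3f}")
```

The asymptotic-independence and testing results of the abstract build on the joint behavior of this leading eigenvalue and linear spectral statistics of the remaining spectrum.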
Consider a standard white Wishart matrix with parameters $n$ and $p$. Motivated by applications in high-dimensional statistics and signal processing, we perform asymptotic analysis on the maxima and minima of the eigenvalues of all the $m \times m$ principal minors, under the asymptotic regime that $n,p,m$ go to infinity. Asymptotic results concerning extreme eigenvalues of principal minors of real Wigner matrices are also obtained. In addition, we discuss an application of the theoretical results to the construction of compressed sensing matrices, which provides insights to compressed sensing in signal processing and high dimensional linear regression in statistics.
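The quantity studied here, the extremes over all $m \times m$ principal minors, can be computed exhaustively when the sizes are tiny. The sketch below is illustrative only: the values of $n$, $p$, $m$ are chosen so the $\binom{p}{m}$ scan stays cheap, far from the asymptotic regime of the paper.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(3)
n, p, m = 50, 8, 3
X = rng.standard_normal((p, n))
W = X @ X.T / n                      # white Wishart matrix with parameters n and p

max_eig, min_eig = -np.inf, np.inf
for idx in combinations(range(p), m):
    minor = W[np.ix_(idx, idx)]      # one m x m principal minor
    vals = np.linalg.eigvalsh(minor)
    max_eig = max(max_eig, vals[-1])
    min_eig = min(min_eig, vals[0])
print(f"max over minors: {max_eig:.4f}, min over minors: {min_eig:.4f}")
```

For the compressed-sensing application, these extremes over minors control restricted-isometry-type behavior of submatrices.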
Chen Wang, Baisuo Jin, Z. D. Bai (2013)
The auto-cross covariance matrix is defined as \[\mathbf{M}_n=\frac{1}{2T}\sum_{j=1}^T\bigl(\mathbf{e}_j\mathbf{e}_{j+\tau}^*+\mathbf{e}_{j+\tau}\mathbf{e}_j^*\bigr),\] where the $\mathbf{e}_j$'s are $n$-dimensional vectors of independent standard complex components with common mean 0, variance $\sigma^2$, and uniformly bounded $(2+\eta)$th moments, and $\tau$ is the lag. Jin et al. [Ann. Appl. Probab. 24 (2014) 1199-1225] proved that the LSD of $\mathbf{M}_n$ exists uniquely and nonrandomly, and is independent of $\tau$ for all $\tau\ge 1$; in addition, they gave an analytic expression for the LSD. As a continuation of Jin et al. [Ann. Appl. Probab. 24 (2014) 1199-1225], this paper proves that under the condition of uniformly bounded fourth moments, in any closed interval outside the support of the LSD, with probability 1 there will be no eigenvalues of $\mathbf{M}_n$ for all large $n$. As a consequence of the main theorem, the limits of the largest and smallest eigenvalues of $\mathbf{M}_n$ are also obtained.
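The defining formula for $\mathbf{M}_n$ translates directly into code. This is a minimal sketch with illustrative sizes (not the authors' code): complex standard entries, lag $\tau = 1$, and the symmetrized sum from the display above.

```python
import numpy as np

rng = np.random.default_rng(4)
n, T, tau = 50, 200, 1
# e_1, ..., e_{T+tau}: n-dimensional vectors of i.i.d. standard complex entries
e = (rng.standard_normal((T + tau, n))
     + 1j * rng.standard_normal((T + tau, n))) / np.sqrt(2)

M = np.zeros((n, n), dtype=complex)
for j in range(T):
    # symmetrized lag-tau outer products, as in the definition of M_n
    M += np.outer(e[j], e[j + tau].conj()) + np.outer(e[j + tau], e[j].conj())
M /= 2 * T

eig = np.linalg.eigvalsh(M)          # M is Hermitian, so its eigenvalues are real
print(f"largest: {eig[-1]:.4f}, smallest: {eig[0]:.4f}")
```

The paper's result says that for large $n$ no eigenvalues fall in closed intervals outside the support of the LSD, so the printed extremes converge to the support's endpoints.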
