Consider an $N\times n$ random matrix $Y_n=(Y_{ij}^{n})$ whose entries are given by $Y_{ij}^{n}=\frac{\sigma(i/N,j/n)}{\sqrt{n}} X_{ij}^{n}$, the $X_{ij}^{n}$ being centered i.i.d. and $\sigma:[0,1]^2 \to (0,\infty)$ a continuous function called a variance profile. Consider now a deterministic $N\times n$ matrix $\Lambda_n=(\Lambda_{ij}^{n})$ whose off-diagonal entries are zero. Denote by $\Sigma_n$ the non-centered matrix $Y_n + \Lambda_n$. Then, under the assumptions that $\lim_{n\to \infty} \frac{N}{n} = c > 0$ and $$ \frac{1}{N} \sum_{i=1}^{N} \delta_{\left(\frac{i}{N},\, (\Lambda_{ii}^n)^2\right)} \xrightarrow[n\to \infty]{} H(dx,d\lambda), $$ where $H$ is a probability measure, it is proven that the empirical distribution of the eigenvalues of $\Sigma_n \Sigma_n^T$ converges almost surely in distribution to a non-random probability measure. This measure is characterized in terms of its Stieltjes transform, which is obtained with the help of an auxiliary system of equations. Results of this kind are of interest in the field of wireless communication.
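A minimal NumPy sketch of this construction (not from the paper; the profile $\sigma$ and the diagonal entries of $\Lambda_n$ below are illustrative choices) that samples the empirical spectral distribution of $\Sigma_n \Sigma_n^T$:

import numpy as np

rng = np.random.default_rng(0)
n = 2000
N = n // 2                              # so N/n = c = 1/2

sigma = lambda x, y: 1.0 + 0.5 * np.cos(2 * np.pi * x) * np.sin(2 * np.pi * y)
x = (np.arange(1, N + 1) / N)[:, None]  # grid of i/N values
y = (np.arange(1, n + 1) / n)[None, :]  # grid of j/n values

Y = sigma(x, y) * rng.standard_normal((N, n)) / np.sqrt(n)
Lam = np.zeros((N, n))
np.fill_diagonal(Lam, np.sqrt(np.arange(1, N + 1) / N))  # (Lambda_ii)^2 uniform on [0,1]

S = Y + Lam
evals = np.linalg.eigvalsh(S @ S.T)     # spectrum of Sigma_n Sigma_n^T
hist, edges = np.histogram(evals, bins=60, density=True)  # approximates the limit law

For large $n$ the histogram stabilizes, consistent with the almost sure convergence of the empirical spectral distribution stated above.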
Consider an $N\times n$ random matrix $Z_n=(Z^n_{j_1 j_2})$ whose individual entries are a realization of a properly rescaled stationary Gaussian random field. The purpose of this article is to study the limiting empirical distribution of the eigenvalues of Gram random matrices such as $Z_n Z_n^*$ and $(Z_n + A_n)(Z_n + A_n)^*$, where $A_n$ is a deterministic matrix satisfying appropriate assumptions, in the regime where $n\to \infty$ and $\frac{N}{n} \to c \in (0,\infty)$. The proof relies on related results for matrices with independent but not identically distributed entries and differs substantially from related works in the literature (Boutet de Monvel et al., Girko, etc.).
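A rough simulation sketch, assuming a moving-average construction of the field (one convenient way to produce a stationary Gaussian random field on the grid; the paper's model may differ):

import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
n = 1500
N = n // 2                                   # N/n -> c = 1/2

k = np.exp(-np.arange(5.0))                  # short-range moving-average kernel
kernel = np.outer(k, k)
kernel /= np.sqrt((kernel ** 2).sum())       # normalizes the field to unit variance

white = rng.standard_normal((N + 4, n + 4))  # i.i.d. Gaussian noise
Z = fftconvolve(white, kernel, mode="valid") / np.sqrt(n)   # stationary field, rescaled

evals = np.linalg.eigvalsh(Z @ Z.T)          # empirical eigenvalues of Z_n Z_n^*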
We prove the asymptotic independence of the empirical process $\alpha_n = \sqrt{n}(F_n - F)$ and the rescaled empirical distribution function $\beta_n = n\big(F_n(\tau+\frac{\cdot}{n}) - F_n(\tau)\big)$, where $F$ is an arbitrary cdf, differentiable at some point $\tau$, and $F_n$ is the corresponding empirical cdf. This seems rather counterintuitive, since, for every $n \in \mathbb{N}$, there is a deterministic correspondence between $\alpha_n$ and $\beta_n$. Precisely, we show that the pair $(\alpha_n,\beta_n)$ converges in law to a limit with independent components, namely a time-transformed Brownian bridge and a two-sided Poisson process. Since these processes have jumps, in particular if $F$ itself has jumps, the Skorokhod product space $D(\mathbb{R}) \times D(\mathbb{R})$ is the adequate setting for this convergence. We develop a short convergence theory for $D(\mathbb{R}) \times D(\mathbb{R})$ by establishing the classical principle, devised by Yu. V. Prokhorov, that finite-dimensional convergence and tightness imply weak convergence. Several tightness criteria are given. Finally, the convergence of the pair $(\alpha_n,\beta_n)$ implies convergence of each of its components; thus, in passing, we provide a thorough proof of these known convergence results in a very general setting. In fact, the condition that $F$ be differentiable in at least one point is only required for $\beta_n$ to converge and can be weakened further.
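A small Monte Carlo sketch of this independence (illustrative choices throughout: $F$ uniform on $[0,1]$, $\tau=0.5$, single evaluation points $t_0$, $s_0$; near-zero correlation is of course only consistent with, not a proof of, independence):

import numpy as np

rng = np.random.default_rng(2)
n, reps, tau, t0, s0 = 5000, 2000, 0.5, 0.3, 1.0

a = np.empty(reps)
b = np.empty(reps)
for r in range(reps):
    x = rng.uniform(size=n)
    Fn = lambda t: np.mean(x <= t)                 # empirical cdf
    a[r] = np.sqrt(n) * (Fn(t0) - t0)              # alpha_n(t0)
    b[r] = n * (Fn(tau + s0 / n) - Fn(tau))        # beta_n(s0): point count near tau

print(np.corrcoef(a, b)[0, 1])   # close to 0, consistent with asymptotic independence

Note that $\beta_n(s_0)$ is simply the number of sample points in $(\tau, \tau+s_0/n]$, which explains the Poisson limit.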
We study the rate of convergence of the Mallows distance between the empirical distribution of a sample and the underlying population. The surprising feature of our results is that the convergence rate is slower in the discrete case than in the absolutely continuous setting. We show how the hazard function plays a significant role in these calculations. As an application, we recall that the quantity studied provides an upper bound on the distance between the bootstrap distribution of a sample mean and its true sampling distribution. Moreover, the convenient properties of the Mallows metric yield a straightforward lower bound, and therefore a relatively precise description of the asymptotic performance of the bootstrap in this problem.
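A hedged numerical sketch contrasting the two regimes, via the one-dimensional quantile representation of the Mallows ($W_2$) distance; the uniform and Bernoulli examples are illustrative, not the paper's:

import numpy as np

rng = np.random.default_rng(3)

def mallows_uniform(n):
    # W_2 between the empirical cdf of U(0,1) samples and U(0,1), using
    # W_2^2 = int_0^1 (Fn^{-1}(u) - F^{-1}(u))^2 du, midpoint approximation
    x = np.sort(rng.uniform(size=n))
    u = (np.arange(n) + 0.5) / n
    return np.sqrt(np.mean((x - u) ** 2))

def mallows_bernoulli(n, p=0.5):
    # for a two-point law at {0,1}, the excess mass |phat - p| moves
    # distance 1, so W_2^2 = |phat - p|
    phat = rng.binomial(n, p) / n
    return np.sqrt(abs(phat - p))

for n in (10**2, 10**4, 10**6):
    print(n, np.mean([mallows_uniform(n) for _ in range(20)]),
             np.mean([mallows_bernoulli(n) for _ in range(20)]))

The continuous column decays roughly like $n^{-1/2}$ while the discrete one decays roughly like $n^{-1/4}$, illustrating the slower discrete rate mentioned above.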
We gather several results on the eigenvalues of the spatial sign covariance matrix of an elliptical distribution. It is shown that the eigenvalues are a one-to-one function of the eigenvalues of the shape matrix and that they are closer together than the latter. We further provide a one-dimensional integral representation of the eigenvalues, which facilitates their numerical computation.
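An illustrative Monte Carlo computation (a plain sampling approximation, not the paper's one-dimensional integral representation) for a Gaussian, hence elliptical, sample:

import numpy as np

rng = np.random.default_rng(4)
shape_eigs = np.array([4.0, 1.0, 0.25])            # eigenvalues of the shape matrix
X = rng.standard_normal((200000, 3)) * np.sqrt(shape_eigs)

S = X / np.linalg.norm(X, axis=1, keepdims=True)   # spatial signs x/|x|
sscm = S.T @ S / len(S)                            # spatial sign covariance matrix
print(np.linalg.eigvalsh(sscm))                    # eigenvalues sum to 1 and are
                                                   # less spread out than the
                                                   # normalized shape eigenvalues

Comparing the output with shape_eigs / shape_eigs.sum() illustrates the "closer together" statement above.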
We consider a sequence of independent identically distributed random samples from an absolutely continuous probability measure in one dimension with unbounded density. We establish a new rate of convergence of the $\infty$-Wasserstein distance between the empirical measure of the samples and the true distribution, which extends the previous convergence result of García Trillos and Slepčev to the case where the true distribution has an unbounded density.
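A brief numerical sketch, using the one-dimensional quantile formula $W_\infty(F_n,F)=\sup_u |F_n^{-1}(u)-F^{-1}(u)|$ and Beta(1/2, 1), whose density $x^{-1/2}/2$ is unbounded at $0$, as an illustrative law (my example, not the paper's):

import numpy as np

rng = np.random.default_rng(5)

def w_inf(n):
    x = np.sort(rng.beta(0.5, 1.0, size=n))
    Finv = lambda u: u ** 2                      # quantile function of Beta(1/2, 1)
    u = np.arange(n + 1) / n
    # Fn^{-1} is constant on each block ((i-1)/n, i/n], and F^{-1} is
    # monotone, so the supremum over a block is attained at an endpoint
    return max(np.max(np.abs(x - Finv(u[:-1]))),
               np.max(np.abs(x - Finv(u[1:]))))

for n in (10**3, 10**4, 10**5):
    print(n, np.mean([w_inf(n) for _ in range(20)]))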