In this paper, we provide a negative answer to a long-standing open problem on the compatibility of Spearman's rho matrices. Building on the equivalence of Spearman's rho matrices and linear correlation matrices established in the literature for dimensions up to 9, we show that the two classes are no longer equivalent in dimension 12 or higher. In particular, two characterization results connect this problem with the existence of a random vector satisfying certain linear projection restrictions.
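As a hedged illustration of the link between the two matrix classes (not the paper's construction; the helper name and the three-dimensional example are mine), the Python sketch below maps a linear correlation matrix of a multivariate normal vector to the corresponding Spearman's rho matrix via the classical bivariate-normal identity $\rho_S = (6/\pi)\arcsin(\rho/2)$; whether every valid Spearman's rho matrix can arise from some random vector is the compatibility question at stake.

    import numpy as np

    def spearman_from_pearson(R):
        # Bivariate-normal identity rho_S = (6/pi) * arcsin(rho / 2),
        # applied entrywise to a linear correlation matrix R.
        S = (6.0 / np.pi) * np.arcsin(np.asarray(R, dtype=float) / 2.0)
        np.fill_diagonal(S, 1.0)
        return S

    R = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, -0.3],
                  [0.2, -0.3, 1.0]])
    S = spearman_from_pearson(R)
    # S is symmetric with unit diagonal; whether an arbitrary matrix of this
    # form is attainable as a Spearman's rho matrix is the question above.
    print(S)
    print(np.all(np.linalg.eigvalsh(S) >= -1e-12))  # positive semidefinite here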
Consider the problem of estimating a low-rank matrix when its entries are perturbed by Gaussian noise. If the empirical distribution of the entries of the spikes is known, optimal estimators that exploit this knowledge can substantially outperform si
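A minimal sketch of the kind of setup being described, under illustrative assumptions of my own (a rank-one symmetric spiked model with Rademacher spike entries and a plain top-eigenvector estimate; this is not the paper's estimator):

    import numpy as np

    rng = np.random.default_rng(0)
    n, snr = 500, 3.0

    # Rank-one spike plus symmetric Gaussian noise with entries of variance 1/n
    x = rng.choice([-1.0, 1.0], size=n)      # spike entries (illustrative prior)
    W = rng.normal(size=(n, n))
    W = (W + W.T) / np.sqrt(2 * n)
    Y = (snr / n) * np.outer(x, x) + W

    # Plain spectral estimate: top eigenvector, rescaled to the spike's norm
    vals, vecs = np.linalg.eigh(Y)
    x_hat = vecs[:, -1] * np.sqrt(n)

    # Squared overlap with the truth; estimators exploiting the empirical
    # distribution of the spike entries aim to improve on this baseline.
    print((np.dot(x_hat, x) / n) ** 2)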
Chatterjee's (2021) ingenious approach to estimating a measure of dependence first proposed by Dette et al. (2013), based on simple rank statistics, has quickly attracted attention. This measure of dependence has the unusual property of being between 0 and
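For readers who want the statistic in hand, here is a short sketch of the rank coefficient as I understand it from Chatterjee (2021), restricted to the no-ties case (the exact normalization should be checked against the paper):

    import numpy as np

    def chatterjee_xi(x, y):
        # No-ties case: sort the pairs by x, rank the corresponding y's, then
        # xi = 1 - 3 * sum_i |r_{i+1} - r_i| / (n^2 - 1).
        x, y = np.asarray(x), np.asarray(y)
        n = len(x)
        order = np.argsort(x)                          # sort pairs by x
        ranks = np.argsort(np.argsort(y[order])) + 1   # ranks of y in that order
        return 1.0 - 3.0 * np.abs(np.diff(ranks)).sum() / (n**2 - 1)

    rng = np.random.default_rng(1)
    x = rng.normal(size=1000)
    print(chatterjee_xi(x, x**2))                  # strong non-monotone dependence: near 1
    print(chatterjee_xi(x, rng.normal(size=1000))) # independence: near 0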
Recovering low-rank structures via eigenvector perturbation analysis is a common problem in statistical machine learning, arising in factor analysis, community detection, ranking, and matrix completion, among other applications. While a large variety of bounds are a
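As a self-contained toy illustration of the flavor of such bounds (a Davis-Kahan-style comparison under assumptions chosen here, not the bounds developed in the paper):

    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 300, 2

    # Low-rank symmetric signal with a visible eigengap
    U, _ = np.linalg.qr(rng.normal(size=(n, k)))
    A = U @ np.diag([10.0, 6.0]) @ U.T

    # Symmetric noise perturbation
    E = rng.normal(size=(n, n)) * 0.1
    E = (E + E.T) / 2
    A_hat = A + E

    # Compare the leading-eigenvector error with a Davis-Kahan-style ratio ||E|| / gap
    w, V = np.linalg.eigh(A)
    w_hat, V_hat = np.linalg.eigh(A_hat)
    u, u_hat = V[:, -1], V_hat[:, -1]
    sin_theta = np.sqrt(1 - np.dot(u, u_hat) ** 2)   # sign-invariant angle
    gap = w[-1] - w[-2]
    print(sin_theta, np.linalg.norm(E, 2) / gap)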
Olkin [3] obtained a neat upper bound for the determinant of a correlation matrix. In this note, we present an extension and improvement of his result.
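Olkin's inequality itself is not reproduced here; as an orientation point only, the elementary benchmark in this setting is Hadamard's inequality, which for a $d\times d$ correlation matrix $R$ (positive semidefinite with unit diagonal) gives $\det R \le \prod_{i=1}^{d} r_{ii} = 1$, with equality (for positive definite $R$) exactly when $R$ is the identity.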
Consider a normal vector $\mathbf{z}=(\mathbf{x},\mathbf{y})$, consisting of two sub-vectors $\mathbf{x}$ and $\mathbf{y}$ with dimensions $p$ and $q$ respectively. With $n$ independent observations of $\mathbf{z}$ at hand, we study the correlation between
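The abstract is cut off before it specifies which correlation is studied; assuming, purely for illustration, that the object of interest is the sample canonical correlations between $\mathbf{x}$ and $\mathbf{y}$ (an assumption on my part), a minimal sketch is:

    import numpy as np

    rng = np.random.default_rng(3)
    n, p, q = 200, 3, 2

    # Simulate a normal vector z = (x, y) with one correlated cross-pair (illustrative)
    Sigma = np.eye(p + q)
    Sigma[0, p] = Sigma[p, 0] = 0.6
    z = rng.multivariate_normal(np.zeros(p + q), Sigma, size=n)
    X, Y = z[:, :p], z[:, p:]

    # Sample canonical correlations: singular values of Sxx^{-1/2} Sxy Syy^{-1/2}
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Sxx, Syy, Sxy = Xc.T @ Xc / n, Yc.T @ Yc / n, Xc.T @ Yc / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    canon_corr = np.linalg.svd(inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy), compute_uv=False)
    print(canon_corr)   # the largest value should be near the planted 0.6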