
A refined determinantal inequality for correlation matrices

Posted by: Niushan Gao
Publication date: 2019
Language: English





Olkin [3] obtained a neat upper bound for the determinant of a correlation matrix. In this note, we present an extension and improvement of his result.
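Olkin's exact bound is not restated in the abstract, but the classical baseline it sharpens follows from Hadamard's inequality: a correlation matrix $R$ has unit diagonal, so $\det(R)\le 1$. A minimal numerical check of that baseline (the random-correlation construction below is an illustration, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_correlation(d):
    """A correlation matrix: random Gram matrix rescaled to unit diagonal."""
    A = rng.standard_normal((d, d))
    S = A @ A.T                       # positive semidefinite
    s = np.sqrt(np.diag(S))
    return S / np.outer(s, s)         # unit diagonal

for d in (3, 5, 10):
    R = random_correlation(d)
    # Hadamard's inequality: det(R) <= product of diagonal entries = 1.
    assert np.linalg.det(R) <= 1 + 1e-12
    print(d, np.linalg.det(R))
```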




Read also

In this paper, we provide a negative answer to a long-standing open problem on the compatibility of Spearman's rho matrices. Following an equivalence of Spearman's rho matrices and linear correlation matrices for dimensions up to 9 in the literature, we show non-equivalence for dimensions 12 or higher. In particular, we connect this problem with the existence of a random vector under some linear projection restrictions in two characterization results.
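For bivariate normal data the two coefficients are linked by $\rho_S = \frac{6}{\pi}\arcsin(\rho/2)$, so compatibility asks whether the entrywise inverse map $\rho = 2\sin(\pi\rho_S/6)$ turns every valid Spearman's rho matrix into a positive semidefinite linear correlation matrix. A hedged sketch of that check (the $3\times 3$ example is arbitrary, not one of the paper's counterexamples, which require dimension 12 or higher):

```python
import numpy as np

def spearman_to_pearson(R_s):
    """Entrywise map 2*sin(pi*rho_s/6), exact for bivariate normal pairs."""
    return 2.0 * np.sin(np.pi * R_s / 6.0)

# An arbitrary Spearman's rho matrix (illustration only).
R_s = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])

R = spearman_to_pearson(R_s)          # note 2*sin(pi/6) = 1: diagonal stays 1
print(np.linalg.eigvalsh(R))          # all eigenvalues >= 0 => compatible here
```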
Minghua Lin (2014)
Let $T=\begin{bmatrix} X & Y \\ 0 & Z \end{bmatrix}$ be an $n$-square matrix, where $X, Z$ are $r$-square and $(n-r)$-square, respectively. Among other determinantal inequalities, it is proved that $\det(I_n+T^*T)\ge \det(I_r+X^*X)\cdot \det(I_{n-r}+Z^*Z)$, with equality if and only if $Y=0$.
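A quick numerical check of this inequality on random complex blocks (a sketch, not from the paper; the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 6, 2

def cgauss(shape):
    """Random complex Gaussian matrix."""
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

X, Y, Z = cgauss((r, r)), cgauss((r, n - r)), cgauss((n - r, n - r))
T = np.block([[X, Y], [np.zeros((n - r, r)), Z]])   # block upper triangular

lhs = np.linalg.det(np.eye(n) + T.conj().T @ T).real
rhs = (np.linalg.det(np.eye(r) + X.conj().T @ X) *
       np.linalg.det(np.eye(n - r) + Z.conj().T @ Z)).real
print(lhs >= rhs, lhs, rhs)          # equality would require Y = 0
```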
Song Xi Chen, Bin Guo, Yumou Qiu (2019)
We consider testing the equality of two high-dimensional covariance matrices by carrying out a multi-level thresholding procedure, which is designed to detect sparse and faint differences between the covariances. A novel U-statistic composition is developed to establish the asymptotic distribution of the thresholding statistics in conjunction with the matrix blocking and the coupling techniques. We propose a multi-thresholding test that is shown to be powerful in detecting sparse and weak differences between two covariance matrices. The test is shown to have an attractive detection boundary and to attain the optimal minimax rate in the signal strength under different regimes of high dimensionality and signal sparsity. Simulation studies are conducted to demonstrate the utility of the proposed test.
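The abstract describes the procedure rather than a formula; in spirit, the statistic standardizes the entrywise differences between the two sample covariance matrices and aggregates only the entries surviving each of several threshold levels. A rough sketch under that reading (the Gaussian plug-in variance and the threshold grid below are simplifications, not the paper's exact construction):

```python
import numpy as np

def multi_threshold_stat(X1, X2, levels=(1.0, 1.5, 2.0)):
    """Sum of squared standardized covariance differences above each threshold.
    A loose reading of multi-level thresholding, not the paper's statistic."""
    n1, n2 = X1.shape[0], X2.shape[0]
    S1, S2 = np.cov(X1, rowvar=False), np.cov(X2, rowvar=False)
    # Gaussian plug-in variance of each sample covariance entry.
    v1 = (np.outer(np.diag(S1), np.diag(S1)) + S1**2) / n1
    v2 = (np.outer(np.diag(S2), np.diag(S2)) + S2**2) / n2
    T = (S1 - S2) / np.sqrt(v1 + v2)          # standardized differences
    return {lam: float((T**2 * (np.abs(T) > lam)).sum()) for lam in levels}

rng = np.random.default_rng(2)
X1 = rng.standard_normal((200, 20))
X2 = rng.standard_normal((200, 20))
print(multi_threshold_stat(X1, X2))           # under H0 the sums stay small
```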
For some estimation and prediction problems, we solve minimization problems with asymmetric loss functions. Usually, the regression coefficients are estimated for these problems. In this paper, we do not make such an estimation; instead, we give a solution by correcting any predictions so that the prediction error follows a general normal distribution. With our method, we can not only minimize the expected value of the asymmetric loss but also lower the variance of the loss.
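A concrete instance of such a correction: under the pinball (quantile) loss at level $\tau$, the expected loss of a shifted prediction is minimized by shifting by the $\tau$-quantile of the error distribution, which is available in closed form when the errors are modeled as normal. A sketch under those illustrative assumptions (the loss choice and the plain normal error model are my substitutions, not necessarily the paper's "general normal distribution" setting):

```python
import numpy as np
from scipy.stats import norm

def pinball(err, tau):
    """Asymmetric pinball loss on errors err = y - prediction."""
    return np.mean(np.maximum(tau * err, (tau - 1.0) * err))

def optimal_shift(residuals, tau):
    """Optimal constant correction: the tau-quantile of a normal fit to the errors."""
    return norm.ppf(tau, loc=residuals.mean(), scale=residuals.std(ddof=1))

rng = np.random.default_rng(3)
resid = rng.normal(0.5, 2.0, size=10_000)     # prediction errors
tau = 0.8                                     # underprediction penalized 4x more
c = optimal_shift(resid, tau)
print(pinball(resid, tau), pinball(resid - c, tau))  # corrected loss is smaller
```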
Consider a standard white Wishart matrix with parameters $n$ and $p$. Motivated by applications in high-dimensional statistics and signal processing, we perform asymptotic analysis on the maxima and minima of the eigenvalues of all the $m \times m$ principal minors, under the asymptotic regime that $n, p, m$ go to infinity. Asymptotic results concerning extreme eigenvalues of principal minors of real Wigner matrices are also obtained. In addition, we discuss an application of the theoretical results to the construction of compressed sensing matrices, which provides insights into compressed sensing in signal processing and high-dimensional linear regression in statistics.
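A small simulation of the object under study, sketched in Python (the dimensions are kept tiny because the number of $m \times m$ principal minors grows combinatorially in $p$):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)
n, p, m = 50, 10, 3

# White Wishart matrix W = X^T X, with X an n x p matrix of iid N(0,1) entries.
X = rng.standard_normal((n, p))
W = X.T @ X

eig_max, eig_min = -np.inf, np.inf
for idx in combinations(range(p), m):          # all m x m principal minors
    vals = np.linalg.eigvalsh(W[np.ix_(idx, idx)])
    eig_max = max(eig_max, vals[-1])
    eig_min = min(eig_min, vals[0])

print(eig_max, eig_min)                        # extremes over all principal minors
```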