The estimation of high-dimensional precision matrices has been a central topic in statistical learning. However, because the number of parameters scales quadratically with the dimension $p$, many state-of-the-art methods do not scale well to problems with very large $p$. In this paper, we propose a highly efficient algorithm for precision matrix estimation via penalized quadratic loss functions. In the high-dimension, low-sample-size setting, the computational complexity of our algorithm is linear in both the sample size and the number of parameters. This complexity is optimal in the sense that it matches the cost of computing the sample covariance matrix itself. Numerical studies show that our algorithm is much more efficient than other state-of-the-art methods when the dimension $p$ is very large.
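To make the penalized quadratic loss concrete, below is a minimal NumPy sketch of a proximal-gradient (ISTA) solver for one common loss of this type, $\min_\Omega \frac{1}{2}\operatorname{tr}(\Omega \hat\Sigma \Omega) - \operatorname{tr}(\Omega) + \lambda \|\Omega\|_{1,\mathrm{off}}$, whose unpenalized minimizer is $\hat\Sigma^{-1}$. This specific loss, the function name quad_loss_precision, and all parameter choices are illustrative assumptions, not the paper's actual algorithm; the sketch only demonstrates how keeping the data matrix in factored form yields an $O(np^2)$ per-iteration cost, linear in the sample size $n$ and in the $p^2$ parameters.

import numpy as np

def soft_threshold(A, t):
    # Entrywise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def quad_loss_precision(X, lam, n_iter=500, tol=1e-6):
    # ISTA sketch for: min_Omega 0.5*tr(Omega S Omega) - tr(Omega)
    #                            + lam * ||off-diagonal of Omega||_1,
    # where S = X'X/n is the sample covariance of centered data X (n x p).
    n, p = X.shape
    Omega = np.eye(p)
    # Step size 1/L, where L = ||S||_2 is the Lipschitz constant of the
    # gradient, obtained from the largest singular value of X.
    step = n / (np.linalg.norm(X, 2) ** 2)
    I = np.eye(p)
    for _ in range(n_iter):
        # S @ Omega computed in factored form: O(n p^2), never O(p^3).
        SOmega = X.T @ (X @ Omega) / n
        grad = 0.5 * (SOmega + SOmega.T) - I  # gradient of the smooth part
        G = Omega - step * grad               # gradient descent step
        Omega_new = soft_threshold(G, step * lam)
        np.fill_diagonal(Omega_new, np.diag(G))  # diagonal left unpenalized
        if np.max(np.abs(Omega_new - Omega)) < tol:
            Omega = Omega_new
            break
        Omega = Omega_new
    return 0.5 * (Omega + Omega.T)  # symmetrize the final estimate

# Hypothetical usage: n = 200 samples in p = 500 dimensions.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 500))
X -= X.mean(axis=0)
Omega_hat = quad_loss_precision(X, lam=0.1)

Computing $\hat\Sigma\Omega$ as X.T @ (X @ Omega) / n, rather than forming the product with an explicit $p \times p$ covariance matrix, is what keeps each iteration at $O(np^2)$ when $n < p$; soft-thresholding is exactly the proximal operator of the $\ell_1$ penalty.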