The application of the lasso is espoused in high-dimensional settings where only a small number of the regression coefficients are believed to be nonzero. Moreover, statistical properties of high-dimensional lasso estimators are often proved under the assumption that the correlation between the predictors is bounded. In this vein, coordinatewise methods, the most common means of computing the lasso solution, work well in the presence of low to moderate multicollinearity. However, the computational speed of coordinatewise algorithms degrades as sparsity decreases and multicollinearity increases. Motivated by these limitations, we propose the novel Deterministic Bayesian Lasso algorithm for computing the lasso solution. This algorithm is developed by considering a limiting version of the Bayesian lasso. The performance of the Deterministic Bayesian Lasso improves as sparsity decreases and multicollinearity increases, and it can offer substantial gains in computational speed. A rigorous theoretical analysis demonstrates that (1) the Deterministic Bayesian Lasso algorithm converges to the lasso solution, and (2) it leads to a representation of the lasso estimator which shows how it achieves both $\ell_1$ and $\ell_2$ types of shrinkage simultaneously. Connections to other algorithms are also provided. The benefits of the Deterministic Bayesian Lasso algorithm are then illustrated on simulated and real data.
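To make the computational baseline concrete, the coordinatewise approach mentioned above can be sketched as cyclic coordinate descent with soft-thresholding. This is a minimal illustrative implementation, not the Deterministic Bayesian Lasso of the abstract; the objective scaling (1/2n) and the variable names are assumptions made for the sketch.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator, the building block of coordinatewise lasso."""
    return np.sign(z) * max(abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso objective
    (1/2n)||y - X beta||^2 + lam * ||beta||_1 (illustrative sketch)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature x_j'x_j / n
    resid = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            # correlation of x_j with the partial residual that excludes coordinate j
            rho = X[:, j] @ resid / n + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            resid += X[:, j] * (beta[j] - new_bj)  # keep residual in sync
            beta[j] = new_bj
    return beta

# Tiny sparse example: only the first two coefficients are truly nonzero.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(100)
beta_hat = lasso_cd(X, y, lam=0.1)
```

With nearly orthogonal columns, as here, each coordinate update is close to exact and the sweep converges quickly; it is precisely when columns are highly correlated that these one-at-a-time updates slow down, which is the regime the abstract targets.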
A robust estimator is proposed for the parameters that characterize the linear regression problem. It is based on the notion of shrinkage, often used in finance and previously studied for outlier detection in multivariate data. A thorough simulation
Among the most popular variable selection procedures in high-dimensional regression, Lasso provides a solution path to rank the variables and determines a cut-off position on the path to select variables and estimate coefficients. In this paper, we c
Comment on ``Gibbs Sampling, Exponential Families, and Orthogonal Polynomials'' [arXiv:0808.3852]
The issue of honesty in constructing confidence sets arises in nonparametric regression. While the optimal rate in nonparametric estimation can be achieved and utilized to construct sharp confidence sets, severe degradation of the confidence level often happ
Wavelet shrinkage estimators are widely applied in several fields of science to denoise data in the wavelet domain by reducing the magnitudes of empirical coefficients. In the nonparametric regression problem, most shrinkage rules are derived from
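The coefficient-shrinkage idea described above can be sketched with a one-level Haar transform and soft-thresholding of the detail coefficients. This is a generic illustration, not the specific shrinkage rules of the abstract; the noise level, the universal threshold $\sigma\sqrt{2\log n}$, and the test signal are assumptions made for the sketch.

```python
import numpy as np

def haar_level1(x):
    """One-level orthonormal Haar transform: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def inv_haar_level1(a, d):
    """Invert the one-level Haar transform."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft(d, t):
    """Soft-threshold shrinkage: reduce coefficient magnitudes by t, zeroing small ones."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

# Noisy smooth signal; shrinkage acts only on the detail coefficients.
rng = np.random.default_rng(1)
n = 256
signal = np.sin(np.linspace(0, 4 * np.pi, n))
noisy = signal + 0.3 * rng.standard_normal(n)
a, d = haar_level1(noisy)
t = 0.3 * np.sqrt(2 * np.log(n))   # universal threshold, assuming known sigma = 0.3
denoised = inv_haar_level1(a, soft(d, t))
```

Because the transform is orthonormal, the noise level carries over unchanged to the detail coefficients, where a smooth signal contributes little; shrinking those coefficients therefore removes mostly noise.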