This paper studies a fundamental problem in compressed sensing: the sparse recoverability of L1 minimization with an arbitrary sensing matrix. We develop a new accumulative score function (ASF) that provides a lower bound for the recoverable sparsity level (SL) of a sensing matrix while retaining low computational complexity. We first define a score function for each row of the matrix; ASF then sums the largest scores until the total reaches 0.5. Interestingly, the number of rows involved in this summation is a reliable lower bound on SL. We further prove that ASF provides a sharper bound on SL than coherence. We also investigate the underlying relationship between the new ASF and the classical RIC, and derive a RIC-based bound for SL.
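The accumulation step described in the abstract can be sketched as follows. The abstract does not specify how the per-row scores are computed, so the function below takes them as input; the name `asf_sparsity_bound` and the example scores are hypothetical.

```python
import numpy as np

def asf_sparsity_bound(scores):
    """Sketch of the accumulative score function (ASF) step.

    Sums the largest per-row scores until the running total reaches
    0.5; the number of rows used is reported as a lower bound on the
    recoverable sparsity level (SL). If the total never reaches 0.5,
    all rows are counted.
    """
    total = 0.0
    count = 0
    for s in sorted(scores, reverse=True):  # largest scores first
        total += s
        count += 1
        if total >= 0.5:
            break
    return count
```

For example, with scores `[0.3, 0.2, 0.1]` the two largest scores already sum to 0.5, so the returned lower bound is 2.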
We focus on the high-dimensional linear regression model $Y\sim\mathcal{N}(X\beta^{*},\sigma^{2}I_{n})$, where $\beta^{*}\in\mathds{R}^{p}$ is the parameter of interest. In this setting, several estimators such as the LASSO and the Dantzig Selector are known to s
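As an illustration of this model, the sketch below simulates data from $Y\sim\mathcal{N}(X\beta^{*},\sigma^{2}I_{n})$ with a sparse $\beta^{*}$ and recovers it with a LASSO fit via iterative soft-thresholding (ISTA). All dimensions, the noise level, and the regularization parameter are hypothetical choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 100, 200, 5          # hypothetical sample size, dimension, sparsity
X = rng.standard_normal((n, p)) / np.sqrt(n)  # roughly unit-norm columns
beta_star = np.zeros(p)
beta_star[:k] = 3.0            # sparse true parameter
y = X @ beta_star + 0.01 * rng.standard_normal(n)

def ista_lasso(X, y, lam, steps=2000):
    """Iterative soft-thresholding for the LASSO objective
    0.5*||y - X b||^2 + lam*||b||_1."""
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(steps):
        g = X.T @ (X @ b - y)                 # gradient of the quadratic part
        z = b - g / L                         # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b

beta_hat = ista_lasso(X, y, lam=0.1)
```

In this low-noise regime the LASSO estimate is close to $\beta^{*}$: the five true coefficients stay large while the remaining entries are shrunk toward zero.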
Let ${\bf R}$ be the Pearson correlation matrix of $m$ normal random variables. The Rao score test for the independence hypothesis $H_0 : {\bf R} = {\bf I}_m$, where ${\bf I}_m$ is the identity matrix of dimension $m$, was first considered by Schott (20
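Tests of $H_0 : {\bf R} = {\bf I}_m$ are typically built from the pairwise sample correlations. The sketch below computes one common building block, the sum of squared off-diagonal entries of the sample correlation matrix; it is an illustration only, not necessarily the exact statistic studied in this paper.

```python
import numpy as np

def sum_sq_offdiag_corr(X):
    """Sum of squared off-diagonal sample Pearson correlations,
    with each pair (i, j), i < j, counted once. X has observations
    in rows and the m variables in columns."""
    R = np.corrcoef(X, rowvar=False)   # m x m sample correlation matrix
    m = R.shape[0]
    off = R[~np.eye(m, dtype=bool)]    # off-diagonal entries (each pair twice)
    return 0.5 * np.sum(off ** 2)
```

Under $H_0$ this quantity concentrates near its null mean, while strong pairwise dependence inflates it; e.g., two identical columns give a single pairwise correlation of 1, so the statistic equals 1.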
This paper considers Bayesian multiple testing under sparsity for polynomial-tailed distributions satisfying a monotone likelihood ratio property. Included in this class of distributions are the Student's t, the Pareto, and many other distributions. W
In this paper, we obtain and compare maximum likelihood (ML) estimates based on upper record values and on a random sample. We then prove several theorems on the asymptotic behavior of these estimates.
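Upper record values are the observations in a sequence that exceed every earlier observation. As a minimal sketch of how they are extracted from a sample (the function name is hypothetical):

```python
import numpy as np

def upper_records(x):
    """Return the upper record values of a sequence: the first
    observation, then every observation strictly larger than all
    previous ones."""
    records = []
    current_max = -np.inf
    for v in x:
        if v > current_max:   # a new record
            records.append(v)
            current_max = v
    return np.array(records)
```

For example, the sequence `[1, 3, 2, 5, 4]` has upper record values `[1, 3, 5]`; ML estimation based on records uses only this subsequence rather than the full sample.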
This article studies global testing of the slope function in the functional linear regression model within the framework of reproducing kernel Hilbert spaces. We propose a new test statistic based on smoothness regularization estimators. The asymptotic dis