
Restricted Isometry Property under High Correlations

Publication date: 2019 · Language: English





Matrices satisfying the Restricted Isometry Property (RIP) play an important role in the areas of compressed sensing and statistical learning. RIP matrices with optimal parameters are mainly obtained via probabilistic arguments, as explicit constructions seem hard. It is therefore interesting to ask whether a fixed matrix can be incorporated into a construction of restricted isometries. In this paper, we construct a new broad ensemble of random matrices with dependent entries that satisfy the restricted isometry property. Our construction starts with a fixed (deterministic) matrix $X$ satisfying a simple stable rank condition, and we show that the matrix $XR$, where $R$ is a random matrix drawn from various popular probabilistic models (including subgaussian, sparse, low-randomness, and matrices satisfying the convex concentration property), satisfies the RIP with high probability. These theorems have various applications in signal recovery, random matrix theory, dimensionality reduction, and beyond. Additionally, motivated by an application to understanding the effectiveness of word vector embeddings popular in natural language processing and machine learning, we investigate the RIP of the matrix $XR^{(l)}$, where $R^{(l)}$ is formed by taking all possible (disregarding order) $l$-way entrywise products of the columns of a random matrix $R$.
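As a rough illustration of the shape of this construction (not the paper's proof), the sketch below forms $A = XR$ with a fixed $X$ having orthonormal rows, whose stable rank $\|X\|_F^2 / \|X\|_2^2$ then equals its number of rows, and a Rademacher (hence subgaussian) $R$, and samples random sparse unit vectors to see that $\|Av\|^2 \approx 1$. All dimensions, the scaling, and the choice of $X$ are illustrative assumptions; certifying the RIP exactly is intractable in general, so the loop below is only a random spot check, not a certificate.

```python
import numpy as np

rng = np.random.default_rng(0)
m, q, N, s = 120, 240, 2000, 8          # illustrative dimensions and sparsity

# Fixed X with orthonormal rows: stable rank ||X||_F^2 / ||X||_2^2 = m,
# i.e., as large as possible (any fixed X with large stable rank would do)
X = np.linalg.qr(rng.normal(size=(q, m)))[0].T

# Subgaussian (Rademacher) random matrix R; the candidate RIP matrix
# is A = XR, scaled so that E ||A v||^2 = ||v||^2 for every fixed v
R = rng.choice([-1.0, 1.0], size=(q, N))
A = X @ R / np.sqrt(m)

# Empirical near-isometry check on random s-sparse unit vectors
ratios = []
for _ in range(2000):
    v = np.zeros(N)
    v[rng.choice(N, size=s, replace=False)] = rng.normal(size=s)
    v /= np.linalg.norm(v)
    ratios.append(np.linalg.norm(A @ v) ** 2)
print(f"||Av||^2 over sparse v: [{min(ratios):.3f}, {max(ratios):.3f}]")
```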



Related research

This paper is concerned with the performance of Orthogonal Matching Pursuit (OMP) algorithms applied to a dictionary $\mathcal{D}$ in a Hilbert space $\mathcal{H}$. Given an element $f \in \mathcal{H}$, OMP generates a sequence of approximations $f_n$, $n=1,2,\dots$, each of which is a linear combination of $n$ dictionary elements chosen by a greedy criterion. It is studied whether the approximations $f_n$ are in some sense comparable to best $n$-term approximation from the dictionary. One important result related to this question is a theorem of Zhang [TZ] in the context of sparse recovery of finite-dimensional signals. This theorem shows that OMP exactly recovers an $n$-sparse signal whenever the dictionary $\mathcal{D}$ satisfies a Restricted Isometry Property (RIP) of order $An$ for some constant $A$, and that the procedure is also stable in $\ell^2$ under measurement noise. The main contribution of the present paper is to give a structurally simpler proof of Zhang's theorem, formulated in the general context of $n$-term approximation from a dictionary in arbitrary Hilbert spaces $\mathcal{H}$. Namely, it is shown that OMP generates near-best $n$-term approximations under a similar RIP condition.
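For concreteness, here is a minimal numpy sketch of OMP for a finite dictionary given as a column-normalized matrix $A$: greedily pick the column most correlated with the current residual, then re-fit on the selected support by least squares. The Gaussian dictionary and the exact-recovery demo are illustrative assumptions, not the paper's general Hilbert-space setting.

```python
import numpy as np

def omp(A, y, n_iter):
    """Greedy OMP: select the column most correlated with the residual,
    then re-fit by least squares on the selected support."""
    residual, support = y.copy(), []
    for _ in range(n_iter):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

# n-sparse exact-recovery demo with a random (near-RIP) dictionary
rng = np.random.default_rng(0)
m, N, s = 100, 400, 8
A = rng.normal(size=(m, N))
A /= np.linalg.norm(A, axis=0)          # unit-norm dictionary elements
x0 = np.zeros(N)
x0[rng.choice(N, size=s, replace=False)] = rng.normal(size=s)
x_hat = omp(A, A @ x0, s)
print(np.linalg.norm(x_hat - x0))       # near zero on most draws
```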
Akshay Soni, Yashar Mehdad (2017)
The multilabel learning problem with large numbers of labels, features, and data points has generated tremendous interest recently. A recurring theme in these problems is that only a few labels are active in any given data point, compared to the total number of labels. However, only a small amount of existing work takes direct advantage of this inherent extreme sparsity in the label space. By virtue of the Restricted Isometry Property (RIP), satisfied by many random ensembles, we propose a novel procedure for multilabel learning known as RIPML. During the training phase, RIPML projects the labels onto a random low-dimensional subspace and then solves a least-squares problem in this subspace. Inference is done by a k-nearest-neighbor (kNN) based approach. We demonstrate the effectiveness of RIPML by conducting extensive simulations and comparing results with state-of-the-art linear dimensionality-reduction based approaches.
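A minimal sketch of the two phases as described, with a Gaussian projection standing in for the random ensemble; all sizes, the synthetic data, and the neighbor-voting decoder are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, L, d, k = 500, 20, 1000, 50, 5   # samples, features, labels, proj. dim, neighbors

X = rng.normal(size=(n, p))                          # feature matrix
Y = (rng.random(size=(n, L)) < 0.01).astype(float)   # extremely sparse 0/1 labels

# Training: project labels onto a random low-dimensional subspace
# (Gaussian ensembles satisfy the RIP with high probability), then
# solve a least-squares problem in that subspace
P = rng.normal(size=(L, d)) / np.sqrt(d)
Z = Y @ P
W, *_ = np.linalg.lstsq(X, Z, rcond=None)

# Inference: regress a test point into the projected label space, then
# vote over the labels of its k nearest training neighbors there
x_test = rng.normal(size=(1, p))
z_hat = x_test @ W
nbrs = np.argsort(np.linalg.norm(Z - z_hat, axis=1))[:k]
y_pred = (Y[nbrs].mean(axis=0) > 0.5).astype(int)
```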
Simone Brugiapaglia (2018)
We propose a compressive spectral collocation method for the numerical approximation of Partial Differential Equations (PDEs). The approach is based on a spectral Sturm-Liouville approximation of the solution and on the collocation of the PDE in strong form at randomized points, taking advantage of the compressive sensing principle. The proposed approach uses a number of collocation points substantially smaller than the number of basis functions when the solution to be recovered is sparse or compressible. Focusing on the case of the diffusion equation, we prove that, under suitable assumptions on the diffusion coefficient, the matrix associated with the compressive spectral collocation approach satisfies the restricted isometry property of compressive sensing with high probability. Moreover, through numerical illustrations we demonstrate the ability of the proposed method to reduce the computational cost of the corresponding full spectral collocation approach while preserving good accuracy.
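As a toy stand-in for this pipeline (not the paper's Sturm-Liouville setting), the sketch below collocates the 1D Poisson problem $-u'' = f$ on $(0,1)$ at random points in a Fourier sine basis, which diagonalizes $-d^2/dx^2$, and recovers a sparse coefficient vector via orthogonal matching pursuit. The basis, sizes, and solver are assumptions for illustration, and the column normalization is a crude surrogate for the preconditioning such schemes typically require.

```python
import numpy as np
from sklearn.linear_model import orthogonal_mp

rng = np.random.default_rng(0)
K, m, s = 200, 60, 4          # basis size, collocation points, sparsity

# Sparse ground truth u(x) = sum_k c_k sin(k*pi*x); the sine basis
# diagonalizes -d^2/dx^2 with eigenvalues (k*pi)^2
c_true = np.zeros(K)
c_true[rng.choice(K, size=s, replace=False)] = rng.normal(size=s)

# Collocate -u'' = f at m random points: A[j, k] = (k*pi)^2 sin(k*pi*x_j)
x = rng.random(m)
k = np.arange(1, K + 1)
A = (k * np.pi) ** 2 * np.sin(np.outer(x, k * np.pi))
f = A @ c_true                # right-hand side sampled at the same points

# Normalize columns before OMP (their norms grow like k^2), then rescale
norms = np.linalg.norm(A, axis=0)
c_hat = orthogonal_mp(A / norms, f, n_nonzero_coefs=s) / norms
print(np.linalg.norm(c_hat - c_true))   # small when recovery succeeds
```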
When the linear measurements of an instance of low-rank matrix recovery satisfy a restricted isometry property (RIP), i.e., they are approximately norm-preserving, the problem is known to contain no spurious local minima, so exact recovery is guaranteed. In this paper, we show that moderate RIP is not enough to eliminate spurious local minima, so existing results can only hold for near-perfect RIP. In fact, counterexamples are ubiquitous: we prove that every $x$ is the spurious local minimum of a rank-1 instance of matrix recovery that satisfies RIP. One specific counterexample has RIP constant $\delta = 1/2$, but causes randomly initialized stochastic gradient descent (SGD) to fail 12% of the time. SGD is frequently able to avoid and escape spurious local minima, but this empirical result shows that it can occasionally be defeated by their existence. Hence, while exact recovery guarantees will likely require a proof of no spurious local minima, arguments based solely on norm preservation will only be applicable to a narrow set of nearly isotropic instances.
Nonconvex matrix recovery is known to contain no spurious local minima under a restricted isometry property (RIP) with a sufficiently small RIP constant $\delta$. If $\delta$ is too large, however, then counterexamples containing spurious local minima are known to exist. In this paper, we introduce a proof technique that is capable of establishing sharp thresholds on $\delta$ to guarantee the nonexistence of spurious local minima. Using the technique, we prove that in the case of a rank-1 ground truth, an RIP constant of $\delta < 1/2$ is both necessary and sufficient for exact recovery from any arbitrary initial point (such as a random point). We also prove a local recovery result: given an initial point $x_0$ satisfying $f(x_0) \le (1-\delta)^2 f(0)$, any descent algorithm that converges to second-order optimality guarantees exact recovery.
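To make the objective concrete, here is a small sketch of rank-1 nonconvex matrix recovery, $f(x) = \frac{1}{m}\sum_i (\langle A_i, xx^T\rangle - b_i)^2$, with symmetrized Gaussian measurements (such ensembles satisfy an RIP over low-rank matrices with high probability once $m$ is large enough), minimized by plain gradient descent from a random initial point. The dimensions, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 10, 200                    # dimension and number of measurements

# Symmetrized Gaussian measurement matrices A_i; for this ensemble
# E f(x) = ||xx^T - zz^T||_F^2, and an RIP holds w.h.p. for large m
As = rng.normal(size=(m, n, n))
As = (As + As.transpose(0, 2, 1)) / 2

z = rng.normal(size=n)
z /= np.linalg.norm(z)            # unit-norm rank-1 ground truth zz^T
b = np.einsum('mij,i,j->m', As, z, z)

def grad(x):
    """Gradient of f(x) = (1/m) * sum_i (<A_i, xx^T> - b_i)^2."""
    r = np.einsum('mij,i,j->m', As, x, x) - b
    return 4 * np.einsum('m,mij,j->i', r, As, x) / m

x = rng.normal(size=n)
x /= np.linalg.norm(x)            # random initial point
for _ in range(5000):
    x -= 0.02 * grad(x)

# Recovery is up to the sign ambiguity x -> -x; small on most draws
print(min(np.linalg.norm(x - z), np.linalg.norm(x + z)))
```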
