We study support recovery for a $k \times k$ principal submatrix with elevated mean $\lambda/N$, hidden in an $N \times N$ symmetric mean-zero Gaussian matrix. Here $\lambda>0$ is a universal constant, and we assume $k = N\rho$ for some constant $\rho \in (0,1)$. We establish that there exists a constant $C>0$ such that the MLE recovers a constant proportion of the hidden submatrix if $\lambda \geq C \sqrt{\frac{1}{\rho} \log \frac{1}{\rho}}$, while such recovery is information theoretically impossible if $\lambda = o\big( \sqrt{\frac{1}{\rho} \log \frac{1}{\rho}} \big)$. The MLE is computationally intractable in general, and in fact, for $\rho>0$ sufficiently small, this problem is conjectured to exhibit a \emph{statistical-computational gap}. To provide rigorous evidence for this, we study the likelihood landscape for this problem, and establish that for some $\varepsilon>0$ and $\sqrt{\frac{1}{\rho} \log \frac{1}{\rho}} \ll \lambda \ll \frac{1}{\rho^{1/2 + \varepsilon}}$, the problem exhibits a variant of the \emph{Overlap-Gap-Property (OGP)}. As a direct consequence, we establish that a family of local MCMC-based algorithms does not achieve optimal recovery. Finally, we establish that for $\lambda > 1/\rho$, a simple spectral method recovers a constant proportion of the hidden submatrix.
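As a rough illustration of the spectral regime in the last sentence, the following minimal Python sketch simulates a planted principal-submatrix instance and attempts recovery from the top eigenvector. The normalization (noise entries of variance $1/N$) and the thresholding step are assumptions for illustration, not the paper's exact model or procedure.

    import numpy as np

    # Illustrative sketch: planted k x k principal submatrix in a symmetric Gaussian matrix.
    # Assumed normalization: noise entries ~ N(0, 1/N); signal adds lambda/N on the block.
    rng = np.random.default_rng(0)
    N, rho, lam = 2000, 0.05, 60.0      # lam > 1/rho = 20, i.e. the spectral regime
    k = int(rho * N)

    # Symmetric Gaussian noise with entry variance ~ 1/N
    G = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
    W = (G + G.T) / np.sqrt(2)

    # Hidden support: first k indices (a random subset in general)
    support = np.zeros(N, dtype=bool)
    support[:k] = True
    Y = W + (lam / N) * np.outer(support, support)

    # Spectral recovery: top eigenvector, keep its k largest coordinates
    eigvals, eigvecs = np.linalg.eigh(Y)
    v = eigvecs[:, -1]
    v = v * np.sign(v.sum())            # resolve the global sign ambiguity
    estimate = np.zeros(N, dtype=bool)
    estimate[np.argsort(-v)[:k]] = True

    print("fraction of hidden support recovered:", (estimate & support).sum() / k)

With these parameters the rank-one signal has operator norm roughly $\lambda\rho = 3$, above the noise edge of about $2$, so the top eigenvector correlates with the hidden support and the thresholding recovers a constant proportion of it.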
We study a variant of the sparse PCA (principal component analysis) problem in the hard regime, where the inference task is possible yet no polynomial-time algorithm is known to exist. Prior work, based on the low-degree likelihood ratio, has conject
We present AMPS, an augmented matrix approach to update the solution to a linear system of equations when the matrix is modified by a few elements within a principal submatrix. This problem arises in the dynamic security analysis of a power grid, whe
For some Maltsev conditions $\Sigma$ it is enough to check whether a finite algebra $\mathbf A$ satisfies $\Sigma$ locally on subsets of bounded size, in order to decide whether $\mathbf A$ satisfies $\Sigma$ (globally). This local-global property is the main
Data analyses based on linear methods constitute the simplest, most robust, and transparent approaches to the automatic processing of large amounts of data for building supervised or unsupervised machine learning models. Principal covariates regressi
We present an encoding of a polynomial system into vanishing and non-vanishing constraints on almost-principal minors of a symmetric, principally regular matrix, such that the solvability of the system over some field is equivalent to the satisfiabil