
Quantum-inspired sublinear algorithm for solving low-rank semidefinite programming

Published by: Tongyang Li
Publication date: 2019
Research field: Informatics Engineering
Paper language: English





Semidefinite programming (SDP) is a central topic in mathematical optimization with extensive studies on its efficient solvers. In this paper, we present a proof-of-principle sublinear-time algorithm for solving SDPs with low-rank constraints; specifically, given an SDP with $m$ constraint matrices, each of dimension $n$ and rank $r$, our algorithm can compute any entry and efficient descriptions of the spectral decomposition of the solution matrix. The algorithm runs in time $O(m\cdot\mathrm{poly}(\log n, r, 1/\varepsilon))$ given access to a sampling-based low-overhead data structure for the constraint matrices, where $\varepsilon$ is the precision of the solution. In addition, we apply our algorithm to a quantum state learning task as an application. Technically, our approach aligns with 1) SDP solvers based on the matrix multiplicative weight (MMW) framework by Arora and Kale [TOC 12]; 2) sampling-based dequantizing framework pioneered by Tang [STOC 19]. In order to compute the matrix exponential required in the MMW framework, we introduce two new techniques that may be of independent interest:
$\bullet$ Weighted sampling: assuming sampling access to each individual constraint matrix $A_{1},\ldots,A_{\tau}$, we propose a procedure that gives a good approximation of $A=A_{1}+\cdots+A_{\tau}$.
$\bullet$ Symmetric approximation: we propose a sampling procedure that gives the \emph{spectral decomposition} of a low-rank Hermitian matrix $A$. To the best of our knowledge, this is the first sampling-based algorithm for spectral decomposition, as previous works only give singular values and vectors.
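
The "sampling-based low-overhead data structure" assumed above is, throughout the dequantization literature, one supporting $\ell^2$-norm sampling: draw a row index with probability proportional to its squared norm, draw an entry within a row with probability proportional to its squared value, and query single entries. The sketch below is a minimal dense illustration of that access model (class and method names are ours, not the paper's; the real structure is a binary tree with logarithmic-time samples and updates):

```python
import numpy as np

class L2SampleAccess:
    """Minimal dense illustration of l2-norm sampling access to a matrix A.
    The data structure used in dequantized algorithms is a binary tree giving
    O(log n)-time samples and updates; this sketch only shows the
    distributions being sampled from."""

    def __init__(self, A):
        self.A = np.asarray(A, dtype=float)
        self.row_norms_sq = np.sum(self.A ** 2, axis=1)  # ||A_i||^2 per row
        self.frob_sq = self.row_norms_sq.sum()           # ||A||_F^2

    def sample_row(self, rng):
        """Row index i with probability ||A_i||^2 / ||A||_F^2."""
        return rng.choice(self.A.shape[0], p=self.row_norms_sq / self.frob_sq)

    def sample_entry_in_row(self, i, rng):
        """Column index j with probability A_ij^2 / ||A_i||^2."""
        return rng.choice(self.A.shape[1], p=self.A[i] ** 2 / self.row_norms_sq[i])

    def query(self, i, j):
        """Read a single entry A_ij."""
        return self.A[i, j]

rng = np.random.default_rng(0)
access = L2SampleAccess(rng.standard_normal((6, 4)))
i = access.sample_row(rng)
j = access.sample_entry_in_row(i, rng)
print(i, j, access.query(i, j))
```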




Read also

We present classical sublinear-time algorithms for solving low-rank linear systems of equations. Our algorithms are inspired by the HHL quantum algorithm for solving linear systems and the recent breakthrough by Tang of dequantizing the quantum algorithm for recommendation systems. Let $A \in \mathbb{C}^{m \times n}$ be a rank-$k$ matrix, and $b \in \mathbb{C}^m$ be a vector. We present two algorithms: a sampling algorithm that provides a sample from $A^{-1}b$ and a query algorithm that outputs an estimate of an entry of $A^{-1}b$, where $A^{-1}$ denotes the Moore-Penrose pseudo-inverse. Both of our algorithms have query and time complexity $O(\mathrm{poly}(k, \kappa, \|A\|_F, 1/\epsilon)\,\mathrm{polylog}(m, n))$, where $\kappa$ is the condition number of $A$ and $\epsilon$ is the precision parameter. Note that the algorithms we consider are sublinear time, so they cannot write and read the whole matrix or vectors. In this paper, we assume that $A$ and $b$ come with well-known low-overhead data structures such that entries of $A$ and $b$ can be sampled according to some natural probability distributions. Alternatively, when $A$ is positive semidefinite, our algorithms can be adapted so that the sampling assumption on $b$ is not required.
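
For intuition about what these algorithms output, the target quantity is an entry (or a sample) of $x = A^{+}b$. The snippet below is only a dense reference computation of that ground truth, not the sublinear algorithm itself, which never reads all of $A$ or $b$:

```python
import numpy as np

# Ground-truth reference for the quantity the sublinear algorithms estimate:
# an entry of x = A^+ b (Moore-Penrose pseudo-inverse solution).  The actual
# algorithms only use l2-norm sampling access to A and b and never form A^+.
rng = np.random.default_rng(1)
m, n, k = 200, 150, 5
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))  # rank-k matrix
b = A @ rng.standard_normal(n)                                 # consistent right-hand side

x = np.linalg.pinv(A) @ b   # x = A^+ b
print(x[0])                 # "query access": read one entry of the solution
```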
Semidefinite Programming (SDP) is a class of convex optimization programs with vast applications in control theory, quantum information, combinatorial optimization and operational research. Noisy intermediate-scale quantum (NISQ) algorithms aim to make an efficient use of the current generation of quantum hardware. However, optimizing variational quantum algorithms is a challenge as it is an NP-hard problem that in general requires an exponential time to solve and can contain many far from optimal local minima. Here, we present a current term NISQ algorithm for SDP. The classical optimization program of our NISQ solver is another SDP over a smaller dimensional ansatz space. We harness the SDP based formulation of the Hamiltonian ground state problem to design a NISQ eigensolver. Unlike variational quantum eigensolvers, the classical optimization program of our eigensolver is convex, can be solved in polynomial time with the number of ansatz parameters and every local minimum is a global minimum. Further, we demonstrate the potential of our NISQ SDP solver by finding the largest eigenvalue of up to $2^{1000}$ dimensional matrices and solving graph problems related to quantum contextuality. We also discuss NISQ algorithms for rank-constrained SDPs. Our work extends the application of NISQ computers onto one of the most successful algorithmic frameworks of the past few decades.
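
For reference, the generic problem class discussed here and throughout this page, minimize $\langle C, X\rangle$ subject to $\langle A_i, X\rangle = b_i$ and $X \succeq 0$, can be stated in a few lines with a standard modeling tool. The snippet below (CVXPY, with randomly generated but feasible data) only illustrates that formulation via a classical solve; it is not the NISQ ansatz-space method of the paper:

```python
import numpy as np
import cvxpy as cp

# A small generic SDP:  minimize <C, X>  s.t.  <A_i, X> = b_i,  X PSD.
# Illustrates the problem class only; not the NISQ solver described above.
rng = np.random.default_rng(2)
n, m = 4, 3
M = rng.standard_normal((n, n))
C = M @ M.T                                    # PSD objective => bounded below
As = [(lambda B: (B + B.T) / 2)(rng.standard_normal((n, n))) for _ in range(m)]
X0 = rng.standard_normal((n, n))
X0 = X0 @ X0.T                                 # a feasible PSD point
b = np.array([np.trace(As[i] @ X0) for i in range(m)])

X = cp.Variable((n, n), PSD=True)
constraints = [cp.trace(As[i] @ X) == b[i] for i in range(m)]
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
prob.solve()
print(prob.status, prob.value)
```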
We present an algorithmic framework for quantum-inspired classical algorithms on close-to-low-rank matrices, generalizing the series of results started by Tang's breakthrough quantum-inspired algorithm for recommendation systems [STOC19]. Motivated by quantum linear algebra algorithms and the quantum singular value transformation (SVT) framework of Gilyen et al. [STOC19], we develop classical algorithms for SVT that run in time independent of input dimension, under suitable quantum-inspired sampling assumptions. Our results give compelling evidence that in the corresponding QRAM data structure input model, quantum SVT does not yield exponential quantum speedups. Since the quantum SVT framework generalizes essentially all known techniques for quantum linear algebra, our results, combined with sampling lemmas from previous work, suffice to generalize all recent results about dequantizing quantum machine learning algorithms. In particular, our classical SVT framework recovers and often improves the dequantization results on recommendation systems, principal component analysis, supervised clustering, support vector machines, low-rank regression, and semidefinite program solving. We also give additional dequantization results on low-rank Hamiltonian simulation and discriminant analysis. Our improvements come from identifying the key feature of the quantum-inspired input model that is at the core of all prior quantum-inspired results: $\ell^2$-norm sampling can approximate matrix products in time independent of their dimension. We reduce all our main results to this fact, making our exposition concise, self-contained, and intuitive.
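
The "key feature" identified at the end, that $\ell^2$-norm sampling approximates matrix products in dimension-independent time, is the standard importance-sampling estimator sketched below (a generic illustration of that single fact, not of the full quantum-inspired SVT framework):

```python
import numpy as np

def approx_matmul(A, B, s, rng):
    """Estimate A.T @ B from s sampled rows of A (and the matching rows of B),
    drawn with probability proportional to the squared row norms of A.
    The estimator is unbiased, with expected squared Frobenius error at most
    ||A||_F^2 ||B||_F^2 / s, independent of the matrix dimensions."""
    row_norms_sq = np.sum(A ** 2, axis=1)
    p = row_norms_sq / row_norms_sq.sum()
    idx = rng.choice(A.shape[0], size=s, p=p)
    est = np.zeros((A.shape[1], B.shape[1]))
    for i in idx:
        est += np.outer(A[i], B[i]) / (s * p[i])
    return est

rng = np.random.default_rng(3)
A = rng.standard_normal((5000, 20))
B = rng.standard_normal((5000, 30))
approx = approx_matmul(A, B, s=500, rng=rng)
exact = A.T @ B
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```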
In this paper, we show that the bundle method can be applied to solve semidefinite programming problems with a low rank solution without ever constructing a full matrix. To accomplish this, we use recent results from randomly sketching matrix optimization problems and from the analysis of bundle methods. Under strong duality and strict complementarity of SDP, our algorithm produces primal and dual sequences converging in feasibility at a rate of $\tilde{O}(1/\epsilon)$ and in optimality at a rate of $\tilde{O}(1/\epsilon^2)$. Moreover, our algorithm outputs a low rank representation of its approximate solution with distance to the optimal solution at most $O(\sqrt{\epsilon})$ within $\tilde{O}(1/\epsilon^2)$ iterations.
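
The "randomly sketching" ingredient can be illustrated with a Nystrom-type sketch of a low-rank PSD matrix: store only $S = X\Omega$ for a thin random $\Omega$ and reconstruct $\hat{X} = S(\Omega^{\top}S)^{+}S^{\top}$. This is a generic example of representing an SDP iterate without forming the full matrix, not the specific sketch/bundle combination analyzed in the paper:

```python
import numpy as np

# Nystrom-style sketch of a PSD matrix X: keep only S = X @ Omega for a thin
# random Omega, and reconstruct X_hat = S (Omega^T S)^+ S^T.  A generic sketching
# illustration; the paper's bundle method adds further machinery on top.
rng = np.random.default_rng(4)
n, r, sketch_size = 300, 5, 10
G = rng.standard_normal((n, r))
X = G @ G.T                                    # rank-r PSD matrix (never stored in the real method)

Omega = rng.standard_normal((n, sketch_size))  # random test matrix
S = X @ Omega                                  # the sketch (all that is stored)
X_hat = S @ np.linalg.pinv(Omega.T @ S) @ S.T  # Nystrom reconstruction

print(np.linalg.norm(X_hat - X) / np.linalg.norm(X))  # ~0 when sketch_size >= r
```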
Semidefinite programs (SDPs) are a fundamental class of optimization problems with important recent applications in approximation algorithms, quantum complexity, robust learning, algorithmic rounding, and adversarial deep learning. This paper presents a faster interior point method to solve generic SDPs with variable size $n \times n$ and $m$ constraints in time $\widetilde{O}(\sqrt{n}(mn^2 + m^{\omega} + n^{\omega})\log(1/\epsilon))$, where $\omega$ is the exponent of matrix multiplication and $\epsilon$ is the relative accuracy. In the predominant case of $m \geq n$, our runtime outperforms that of the previous fastest SDP solver, which is based on the cutting plane method of Jiang, Lee, Song, and Wong [JLSW20]. Our algorithm's runtime can be naturally interpreted as follows: $\widetilde{O}(\sqrt{n}\log(1/\epsilon))$ is the number of iterations needed for our interior point method, $mn^2$ is the input size, and $m^{\omega} + n^{\omega}$ is the time to invert the Hessian and slack matrix in each iteration. These constitute natural barriers to further improving the runtime of interior point methods for solving generic SDPs.
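
As a worked instance of this bound (ours, not a claim from the paper): in the square case $m = \Theta(n)$ the runtime reads $\widetilde{O}(\sqrt{n}(n^3 + n^{\omega})\log(1/\epsilon)) = \widetilde{O}(n^{3.5}\log(1/\epsilon))$, since $\omega < 3$ makes the $n^{\omega + 0.5}$ term lower order than $n^{3.5}$.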

