
Polyhedral approximations of the semidefinite cone and their application

Posted by Akiko Yoshise
Publication date: 2019
Research language: English





We develop techniques to construct a series of sparse polyhedral approximations of the semidefinite cone. Motivated by the semidefinite (SD) bases proposed by Tanaka and Yoshise (2018), we propose a simple expansion of SD bases so as to keep the sparsity of the matrices composing them. We prove that the polyhedral approximation using our expanded SD bases contains the set of all diagonally dominant matrices and is contained in the set of all scaled diagonally dominant matrices. We also prove that the set of all scaled diagonally dominant matrices can be expressed using an infinite number of expanded SD bases. We use our approximations as the initial approximation in cutting plane methods for solving a semidefinite relaxation of the maximum stable set problem. It is found that the proposed methods with expanded SD bases are significantly more efficient than methods using other existing approximations or solving semidefinite relaxation problems directly.
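The containment result above rests on a classical fact: the symmetric diagonally dominant matrices with nonnegative diagonal form the cone generated by the sparse rank-one PSD matrices $e_i e_i^T$ and $(e_i \pm e_j)(e_i \pm e_j)^T$. A minimal numerical sketch of that decomposition (a standard construction, not the authors' expanded SD bases themselves):

```python
import numpy as np

def dd_decomposition(A):
    """Decompose a symmetric diagonally dominant matrix A (with
    nonnegative diagonal) as a nonnegative combination of the sparse
    rank-one PSD matrices e_i e_i^T and (e_i +/- e_j)(e_i +/- e_j)^T."""
    n = A.shape[0]
    terms = []
    # One rank-one term per nonzero off-diagonal entry.
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] != 0:
                v = np.zeros(n)
                v[i], v[j] = 1.0, np.sign(A[i, j])
                terms.append((abs(A[i, j]), np.outer(v, v)))
    # Diagonal slack absorbs what dominance leaves over.
    for i in range(n):
        off = np.abs(A[i]).sum() - abs(A[i, i])
        e = np.zeros(n)
        e[i] = 1.0
        terms.append((A[i, i] - off, np.outer(e, e)))
    return terms
```

Every generator is rank-one and PSD, and diagonal dominance guarantees all coefficients are nonnegative, so any such matrix lies in the polyhedral cone these generators span.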


Read also

The matrix logarithm, when applied to Hermitian positive definite matrices, is concave with respect to the positive semidefinite order. This operator concavity property leads to numerous concavity and convexity results for other matrix functions, many of which are of importance in quantum information theory. In this paper we show how to approximate the matrix logarithm with functions that preserve operator concavity and can be described using the feasible regions of semidefinite optimization problems of fairly small size. Such approximations allow us to use off-the-shelf semidefinite optimization solvers for convex optimization problems involving the matrix logarithm and related functions, such as the quantum relative entropy. The basic ingredients of our approach apply, beyond the matrix logarithm, to functions that are operator concave and operator monotone. As such, we introduce strategies for constructing semidefinite approximations that we expect will be useful, more generally, for studying the approximation power of functions with small semidefinite representations.
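One concrete route to such rational, concavity-preserving approximations is the integral representation $\log x = \int_0^1 (x-1)\,(t(x-1)+1)^{-1}\,dt$ discretized by Gauss-Legendre quadrature: each quadrature term involves only a matrix inverse, hence is semidefinite-representable. A hedged numerical sketch (illustrative only, not the paper's optimized construction):

```python
import numpy as np

def log_approx(X, m=8):
    """Quadrature-based rational approximation of the matrix logarithm,
    using log(x) = integral_0^1 (x-1) / (t(x-1) + 1) dt with m-point
    Gauss-Legendre quadrature mapped to [0, 1]."""
    n = X.shape[0]
    I = np.eye(n)
    u, w = np.polynomial.legendre.leggauss(m)  # nodes/weights on [-1, 1]
    t, w = (u + 1) / 2, w / 2                  # map to [0, 1]
    return sum(wi * (X - I) @ np.linalg.inv(ti * (X - I) + I)
               for ti, wi in zip(t, w))
```

Each summand $w_i (X-I)(t_i(X-I)+I)^{-1}$ is operator concave in $X$, which is why the quadrature sum inherits operator concavity.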
The authors in a previous paper devised certain subcones of the semidefinite plus nonnegative cone and showed that satisfaction of the requirements for membership of those subcones can be detected by solving linear optimization problems (LPs) with $O(n)$ variables and $O(n^2)$ constraints. They also devised LP-based algorithms for testing copositivity using the subcones. In this paper, they investigate the properties of the subcones in more detail and explore larger subcones of the positive semidefinite plus nonnegative cone whose satisfaction of the requirements for membership can be detected by solving LPs. They introduce a {\em semidefinite basis (SD basis)} that is a basis of the space of $n \times n$ symmetric matrices consisting of $n(n+1)/2$ symmetric semidefinite matrices. Using the SD basis, they devise two new subcones for which detection can be done by solving LPs with $O(n^2)$ variables and $O(n^2)$ constraints. The new subcones are larger than the ones in the previous paper and inherit their nice properties. The authors also examine the efficiency of those subcones in numerical experiments. The results show that the subcones are promising for testing copositivity as a useful application.
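An SD basis, as described above, is $n(n+1)/2$ PSD matrices that span the symmetric matrices. One concrete family with these properties (a sketch for illustration, not necessarily the authors' exact construction) uses $e_i e_i^T$ together with $(e_i + e_j)(e_i + e_j)^T$ for $i < j$:

```python
import numpy as np
from itertools import combinations

def sd_basis(n):
    """A concrete SD basis of the n x n symmetric matrices:
    n(n+1)/2 rank-one PSD matrices that are linearly independent."""
    basis = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0
        basis.append(np.outer(e, e))          # e_i e_i^T
    for i, j in combinations(range(n), 2):
        v = np.zeros(n)
        v[i] = v[j] = 1.0
        basis.append(np.outer(v, v))          # (e_i + e_j)(e_i + e_j)^T
    return basis
```

Linear independence follows because each $(e_i + e_j)(e_i + e_j)^T$ is the only basis element supported on the off-diagonal position $(i, j)$.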
James Saunderson (2017)
If $X$ is an $n \times n$ symmetric matrix, then the directional derivative of $X \mapsto \det(X)$ in the direction $I$ is the elementary symmetric polynomial of degree $n-1$ in the eigenvalues of $X$. This is a polynomial in the entries of $X$ with the property that it is hyperbolic with respect to the direction $I$. The corresponding hyperbolicity cone is a relaxation of the positive semidefinite (PSD) cone known as the first derivative relaxation (or Renegar derivative) of the PSD cone. A spectrahedral cone is a convex cone that has a representation as the intersection of a subspace with the cone of PSD matrices in some dimension. We show that the first derivative relaxation of the PSD cone is a spectrahedral cone, and give an explicit spectrahedral description of size $\binom{n+1}{2}-1$. The construction provides a new explicit example of a hyperbolicity cone that is also a spectrahedron. This is consistent with the generalized Lax conjecture, which conjectures that every hyperbolicity cone is a spectrahedron.
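The identity in the first sentence is easy to verify numerically: the derivative of $t \mapsto \det(X + tI)$ at $t = 0$ equals $e_{n-1}$ of the eigenvalues of $X$. A quick sketch:

```python
import numpy as np

def det_derivative_in_I(X, h=1e-5):
    """Directional derivative of det at X in the direction I,
    via a central finite difference."""
    I = np.eye(X.shape[0])
    return (np.linalg.det(X + h * I) - np.linalg.det(X - h * I)) / (2 * h)

def e_nminus1(X):
    """Elementary symmetric polynomial of degree n-1 in the
    eigenvalues of the symmetric matrix X."""
    lam = np.linalg.eigvalsh(X)
    return sum(np.prod(np.delete(lam, k)) for k in range(len(lam)))
```

For example, with $X = \mathrm{diag}(1,2,3,4)$ both quantities equal $2{\cdot}3{\cdot}4 + 1{\cdot}3{\cdot}4 + 1{\cdot}2{\cdot}4 + 1{\cdot}2{\cdot}3 = 50$.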
The Lyapunov rank of a proper cone $K$ in a finite dimensional real Hilbert space is defined as the dimension of the space of all Lyapunov-like transformations on $K$, or equivalently, the dimension of the Lie algebra of the automorphism group of $K$. This (rank) measures the number of linearly independent bilinear relations needed to express a complementarity system on $K$ (that arises, for example, from a linear program or a complementarity problem on the cone). Motivated by the problem of describing spectral/proper cones where the complementarity system can be expressed as a square system (that is, where the Lyapunov rank is greater than or equal to the dimension of the ambient space), we consider proper polyhedral cones in $\mathbb{R}^n$ that are permutation invariant. For such cones we show that the Lyapunov rank is either 1 (in which case, the cone is irreducible) or n (in which case, the cone is isomorphic to the nonnegative orthant in $\mathbb{R}^n$). In the latter case, we show that the corresponding spectral cone is isomorphic to a symmetric cone.
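For the nonnegative orthant the Lyapunov rank can be computed directly: a transformation $L$ is Lyapunov-like on $\mathbb{R}^n_+$ exactly when $\langle L e_i, e_j \rangle = 0$ for every orthogonal pair of extreme rays $e_i \perp e_j$, $i \neq j$, which forces $L$ to be diagonal. A small sketch that recovers the rank $n$ as the nullity of those linear constraints:

```python
import numpy as np
from itertools import permutations

def lyapunov_rank_orthant(n):
    """Lyapunov rank of the nonnegative orthant R^n_+, computed as the
    nullity of the system <L e_i, e_j> = 0 over all complementary
    extreme-ray pairs (e_i, e_j) with i != j."""
    rows = []
    for i, j in permutations(range(n), 2):
        C = np.zeros((n, n))
        C[j, i] = 1.0          # <L e_i, e_j> = L[j, i]
        rows.append(C.ravel())
    A = np.array(rows)
    # Unknowns: the n^2 entries of L; nullity = dim of Lyapunov-like maps.
    return n * n - np.linalg.matrix_rank(A)
```

The $n(n-1)$ constraints kill all off-diagonal entries, leaving the $n$-dimensional space of diagonal matrices, in agreement with the dichotomy stated in the abstract.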
We consider a new and general online resource allocation problem, where the goal is to maximize a function of a positive semidefinite (PSD) matrix with a scalar budget constraint. The problem data arrives online, and the algorithm needs to make an irrevocable decision at each step. Of particular interest are classic experiment design problems in the online setting, with the algorithm deciding whether to allocate budget to each experiment as new experiments become available sequentially. We analyze two greedy primal-dual algorithms and provide bounds on their competitive ratios. Our analysis relies on a smooth surrogate of the objective function that needs to satisfy a new diminishing returns (PSD-DR) property (that its gradient is order-reversing with respect to the PSD cone). Using the representation for monotone maps on the PSD cone given by Löwner's theorem, we obtain a convex parametrization of the family of functions satisfying PSD-DR. We then formulate a convex optimization problem to directly optimize our competitive ratio bound over this set. This design problem can be solved offline before the data start arriving. The online algorithm that uses the designed smoothing is tailored to the given cost function, and enjoys a competitive ratio at least as good as our optimized bound. We provide examples of computing the smooth surrogate for D-optimal and A-optimal experiment design, and demonstrate the performance of the custom-designed algorithm.
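The online setting described above can be illustrated with a bare-bones greedy rule for D-optimal design: accept an experiment irrevocably when its marginal log-det gain per unit cost beats a fixed threshold and the budget allows it. This is only a sketch of the problem setup; the paper's algorithms design the surrogate and acceptance rule via convex optimization rather than using a hand-picked threshold.

```python
import numpy as np

def online_d_optimal(stream, budget, threshold=0.1, eps=1e-3):
    """Illustrative greedy online selection for D-optimal design.
    stream: list of (feature_vector, cost) pairs arriving in order."""
    d = len(stream[0][0])
    M = eps * np.eye(d)                      # regularized information matrix
    spent, chosen = 0.0, []
    for t, (a, cost) in enumerate(stream):
        gain = (np.linalg.slogdet(M + np.outer(a, a))[1]
                - np.linalg.slogdet(M)[1])   # marginal log-det improvement
        if spent + cost <= budget and gain / cost >= threshold:
            M += np.outer(a, a)              # irrevocable accept
            spent += cost
            chosen.append(t)
    return chosen, M, spent
```

Decisions are made online and never revoked, and the budget constraint is maintained throughout, mirroring the problem structure the paper analyzes.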