
A spectrahedral representation of the first derivative relaxation of the positive semidefinite cone

Added by James Saunderson
Publication date: 2017
Language: English





If $X$ is an $n \times n$ symmetric matrix, then the directional derivative of $X \mapsto \det(X)$ in the direction $I$ is the elementary symmetric polynomial of degree $n-1$ in the eigenvalues of $X$. This is a polynomial in the entries of $X$ that is hyperbolic with respect to the direction $I$. The corresponding hyperbolicity cone is a relaxation of the positive semidefinite (PSD) cone known as the first derivative relaxation (or Renegar derivative) of the PSD cone. A spectrahedral cone is a convex cone that can be represented as the intersection of a subspace with the cone of PSD matrices in some dimension. We show that the first derivative relaxation of the PSD cone is a spectrahedral cone, and give an explicit spectrahedral description of size $\binom{n+1}{2}-1$. The construction provides a new explicit example of a hyperbolicity cone that is also a spectrahedron, consistent with the generalized Lax conjecture, which asserts that every hyperbolicity cone is a spectrahedron.
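To make the opening claim concrete: for $n=2$ the derivative polynomial is $e_1(\lambda(X)) = \operatorname{tr}(X)$, so the first derivative relaxation is the halfspace $\{X : \operatorname{tr}(X) \geq 0\}$, which strictly contains the PSD cone. The identity $\frac{d}{dt}\det(X+tI)\big|_{t=0} = e_{n-1}(\lambda(X))$ can also be checked numerically; the following is a minimal sketch (not from the paper) using a finite difference:

```python
import numpy as np
from itertools import combinations

# Sketch: verify that the directional derivative of det at X in the
# direction I equals e_{n-1} of the eigenvalues of X.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
X = (A + A.T) / 2  # a random n x n symmetric matrix

# Central finite difference for d/dt det(X + t*I) at t = 0.
h = 1e-6
deriv = (np.linalg.det(X + h * np.eye(n))
         - np.linalg.det(X - h * np.eye(n))) / (2 * h)

# e_{n-1}(lambda): sum of all products of n-1 eigenvalues of X.
eigs = np.linalg.eigvalsh(X)
e_nm1 = sum(np.prod(c) for c in combinations(eigs, n - 1))

print(deriv, e_nm1)  # the two values agree up to finite-difference error
```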



Related research


We consider a new and general online resource allocation problem, where the goal is to maximize a function of a positive semidefinite (PSD) matrix with a scalar budget constraint. The problem data arrives online, and the algorithm needs to make an irrevocable decision at each step. Of particular interest are classic experiment design problems in the online setting, with the algorithm deciding whether to allocate budget to each experiment as new experiments become available sequentially. We analyze two greedy primal-dual algorithms and provide bounds on their competitive ratios. Our analysis relies on a smooth surrogate of the objective function that needs to satisfy a new diminishing returns (PSD-DR) property (that its gradient is order-reversing with respect to the PSD cone). Using the representation for monotone maps on the PSD cone given by Löwner's theorem, we obtain a convex parametrization of the family of functions satisfying PSD-DR. We then formulate a convex optimization problem to directly optimize our competitive ratio bound over this set. This design problem can be solved offline before the data start arriving. The online algorithm that uses the designed smoothing is tailored to the given cost function, and enjoys a competitive ratio at least as good as our optimized bound. We provide examples of computing the smooth surrogate for D-optimal and A-optimal experiment design, and demonstrate the performance of the custom-designed algorithm.
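As an illustration of the PSD-DR property mentioned above, the D-optimal objective $\log \det X$ has gradient $X^{-1}$, and matrix inversion is order-reversing on the PSD cone (a standard fact, used here purely as an example, not as the paper's construction):

```python
import numpy as np

# Illustrative check: for f(X) = log det X, the gradient X^{-1} is
# order-reversing on the PSD cone, i.e. X >= Y > 0 (in the PSD order)
# implies inv(X) <= inv(Y).
rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
Y = B @ B.T + np.eye(n)          # Y positive definite
C = rng.standard_normal((n, n))
X = Y + C @ C.T                  # X >= Y in the PSD order

gap = np.linalg.inv(Y) - np.linalg.inv(X)
print(np.linalg.eigvalsh(gap))   # all nonnegative: inv(X) <= inv(Y)
```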
We develop techniques to construct a series of sparse polyhedral approximations of the semidefinite cone. Motivated by the semidefinite (SD) bases proposed by Tanaka and Yoshise (2018), we propose a simple expansion of SD bases so as to keep the sparsity of the matrices composing it. We prove that the polyhedral approximation using our expanded SD bases contains the set of all diagonally dominant matrices and is contained in the set of all scaled diagonally dominant matrices. We also prove that the set of all scaled diagonally dominant matrices can be expressed using an infinite number of expanded SD bases. We use our approximations as the initial approximation in cutting plane methods for solving a semidefinite relaxation of the maximum stable set problem. It is found that the proposed methods with expanded SD bases are significantly more efficient than methods using other existing approximations or solving semidefinite relaxation problems directly.
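The containments above rest on a standard fact worth recalling: every symmetric diagonally dominant matrix with nonnegative diagonal is PSD (by Gershgorin's theorem), which is what makes the diagonally dominant cone an inner polyhedral approximation of the PSD cone. A quick check of this fact (an illustration, not the paper's method):

```python
import numpy as np

# A symmetric matrix is diagonally dominant (with nonnegative diagonal)
# if each diagonal entry dominates the absolute row sum of the rest.
def is_diagonally_dominant(M):
    off = np.sum(np.abs(M), axis=1) - np.abs(np.diag(M))
    return bool(np.all(np.diag(M) >= off))

M = np.array([[ 3.0, 1.0, -1.0],
              [ 1.0, 4.0,  2.0],
              [-1.0, 2.0,  5.0]])
print(is_diagonally_dominant(M))           # True
print(np.all(np.linalg.eigvalsh(M) >= 0))  # True: M lies in the PSD cone
```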
The authors in a previous paper devised certain subcones of the semidefinite plus nonnegative cone and showed that satisfaction of the requirements for membership of those subcones can be detected by solving linear optimization problems (LPs) with $O(n)$ variables and $O(n^2)$ constraints. They also devised LP-based algorithms for testing copositivity using the subcones. In this paper, they investigate the properties of the subcones in more detail and explore larger subcones of the positive semidefinite plus nonnegative cone whose satisfaction of the requirements for membership can be detected by solving LPs. They introduce a semidefinite basis (SD basis), a basis of the space of $n \times n$ symmetric matrices consisting of $n(n+1)/2$ symmetric semidefinite matrices. Using the SD basis, they devise two new subcones for which detection can be done by solving LPs with $O(n^2)$ variables and $O(n^2)$ constraints. The new subcones are larger than the ones in the previous paper and inherit their nice properties. The authors also examine the efficiency of those subcones in numerical experiments. The results show that the subcones are promising for testing copositivity as a useful application.
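One natural rank-one construction in the spirit of an SD basis (a hedged sketch; the precise definition in the paper may differ) takes the $n(n+1)/2$ PSD matrices $(e_i+e_j)(e_i+e_j)^T$ for $i \leq j$, which indeed form a basis of the symmetric matrices:

```python
import numpy as np
from itertools import combinations_with_replacement

# Sketch: the rank-one PSD matrices (e_i + e_j)(e_i + e_j)^T, i <= j,
# are n(n+1)/2 matrices spanning the n x n symmetric matrices.
n = 3
e = np.eye(n)
basis = [np.outer(e[i] + e[j], e[i] + e[j])
         for i, j in combinations_with_replacement(range(n), 2)]

# Vectorize the upper triangles and check linear independence.
iu = np.triu_indices(n)
V = np.array([B[iu] for B in basis])
print(np.linalg.matrix_rank(V), n * (n + 1) // 2)  # 6 6: a basis of S^3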
We consider solving high-order semidefinite programming (SDP) relaxations of nonconvex polynomial optimization problems (POPs) that admit rank-one optimal solutions. Existing approaches, which solve the SDP independently from the POP, either cannot scale to large problems or suffer from slow convergence due to the typical degeneracy of such SDPs. We propose a new algorithmic framework, called SpecTrahedral pRoximal gradIent Descent along vErtices (STRIDE), that blends fast local search on the nonconvex POP with global descent on the convex SDP. Specifically, STRIDE follows a globally convergent trajectory driven by a proximal gradient method (PGM) for solving the SDP, while simultaneously probing long, but safeguarded, rank-one strides, generated by fast nonlinear programming algorithms on the POP, to seek rapid descent. We prove STRIDE has global convergence. To solve the subproblem of projecting a given point onto the feasible set of the SDP, we reformulate the projection step as a continuously differentiable unconstrained optimization and apply a limited-memory BFGS method to achieve both scalability and accuracy. We conduct numerical experiments on solving second-order SDP relaxations arising from two important applications in machine learning and computer vision. STRIDE dominates a diverse set of five existing SDP solvers and is the only solver that can solve degenerate rank-one SDPs to high accuracy (e.g., KKT residuals below 1e-9), even in the presence of millions of equality constraints.
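A core subroutine in any PGM-type SDP solver of this kind is projection onto the PSD cone, computed by truncating negative eigenvalues. The sketch below shows this generic building block only (it is not STRIDE itself, whose projection also handles the affine constraints via L-BFGS):

```python
import numpy as np

# Euclidean projection of a symmetric matrix onto the PSD cone:
# symmetrize, eigendecompose, and zero out the negative eigenvalues.
def project_psd(M):
    w, V = np.linalg.eigh((M + M.T) / 2)
    return (V * np.maximum(w, 0)) @ V.T

M = np.array([[1.0,  2.0],
              [2.0, -3.0]])
print(np.linalg.eigvalsh(project_psd(M)))  # all eigenvalues nonnegative
```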
We study the convex relaxation of a polynomial optimization problem, maximizing a product of linear forms over the complex sphere. We show that this convex program is also a relaxation of the permanent of Hermitian positive semidefinite (HPSD) matrices. By analyzing a constructive randomized rounding algorithm, we obtain an improved multiplicative approximation factor to the permanent of HPSD matrices, as well as computationally efficient certificates for this approximation. We also propose an analog of van der Waerden's conjecture for HPSD matrices, where the polynomial optimization problem is interpreted as a relaxation of the permanent.
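For context, the quantity being approximated is the permanent $\mathrm{per}(A) = \sum_{\sigma} \prod_i A_{i,\sigma(i)}$, which for HPSD matrices is real and nonnegative. A definition-level sketch (not the paper's relaxation or rounding algorithm):

```python
import numpy as np
from itertools import permutations

# Naive permanent from the definition: sum over permutations sigma of
# prod_i A[i, sigma(i)]. Exponential time; fine for tiny matrices.
def permanent(A):
    n = A.shape[0]
    return sum(np.prod([A[i, s[i]] for i in range(n)])
               for s in permutations(range(n)))

rng = np.random.default_rng(1)
C = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = C @ C.conj().T               # a Hermitian PSD matrix
print(permanent(B))              # real (up to roundoff) and nonnegative
```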
