
Chordal and factor-width decompositions for scalable semidefinite and polynomial optimization

 Added by Yang Zheng
Publication date: 2021
Language: English





Chordal and factor-width decomposition methods for semidefinite programming and polynomial optimization have recently enabled the analysis and control of large-scale linear systems and medium-scale nonlinear systems. Chordal decomposition exploits the sparsity of semidefinite matrices in a semidefinite program (SDP), in order to formulate an equivalent SDP with smaller semidefinite constraints that can be solved more efficiently. Factor-width decompositions, instead, relax or strengthen SDPs with dense semidefinite matrices into more tractable problems, trading feasibility or optimality for lower computational complexity. This article reviews recent advances in large-scale semidefinite and polynomial optimization enabled by these two types of decomposition, highlighting connections and differences between them. We also demonstrate that chordal and factor-width decompositions allow for significant computational savings on a range of classical problems from control theory, and on more recent problems from machine learning. Finally, we outline possible directions for future research that have the potential to facilitate the efficient optimization-based study of increasingly complex large-scale dynamical systems.
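As a concrete illustration of the first idea, the sketch below (not taken from the article; the problem data, size, and clique structure are illustrative assumptions) replaces a single semidefinite constraint with clique-sized constraints in CVXPY. Because the tridiagonal sparsity pattern is chordal and the problem data never touch off-pattern entries, the decomposed problem attains the same optimal value.

```python
# Minimal chordal-decomposition sketch; data, size, and cliques are assumptions.
import cvxpy as cp
import numpy as np

n = 4  # tridiagonal aggregate sparsity pattern -> chordal, cliques {i, i+1}

rng = np.random.default_rng(0)
d, e = rng.standard_normal(n), rng.standard_normal(n - 1)
C = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)   # tridiagonal cost matrix

# Standard formulation: one n x n semidefinite constraint.
X = cp.Variable((n, n), symmetric=True)
full = cp.Problem(cp.Minimize(cp.trace(C @ X)), [cp.trace(X) == 1, X >> 0])
full.solve()

# Chordal decomposition (Grone's theorem): only the clique submatrices need
# to be PSD, since the off-pattern entries never appear in the problem data
# and the chordal pattern always admits a PSD completion.
Xd = cp.Variable((n, n), symmetric=True)
clique_psd = [Xd[i:i + 2, i:i + 2] >> 0 for i in range(n - 1)]
dec = cp.Problem(cp.Minimize(cp.trace(C @ Xd)),
                 [cp.trace(Xd) == 1] + clique_psd)
dec.solve()

print(full.value, dec.value)   # the two optimal values coincide
```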



Related research

Semidefinite and sum-of-squares (SOS) optimization are fundamental computational tools in many areas, including linear and nonlinear systems theory. However, the scale of problems that can be addressed reliably and efficiently is still limited. In this paper, we introduce a new notion of block factor-width-two matrices and build a new hierarchy of inner and outer approximations of the cone of positive semidefinite (PSD) matrices. This notion is a block extension of the standard factor-width-two matrices, and allows for an improved inner approximation of the PSD cone. In the context of SOS optimization, this leads to a block extension of the scaled diagonally dominant sum-of-squares (SDSOS) polynomials. By varying a matrix partition, the notion of block factor-width-two matrices can balance the trade-off between computational scalability and solution quality when solving semidefinite and SOS optimization problems. Numerical experiments on large-scale instances confirm our theoretical findings.
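For intuition, the following sketch (illustrative data and sizes, not code from the paper) imposes the standard factor-width-two restriction in CVXPY by writing the matrix variable as a sum of PSD matrices supported on 2x2 principal blocks. Since the factor-width-two cone sits inside the PSD cone, this restriction yields an upper bound on the minimization problem.

```python
# Factor-width-two (SDD) restriction of a PSD constraint; data are assumptions.
import itertools
import cvxpy as cp
import numpy as np

n = 5
rng = np.random.default_rng(1)
B = rng.standard_normal((n, n))
C = B + B.T                                   # symmetric cost matrix

def factor_width_two(n):
    """Return (X, constraints): X = sum of PSD matrices supported on 2x2
    principal blocks, i.e. X is scaled diagonally dominant (SDD)."""
    pairs = list(itertools.combinations(range(n), 2))
    X, constraints = 0, []
    for (i, j) in pairs:
        M = cp.Variable((2, 2), symmetric=True)
        E = np.zeros((2, n)); E[0, i] = 1.0; E[1, j] = 1.0
        X = X + E.T @ M @ E                   # lift the 2x2 block into n x n
        constraints.append(M >> 0)
    return X, constraints

# Exact SDP:  min <C, X>  s.t.  trace(X) = 1,  X PSD
X = cp.Variable((n, n), symmetric=True)
sdp = cp.Problem(cp.Minimize(cp.trace(C @ X)), [cp.trace(X) == 1, X >> 0])
sdp.solve()

# Inner approximation: restrict X to the factor-width-two (SDD) cone.
Xf, cons = factor_width_two(n)
sdd = cp.Problem(cp.Minimize(cp.trace(C @ Xf)), [cp.trace(Xf) == 1] + cons)
sdd.solve()

print(sdp.value, sdd.value)   # sdd.value >= sdp.value (restriction)
```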
Aivar Sootla, Yang Zheng, 2019
In this paper, we introduce a set of block factor-width-two matrices, which is a generalisation of factor-width-two matrices and is a subset of positive semidefinite matrices. The set of block factor-width-two matrices is a proper cone and we compute a closed-form expression for its dual cone. We use these cones to build hierarchies of inner and outer approximations of the cone of positive semidefinite matrices. The main feature of these cones is that they enable a decomposition of a large semidefinite constraint into a number of smaller semidefinite constraints. As the main application of these classes of matrices, we envision large-scale semidefinite feasibility optimisation programs including sum-of-squares (SOS) programs. We present numerical examples from SOS optimisation showcasing the properties of this decomposition.
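A sketch of the block generalisation is given below (the partition and data are illustrative assumptions, not the paper's code): the matrix is written as a sum of PSD matrices, each supported on the union of two partition blocks. With all blocks of size one this recovers the factor-width-two cone in the sketch above, while a single block recovers the full PSD cone, so coarsening the partition trades computation for a larger inner approximation.

```python
# Block factor-width-two restriction; partition and data are assumptions.
import itertools
import cvxpy as cp
import numpy as np

def block_factor_width_two(partition, n):
    """Return (X, constraints) with X in the block factor-width-two cone
    induced by `partition` (a list of index lists covering 0..n-1,
    assumed to contain at least two blocks)."""
    X, constraints = 0, []
    for bi, bj in itertools.combinations(partition, 2):
        idx = list(bi) + list(bj)
        M = cp.Variable((len(idx), len(idx)), symmetric=True)
        E = np.zeros((len(idx), n))
        E[np.arange(len(idx)), idx] = 1.0     # lift the block pair into n x n
        X = X + E.T @ M @ E
        constraints.append(M >> 0)
    return X, constraints

# Usage: a 6 x 6 variable with partition {0,1}, {2,3}, {4,5}.
n = 6
rng = np.random.default_rng(3)
B = rng.standard_normal((n, n))
C = B + B.T
X, cons = block_factor_width_two([[0, 1], [2, 3], [4, 5]], n)
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), cons + [cp.trace(X) == 1])
prob.solve()   # upper bound on the corresponding SDP value
```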
Semidefinite programs (SDPs) are standard convex problems that are frequently found in control and optimization applications. Interior-point methods can solve SDPs in polynomial time up to arbitrary accuracy, but scale poorly as the size of matrix variables and the number of constraints increase. To improve scalability, SDPs can be approximated with lower and upper bounds through the use of structured subsets (e.g., diagonally dominant and scaled diagonally dominant matrices). Meanwhile, any underlying sparsity or symmetry structure may be leveraged to form an equivalent SDP with smaller positive semidefinite constraints. In this paper, we present a notion of decomposed structured subsets to approximate an SDP with structured subsets after an equivalent conversion. The lower/upper bounds found by approximation after conversion become tighter than the bounds obtained by approximating the original SDP directly. We apply decomposed structured subsets to semidefinite and sum-of-squares optimization problems with examples of H-infinity norm estimation and constrained polynomial optimization. An existing basis pursuit method is adapted into this framework to iteratively refine bounds.
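The sketch below (data, pattern, and cliques are illustrative assumptions) shows the idea on a tiny instance: after chordal conversion, each small clique block is restricted to a structured subset (here, diagonally dominant matrices), which gives an upper bound at least as tight as restricting the original n x n variable to be diagonally dominant.

```python
# Decomposed structured subsets on a toy SDP; all data are assumptions.
import cvxpy as cp
import numpy as np

n = 4                                          # tridiagonal pattern, cliques {i, i+1}
rng = np.random.default_rng(2)
d, e = rng.standard_normal(n), rng.standard_normal(n - 1)
C = np.diag(d) + np.diag(e, 1) + np.diag(e, -1)

def dd(B):
    """Linear constraints forcing the square block B to be diagonally
    dominant with nonnegative diagonal: 2*B_ii >= sum_j |B_ij|."""
    return [2 * cp.diag(B) >= cp.sum(cp.abs(B), axis=1)]

def solve(make_constraints):
    X = cp.Variable((n, n), symmetric=True)
    prob = cp.Problem(cp.Minimize(cp.trace(C @ X)),
                      [cp.trace(X) == 1] + make_constraints(X))
    prob.solve()
    return prob.value

sdp_val = solve(lambda X: [X >> 0])                        # exact SDP value
dd_val  = solve(lambda X: dd(X))                           # DD restriction, no conversion
dec_val = solve(lambda X: [c for i in range(n - 1)         # DD restriction per clique block
                           for c in dd(X[i:i + 2, i:i + 2])])

print(sdp_val, dec_val, dd_val)    # sdp_val <= dec_val <= dd_val
```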
Jiawang Nie, Li Wang, Jane Ye, 2015
A bilevel program is an optimization problem whose constraints involve another optimization problem. This paper studies bilevel polynomial programs (BPPs), in which all functions are polynomials. We reformulate BPPs equivalently as semi-infinite polynomial programs (SIPPs), using Fritz John conditions and Jacobian representations. Combining the exchange technique and Lasserre-type semidefinite relaxations, we propose numerical methods for solving both simple and general BPPs. For simple BPPs, we prove convergence to global optimal solutions. Numerical experiments are presented to show the efficiency of the proposed algorithms.
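A toy illustration of the reformulation step (this simple instance is an assumption, not the paper's general algorithm): when the lower-level problem is unconstrained and convex in its variable, its argmin can be replaced by its first-order optimality condition, turning the bilevel program into a single-level polynomial problem. The paper handles the general case with Fritz John conditions, Jacobian representations, and Lasserre relaxations; the sketch below just solves the resulting single-level problem on a grid.

```python
# Bilevel program:
#   min_{x in [-1, 1]}  F(x, y) = (y - 1)^2 + x^2
#   s.t.                y in argmin_{y in R} f(x, y) = (y - x^2)^2
#
# Stationarity of the inner problem: df/dy = 2*(y - x^2) = 0  =>  y = x^2,
# giving the single-level problem  min_x (x^2 - 1)^2 + x^2.
import numpy as np

xs = np.linspace(-1.0, 1.0, 20001)
vals = (xs**2 - 1.0) ** 2 + xs**2
i = int(np.argmin(vals))
print(f"x* = {xs[i]:+.4f}, optimal value = {vals[i]:.4f}")  # about +/-0.7071, 0.75
```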
Bayesian hybrid models fuse physics-based insights with machine learning constructs to correct for systematic bias. In this paper, we compare Bayesian hybrid models against physics-based glass-box and Gaussian process black-box surrogate models. We consider ballistic firing as an illustrative case study for a Bayesian decision-making workflow. First, Bayesian calibration is performed to estimate model parameters. We then use the posterior distribution from Bayesian analysis to compute optimal firing conditions to hit a target via a single-stage stochastic program. The case study demonstrates the ability of Bayesian hybrid models to overcome systematic bias from missing physics with less data than the pure machine learning approach. Ultimately, we argue Bayesian hybrid models are an emerging paradigm for data-informed decision-making under parametric and epistemic uncertainty.
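A minimal, self-contained sketch of the described workflow follows (all models, priors, and numbers are illustrative assumptions, not the paper's case study): a simple ballistic model with a data-driven correction factor standing in for the hybrid model is calibrated in a Bayesian way, and the posterior is then used in a single-stage stochastic program that picks the firing angle maximizing the expected hit probability.

```python
# Toy Bayesian calibration + stochastic decision; everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)
g, v0, target = 9.81, 50.0, 180.0             # m/s^2, m/s, m

def launch_range(angle_rad, k):
    """Toy 'hybrid' model: ideal projectile range scaled by a correction factor k."""
    return k * v0**2 * np.sin(2 * angle_rad) / g

# Bayesian calibration of k on a grid (the true k is unknown to the model).
k_true, noise_sd = 0.82, 4.0
angles_obs = np.deg2rad([20, 30, 40, 50])
y_obs = launch_range(angles_obs, k_true) + noise_sd * rng.standard_normal(4)

k_grid = np.linspace(0.5, 1.0, 501)
log_post = np.array([-0.5 * np.sum(((y_obs - launch_range(angles_obs, k)) / noise_sd) ** 2)
                     for k in k_grid])        # flat prior on the grid
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Single-stage stochastic program: choose the angle with the best expected
# hit probability, averaging over posterior samples of k.
k_samples = rng.choice(k_grid, size=2000, p=post)
candidate_angles = np.deg2rad(np.linspace(5, 85, 161))
tol = 5.0                                     # count a hit within 5 m of the target
hit_prob = [np.mean(np.abs(launch_range(a, k_samples) - target) < tol)
            for a in candidate_angles]
best = candidate_angles[int(np.argmax(hit_prob))]
print(f"best angle = {np.degrees(best):.1f} deg, hit probability = {max(hit_prob):.2f}")
```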