
Convex Generalized Nash Equilibrium Problems and Polynomial Optimization

Posted by Jiawang Nie
Publication date: 2021
Language: English





This paper studies convex Generalized Nash Equilibrium Problems (GNEPs) that are given by polynomials. We use rational and parametric expressions for Lagrange multipliers to formulate efficient polynomial optimization problems for computing Generalized Nash Equilibria (GNEs). The Moment-SOS hierarchy of semidefinite relaxations is used to solve the polynomial optimization. Under some general assumptions, we prove that the method can find a GNE if one exists, or detect the nonexistence of GNEs. Numerical experiments are presented to show the efficiency of the method.
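To make the Moment-SOS ingredient concrete, here is a minimal sketch of the sum-of-squares side of such a relaxation for a toy univariate polynomial, written with CVXPY. The polynomial p, its coefficients, and the monomial basis [1, x, x^2] are illustrative choices and are not taken from the paper, which builds much larger relaxations from the players' KKT systems.

```python
import cvxpy as cp

# Toy polynomial p(x) = x^4 - 3 x^2 + x + 5 (illustrative coefficients).
c = {0: 5.0, 1: 1.0, 2: -3.0, 3: 0.0, 4: 1.0}

Q = cp.Variable((3, 3), symmetric=True)  # Gram matrix for the basis [1, x, x^2]
gamma = cp.Variable()                    # candidate lower bound on p

# p(x) - gamma = [1, x, x^2] Q [1, x, x^2]^T, matched coefficient by coefficient.
constraints = [
    Q >> 0,                              # PSD Gram matrix gives an SOS certificate
    Q[0, 0] == c[0] - gamma,             # constant term
    2 * Q[0, 1] == c[1],                 # x
    2 * Q[0, 2] + Q[1, 1] == c[2],       # x^2
    2 * Q[1, 2] == c[3],                 # x^3
    Q[2, 2] == c[4],                     # x^4
]

prob = cp.Problem(cp.Maximize(gamma), constraints)
prob.solve()
# For univariate polynomials the SOS bound is tight, so this equals min p(x).
print("SOS lower bound on min p(x):", gamma.value)
```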




Read also

Jiawang Nie, Xindong Tang (2020)
This paper studies Nash equilibrium problems that are given by polynomial functions. We formulate efficient polynomial optimization problems for computing Nash equilibria. Lasserre-type Moment-SOS relaxations are used to solve them. Under generic assumptions, the method can find a Nash equilibrium if there is one. Moreover, it can find all Nash equilibria if there are finitely many of them. The method can also detect nonexistence if there is no Nash equilibrium.
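As a small illustration of why Nash equilibria of polynomial games lead to polynomial systems, the sketch below solves the players' first-order optimality conditions symbolically with SymPy for a made-up unconstrained two-player game; the payoffs f1 and f2 are hypothetical and the sketch does not use the Moment-SOS relaxations described in the paper.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

# Hypothetical two-player game: player 1 minimizes f1 over x,
# player 2 minimizes f2 over y.
f1 = x**2 - x*y          # convex in x for any fixed y
f2 = y**2 - x*y + y      # convex in y for any fixed x

# For unconstrained convex players, a Nash equilibrium is a common zero of
# the players' first-order conditions, i.e. a solution of a polynomial system.
solutions = sp.solve([sp.diff(f1, x), sp.diff(f2, y)], [x, y], dict=True)
print(solutions)         # expected: [{x: -1/3, y: -2/3}]
```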
This paper concerns the generalized Nash equilibrium problem of polynomials (GNEPP). We apply the Gauss-Seidel method and Lasserre-type Moment-SOS relaxations to solve GNEPPs. The convergence of the Gauss-Seidel method is known for some special GNEPPs, such as generalized potential games (GPGs). We give a sufficient condition for GPGs and propose a numerical certificate based on Putinar's Positivstellensatz. Numerical examples for both convex and nonconvex GNEPPs are given to demonstrate the efficiency of the proposed method.
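A minimal sketch of the Gauss-Seidel (best-response) idea on a hypothetical two-player GNEP with the shared constraint x + y <= 1; the closed-form best responses below are specific to this toy example and stand in for the polynomial subproblems that the paper solves with Moment-SOS relaxations.

```python
# Toy GNEP (hypothetical, not from the paper): player 1 minimizes (x-1)^2 and
# player 2 minimizes (y-1)^2, both subject to the shared constraint x + y <= 1.

def best_response_x(y):
    # argmin_x (x-1)^2  subject to  x <= 1 - y
    return min(1.0, 1.0 - y)

def best_response_y(x):
    # argmin_y (y-1)^2  subject to  y <= 1 - x
    return min(1.0, 1.0 - x)

x, y = 0.0, 0.0
for _ in range(50):
    x_new = best_response_x(y)          # update player 1 with player 2 fixed
    y_new = best_response_y(x_new)      # update player 2 with the new x
    if abs(x_new - x) + abs(y_new - y) < 1e-10:
        x, y = x_new, y_new
        break
    x, y = x_new, y_new

print("Generalized Nash equilibrium estimate:", (x, y))   # expected (1.0, 0.0)
```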
In this paper we present a new algorithmic realization of a projection-based scheme for general convex constrained optimization problems. The general idea is to transform the original optimization problem into a sequence of feasibility problems by iteratively constraining the objective function from above until the feasibility problem is inconsistent. For each of the feasibility problems one may apply any of the existing projection methods for solving it. In particular, the scheme allows the use of subgradient projections and does not require exact projections onto the constraint sets, as existing similar methods do. We also apply the newly introduced concept of superiorization to the optimization formulation and compare its performance to our scheme. We provide numerical results for convex quadratic test problems as well as for real-life optimization problems coming from medical treatment planning.
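The level-set idea can be sketched on a toy problem: minimize z[0] + z[1] over the unit ball by bisecting an upper bound t on the objective and testing feasibility of the resulting intersection with alternating projections (POCS). The sketch below uses exact projections rather than the subgradient projections the scheme allows, so it only illustrates the general construction, not the paper's algorithm.

```python
import numpy as np

a = np.array([1.0, 1.0])            # objective f(z) = a @ z

def project_ball(z):
    n = np.linalg.norm(z)
    return z if n <= 1.0 else z / n

def project_halfspace(z, t):
    # Euclidean projection onto {z : a @ z <= t}
    viol = a @ z - t
    return z if viol <= 0 else z - viol * a / (a @ a)

def level_feasible(t, iters=200, tol=1e-6):
    # Feasibility problem: {||z|| <= 1} ∩ {a @ z <= t}, solved by POCS.
    z = np.zeros(2)
    for _ in range(iters):
        z = project_halfspace(project_ball(z), t)
    return np.linalg.norm(z) <= 1.0 + tol and a @ z <= t + tol

lo, hi = -2.0, 0.0                  # lo infeasible, hi feasible
for _ in range(40):
    mid = 0.5 * (lo + hi)
    if level_feasible(mid):
        hi = mid                    # level achievable: tighten from above
    else:
        lo = mid                    # level too low: relax
print("estimated optimal value:", hi, "(exact: -sqrt(2) ≈ -1.41421)")
```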
M. Journee, F. Bach, P.-A. Absil (2008)
We propose an algorithm for solving nonlinear convex programs defined in terms of a symmetric positive semidefinite matrix variable $X$. This algorithm rests on the factorization $X = Y Y^T$, where the number of columns of $Y$ fixes the rank of $X$. It is thus very effective for solving programs that have a low-rank solution. The factorization $X = Y Y^T$ leads to a reformulation of the original problem as an optimization on a particular quotient manifold. The present paper discusses the geometry of that manifold and derives a second-order optimization method. It furthermore provides some conditions on the rank of the factorization to ensure equivalence with the original problem. The efficiency of the proposed algorithm is illustrated on two applications: the maximal cut of a graph and the sparse principal component analysis problem.
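A rough sketch of the low-rank factorization idea applied to the max-cut SDP relaxation (maximize <L, X>, up to the usual 1/4 factor, subject to diag(X) = 1 and X PSD), using plain projected gradient ascent on Y with unit-norm rows. The random graph is made-up data, and the paper's actual method is a second-order algorithm on the underlying quotient manifold rather than this first-order iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random graph Laplacian (hypothetical data).
A = rng.integers(0, 2, size=(20, 20))
A = np.triu(A, 1); A = A + A.T
L = np.diag(A.sum(axis=1)) - A

n, p = L.shape[0], 3                # p columns of Y fix the rank of X = Y Y^T
Y = rng.standard_normal((n, p))
Y /= np.linalg.norm(Y, axis=1, keepdims=True)     # enforce diag(Y Y^T) = 1

step = 1e-2
for _ in range(2000):
    G = 2.0 * L @ Y                 # gradient of <L, Y Y^T> with respect to Y
    Y = Y + step * G                # ascent step
    Y /= np.linalg.norm(Y, axis=1, keepdims=True) # project rows back to the sphere

print("objective <L, Y Y^T> =", np.trace(L @ (Y @ Y.T)))
```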
We describe a modular rewriting system for translating optimization problems written in a domain-specific language to forms compatible with low-level solver interfaces. Translation is facilitated by reductions, which accept a category of problems and transform instances of that category to equivalent instances of another category. Our system proceeds in two key phases: analysis, in which we attempt to find a suitable solver for a supplied problem, and canonicalization, in which we rewrite the problem in the selected solver's standard form. We implement the described system in version 1.0 of CVXPY, a domain-specific language for mathematical and especially convex optimization. By treating reductions as first-class objects, our method makes it easy to match problems to solvers well-suited for them and to support solvers with a wide variety of standard forms.
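A small CVXPY example with made-up data showing the two phases in practice: Problem.get_problem_data applies the chain of reductions selected for a given solver and returns the canonicalized data, while solve runs the full analysis-canonicalization-solve pipeline.

```python
import cvxpy as cp
import numpy as np

# Hypothetical least-squares problem over the simplex (made-up data).
np.random.seed(0)
A, b = np.random.randn(10, 5), np.random.randn(10)

x = cp.Variable(5)
prob = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)),
                  [x >= 0, cp.sum(x) == 1])

# Canonicalization phase: apply the reduction chain targeting the SCS solver
# and obtain the data that would be handed to its low-level interface.
data, chain, inverse_data = prob.get_problem_data(cp.SCS)
print(chain)                        # the sequence of reductions that was applied

# Full pipeline: solver selection, canonicalization, and solving.
prob.solve()
print("optimal value:", prob.value)
```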