
Local Optimality Conditions for a Class of Hidden Convex Optimization

Posted by Yong Xia
Publication date: 2021
Research field
Paper language: English





Hidden convex optimization is a class of nonconvex optimization problems that can be globally solved in polynomial time via equivalent convex programming reformulations. In this paper, we focus on checking local optimality in hidden convex optimization. We first introduce a class of hidden convex optimization problems by joining the classical nonconvex trust-region subproblem (TRS) with convex optimization (CO), and then present a comprehensive study of local optimality conditions. In order to guarantee the existence of a necessary and sufficient condition for local optimality, we need more restrictive assumptions. To our surprise, while (TRS) has at most one local non-global minimizer and (CO) has no local non-global minimizer, their joint problem can have more than one local non-global minimizer.
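For readers who want the building block in front of them, the display below recalls the classical trust-region subproblem in its standard form; the exact way the paper joins (TRS) with a convex program is not spelled out in this abstract, so the second problem shown is only an illustrative assumption (a nonconvex quadratic minimized over the intersection of a ball with an extra convex set $C$).

% Classical trust-region subproblem: A symmetric and possibly indefinite,
% so the objective is nonconvex, yet the problem is hidden convex
% (globally solvable through an equivalent convex reformulation).
\begin{equation*}
  \mathrm{(TRS)}\qquad \min_{x\in\mathbb{R}^n}\ \tfrac{1}{2}x^{\top}Ax + b^{\top}x
  \quad\text{s.t.}\quad \|x\|\le\Delta .
\end{equation*}
% Illustrative (assumed) joint problem with an additional convex feasible set C:
\begin{equation*}
  \min_{x\in\mathbb{R}^n}\ \tfrac{1}{2}x^{\top}Ax + b^{\top}x
  \quad\text{s.t.}\quad \|x\|\le\Delta,\ \ x\in C,\qquad C\ \text{convex}.
\end{equation*}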


Read also

In this paper we study second-order optimality conditions for non-convex set-constrained optimization problems. For a convex set-constrained optimization problem, it is well known that second-order optimality conditions involve the support function of the second-order tangent set. In this paper we propose two approaches for establishing second-order optimality conditions for the non-convex case. In the first approach we extend the concept of the support function so that it is applicable to general non-convex set-constrained problems, whereas in the second approach we introduce the notion of the directional regular tangent cone and apply classical results of convex duality theory. Besides the second-order optimality conditions, the novelty of our approach lies in the systematic introduction and use, respectively, of direction
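As background for the objects named in this abstract, the lines below recall the standard outer second-order tangent set and the classical second-order necessary condition for the convex case, which involves its support function; this is textbook material (in the Bonnans-Shapiro style), not a result of the paper.

% Outer second-order tangent set of S at x in direction d:
\begin{equation*}
  T^2_S(x,d)=\bigl\{\,w:\operatorname{dist}\bigl(x+td+\tfrac{1}{2}t^2 w,\,S\bigr)=o(t^2)\ \text{as}\ t\downarrow 0\,\bigr\}.
\end{equation*}
% Second-order necessary condition at a local minimizer \bar{x} of
% min f(x) s.t. x in S (convex S), for every critical direction d,
% with \sigma(y,A)=\sup_{w\in A}\langle y,w\rangle the support function:
\begin{equation*}
  \nabla^2 f(\bar{x})[d,d]-\sigma\bigl(-\nabla f(\bar{x}),\,T^2_S(\bar{x},d)\bigr)\ \ge\ 0.
\end{equation*}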
116 - Yu-Hong Dai, Liwei Zhang 2020
Minimax optimization problems arise both from modern machine learning, including generative adversarial networks, adversarial training and multi-agent reinforcement learning, and from traditional research areas such as saddle point problems, numerical partial differential equations and optimality conditions of equality constrained optimization. For the unconstrained continuous nonconvex-nonconcave situation, Jin, Netrapalli and Jordan (2019) carefully considered the very basic question of what a proper definition of local optima of a minimax optimization problem is, and proposed a notion of local optimality called local minimax. We extend the definition of local minimax point to constrained nonconvex-nonconcave minimax optimization problems. By analyzing Jacobian uniqueness conditions for the lower-level maximization problem and the strong regularity of the Karush-Kuhn-Tucker conditions of the maximization problem, we provide both necessary optimality conditions and sufficient optimality conditions for the local minimax points of constrained minimax optimization problems.
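For orientation, the display below states the unconstrained notion of a local minimax point essentially as proposed by Jin, Netrapalli and Jordan; the constrained extension studied in the paper is not reproduced here.

% (x*, y*) is a local minimax point of f if there exist \delta_0 > 0 and a
% function h with h(\delta) -> 0 as \delta -> 0 such that, for every
% \delta in (0, \delta_0] and every (x, y) with \|x - x^*\| \le \delta and
% \|y - y^*\| \le \delta,
\begin{equation*}
  f(x^*,y)\ \le\ f(x^*,y^*)\ \le\ \max_{y':\,\|y'-y^*\|\le h(\delta)} f(x,y').
\end{equation*}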
183 - N.H. Chieu, J.W. Feng, W. Gao 2017
In this paper, we introduce a new class of nonsmooth convex functions called SOS-convex semialgebraic functions, extending the recently proposed notion of SOS-convex polynomials. This class of nonsmooth convex functions covers many common nonsmooth functions arising in applications, such as the Euclidean norm, the maximum eigenvalue function and the least squares functions with $\ell_1$-regularization or elastic net regularization used in statistics and compressed sensing. We show that, under commonly used strict feasibility conditions, the optimal value and an optimal solution of SOS-convex semialgebraic programs can be found by solving a single semidefinite programming problem (SDP). We achieve the results by using tools from semialgebraic geometry, the convex-concave minimax theorem and a recently established Jensen-inequality-type result for SOS-convex polynomials. As an application, we outline how the derived results can be applied to show that robust SOS-convex optimization problems under restricted spectrahedron data uncertainty enjoy exact SDP relaxations. This extends the existing exact SDP relaxation result for restricted ellipsoidal data uncertainty and answers the open questions left in [Optimization Letters 9, 1-18 (2015)] on how to recover a robust solution from the semidefinite programming relaxation in this broader setting.
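As a reminder of the base notion being extended, the certificate below defines SOS-convexity for a polynomial; the nonsmooth semialgebraic generalization introduced in the paper is not reproduced here.

% A polynomial f : R^n -> R is SOS-convex if the polynomial in (x, y)
\begin{equation*}
  y^{\top}\nabla^2 f(x)\,y
\end{equation*}
% is a sum of squares of polynomials in (x, y); this certifies convexity
% of f and, unlike plain convexity of polynomials, can be checked by
% semidefinite programming.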
We prove the following uniqueness result for the buckling plate. Assume there exists a smooth domain which minimizes the first buckling eigenvalue for a plate among all smooth domains of given volume. Then the domain must be a ball. The proof uses the second variation of the buckling eigenvalue and an inequality by L. E. Payne to establish this result.
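For context, the first buckling eigenvalue referred to above is commonly defined through the Rayleigh-type quotient below for a clamped plate; this is standard background rather than part of the proof.

% First buckling eigenvalue of a bounded domain \Omega (clamped plate):
\begin{equation*}
  \Lambda_1(\Omega)\;=\;\min_{0\ne u\in H^2_0(\Omega)}\ \frac{\int_{\Omega}|\Delta u|^2\,dx}{\int_{\Omega}|\nabla u|^2\,dx},
\end{equation*}
% with minimizers solving \Delta^2 u + \Lambda_1\,\Delta u = 0 in \Omega and
% u = \partial u/\partial n = 0 on the boundary.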
135 - Kuang Bai, Jane Ye 2020
The bilevel program is an optimization problem whose constraint involves solutions to a parametric optimization problem. It is well known that the value function reformulation provides an equivalent single-level optimization problem, but it results in a nonsmooth optimization problem which never satisfies the usual constraint qualifications such as the Mangasarian-Fromovitz constraint qualification (MFCQ). In this paper we show that even the first-order sufficient condition for metric subregularity (which is in general weaker than MFCQ) fails at each feasible point of the bilevel program. We introduce the concept of the directional calmness condition and show that under the directional calmness condition, the directional necessary optimality condition holds. While the directional optimality condition is in general sharper than the non-directional one, the directional calmness condition is in general weaker than the classical calmness condition and hence is more likely to hold. We perform a directional sensitivity analysis of the value function and propose directional quasi-normality as a sufficient condition for directional calmness. An example is given to show that the directional quasi-normality condition may hold for the bilevel program.
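To make the value function reformulation mentioned above concrete, a generic sketch (with placeholder data F, f, g) reads as follows; this is the standard construction the abstract refers to, not the paper's specific model.

% Bilevel program: the upper level minimizes F over solutions of a
% lower-level problem parameterized by x:
\begin{equation*}
  \min_{x,y}\ F(x,y)\quad\text{s.t.}\quad y\in S(x):=\operatorname*{arg\,min}_{y'}\bigl\{\,f(x,y'):g(x,y')\le 0\,\bigr\}.
\end{equation*}
% Value function reformulation (equivalent single-level problem), with
% V(x):=\min_{y'}\{\,f(x,y'):g(x,y')\le 0\,\}:
\begin{equation*}
  \min_{x,y}\ F(x,y)\quad\text{s.t.}\quad f(x,y)\le V(x),\ \ g(x,y)\le 0.
\end{equation*}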