
Projections Onto Convex Sets (POCS) Based Optimization by Lifting

Posted by: A. Enis Cetin
Publication date: 2013
Research field:
Paper language: English





Two new optimization techniques based on the projections onto convex sets (POCS) framework are presented for solving convex and some non-convex optimization problems. The dimension of the minimization problem is lifted by one and sets corresponding to the cost function are defined. If the cost function is a convex function in R^N, the corresponding set is a convex set in R^(N+1). The iterative optimization approach starts with an arbitrary initial estimate in R^(N+1), and at each step an orthogonal projection is performed onto one of the sets in a sequential manner. The method provides globally optimal solutions for total-variation, filtered-variation, l1, and entropic cost functions. It is also experimentally observed that cost functions based on l_p, p < 1, can be handled by using the supporting hyperplane concept.
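
The following is a minimal sketch of one plausible reading of the lifted iteration described above, not the authors' reference implementation: the cost f is replaced by its epigraph C_f = {(x, z) in R^(N+1) : z >= f(x)}, and the iterate alternates between a projection onto a supporting hyperplane of C_f (the supporting-hyperplane device mentioned at the end of the abstract) and the projection back onto the hyperplane {z = 0}. The quadratic-plus-l1 cost, the iteration count, and all variable names are illustrative assumptions.

```python
import numpy as np

def lifted_pocs(f, subgrad, x0, iters=500):
    """Alternate projections in R^(N+1): supporting hyperplane of epi(f), then {z = 0}."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        w = np.concatenate([x, [0.0]])           # lifted iterate, z-coordinate set to 0
        s = subgrad(x)
        g = np.concatenate([s, [-1.0]])          # normal of the supporting hyperplane at (x, f(x))
        b = s @ x - f(x)                         # hyperplane: s^T u - z = s^T x - f(x)
        w = w - ((g @ w - b) / (g @ g)) * g      # orthogonal projection onto that hyperplane
        x = w[:-1]                               # drop z: orthogonal projection onto {z = 0}
    return x

# Illustrative cost (assumed, not from the paper): f(x) = 0.5*||A x - y||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, y, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
f = lambda x: 0.5 * np.sum((A @ x - y) ** 2) + lam * np.sum(np.abs(x))
subgrad = lambda x: A.T @ (A @ x - y) + lam * np.sign(x)
x_hat = lifted_pocs(f, subgrad, np.zeros(10))
```

In this reading the x-update collapses to a Polyak-style subgradient step x <- x - f(x) s / (||s||^2 + 1), with the step length dictated by the lifted geometry rather than a hand-tuned parameter.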




Read also

Two new optimization techniques based on the projections onto convex sets (POCS) framework are presented for solving convex optimization problems. The dimension of the minimization problem is lifted by one and sets corresponding to the cost function are defined. If the cost function is a convex function in R^N, the corresponding set is also a convex set in R^{N+1}. The iterative optimization approach starts with an arbitrary initial estimate in R^{N+1}, and at each step an orthogonal projection is performed onto one of the sets in a sequential manner. The method provides globally optimal solutions for total-variation (TV), filtered-variation (FV), L_1, and entropic cost functions. A new denoising algorithm using the TV framework is developed; it does not require any regularization parameter adjustment. Simulation examples are presented.
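
As a rough sketch of how a regularization-parameter-free TV denoiser could look in this lifted setting (an assumption for illustration, not the paper's algorithm): the noisy signal v is lifted to (v, 0) and driven toward the epigraph of TV by repeated projections onto supporting half-spaces of that epigraph, and the x-part of the result is taken as the smoothed estimate. All function names, the iteration count, and the toy signal are assumptions.

```python
import numpy as np

def tv(x):
    """Total variation of a 1-D signal."""
    return np.sum(np.abs(np.diff(x)))

def tv_subgrad(x):
    """A subgradient of TV: D^T sign(D x), with D the forward-difference operator."""
    d = np.sign(np.diff(x))
    g = np.zeros_like(x)
    g[:-1] -= d
    g[1:] += d
    return g

def denoise_tv_lifted(v, iters=300):
    w = np.concatenate([v, [0.0]])               # lifted noisy observation (z = 0)
    for _ in range(iters):
        x = w[:-1]
        s = tv_subgrad(x)
        a = np.concatenate([s, [-1.0]])          # half-space: s^T u - z <= s^T x - tv(x)
        b = s @ x - tv(x)
        viol = a @ w - b                         # > 0 when w lies strictly below the epigraph
        if viol > 0:
            w = w - (viol / (a @ a)) * a         # project onto the violated supporting half-space
    return w[:-1]

# toy usage with a hypothetical piecewise-constant signal
rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.3], 50)
den = denoise_tv_lifted(clean + 0.1 * rng.standard_normal(clean.size))
```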
This paper presents a selected tour through the theory and applications of lifts of convex sets. A lift of a convex set is a higher-dimensional convex set that projects onto the original set. Many convex sets have lifts that are dramatically simpler to describe than the original set. Finding such simple lifts has significant algorithmic implications, particularly for optimization problems. We consider both the classical case of polyhedral lifts, described by linear inequalities, as well as spectrahedral lifts, defined by linear matrix inequalities, with a focus on recent developments related to spectrahedral lifts. Given a convex set, ideally we would either like to find a (low-complexity) polyhedral or spectrahedral lift, or find an obstruction proving that no such lift is possible. To this end, we explain the connection between the existence of lifts of a convex set and certain structured factorizations of its associated slack operator. Based on this characterization, we describe a uniform approach, via sums of squares, to the construction of spectrahedral lifts of convex sets and illustrate the method on several families of examples. Finally, we discuss two flavors of obstruction to the existence of lifts: one related to facial structure, and the other related to algebraic properties of the set in question. Rather than being exhaustive, our aim is to illustrate the richness of the area. We touch on a range of different topics related to the existence of lifts, and present many examples of lifts from different areas of mathematics and its applications.
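
A textbook example of the kind of lift discussed here (standard, not specific to this paper): the cross-polytope, i.e. the unit $\ell_1$ ball in $\mathbb{R}^n$, has $2^n$ facet-defining inequalities, yet it is the projection onto the $x$-variables of a polyhedron in $\mathbb{R}^{2n}$ described by only $2n+1$ inequalities:

\[
\{x \in \mathbb{R}^n : \|x\|_1 \le 1\}
\;=\; \pi_x\Bigl(\bigl\{(x,y) \in \mathbb{R}^{2n} : -y_i \le x_i \le y_i \ (i=1,\dots,n),\ \textstyle\sum_{i=1}^n y_i \le 1\bigr\}\Bigr).
\]

This exponential saving in the number of inequalities is exactly the algorithmic payoff of a low-complexity lift alluded to above.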
The Frank-Wolfe method solves smooth constrained convex optimization problems at a generic sublinear rate of $\mathcal{O}(1/T)$, and it (or its variants) enjoys accelerated convergence rates for two fundamental classes of constraints: polytopes and strongly convex sets. Uniformly convex sets non-trivially subsume strongly convex sets and form a large variety of curved convex sets commonly encountered in machine learning and signal processing. For instance, the $\ell_p$-balls are uniformly convex for all $p > 1$, but strongly convex for $p \in (1,2]$ only. We show that these sets systematically induce accelerated convergence rates for the original Frank-Wolfe algorithm, which continuously interpolate between known rates. Our accelerated convergence rates emphasize that it is the curvature of the constraint sets -- not just their strong convexity -- that leads to accelerated convergence rates. These results also importantly highlight that the Frank-Wolfe algorithm is adaptive to much more generic constraint set structures, thus explaining faster empirical convergence. Finally, we also show accelerated convergence rates when the set is only locally uniformly convex and provide similar results in online linear optimization.
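
A minimal sketch of the vanilla Frank-Wolfe iteration over an $\ell_p$ ball (uniformly convex for $p > 1$, the setting of this abstract); the least-squares objective, radius, and step-size rule are standard illustrative choices, not taken from the paper.

```python
import numpy as np

def lmo_lp_ball(g, p, radius=1.0):
    """Linear minimization oracle: argmin over ||s||_p <= radius of <g, s>."""
    q = p / (p - 1.0)                                 # Holder conjugate exponent
    direction = -np.sign(g) * np.abs(g) ** (q - 1.0)
    denom = np.linalg.norm(g, ord=q) ** (q - 1.0)
    return radius * direction / (denom + 1e-15)

def frank_wolfe(grad, x0, p, radius=1.0, iters=200):
    x = np.asarray(x0, dtype=float)
    for t in range(iters):
        s = lmo_lp_ball(grad(x), p, radius)           # extreme point returned by the LMO
        gamma = 2.0 / (t + 2.0)                       # standard open-loop step size
        x = x + gamma * (s - x)                       # convex combination stays feasible
    return x

# usage: least squares over the l_{1.5} ball (assumed toy problem)
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
x_hat = frank_wolfe(lambda x: A.T @ (A @ x - b), np.zeros(10), p=1.5)
```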
In this paper, an entropy-functional-based online Adaptive Decision Fusion (EADF) framework is developed for image analysis and computer vision applications. In this framework, it is assumed that the compound algorithm consists of several sub-algorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular sub-algorithm. Decision values are linearly combined with weights which are updated online according to an active fusion method based on performing entropic projections onto convex sets describing the sub-algorithms. It is assumed that there is an oracle, usually a human operator, providing feedback to the decision fusion method. A video-based wildfire detection system is developed to evaluate the performance of the algorithm in handling problems where data arrives sequentially. In this case, the oracle is the security guard of the forest lookout tower verifying the decision of the combined algorithm. Simulation results are presented. The EADF framework is also tested with a standard dataset.
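
A hedged sketch of the core building block suggested by this abstract: an entropic (Kullback-Leibler / Bregman) projection of the current weight vector onto the hyperplane determined by the sub-algorithm decisions d and the oracle's feedback y. The exact EADF update rule from the paper is not reproduced; the bisection bracket and all names are assumptions.

```python
import numpy as np

def entropic_projection(w, d, y, tol=1e-10):
    """KL-project w > 0 onto {u : u^T d = y}; the solution has the form u_i = w_i * exp(lam * d_i)."""
    def residual(lam):
        return (w * np.exp(lam * d)) @ d - y         # monotone non-decreasing in lam
    lo, hi = -50.0, 50.0                             # crude bracket for the multiplier (assumed)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return w * np.exp(0.5 * (lo + hi) * d)

# toy usage: three sub-algorithm confidences around zero, oracle feedback y = 1
w = np.array([1/3, 1/3, 1/3])
d = np.array([0.2, -0.1, 0.4])                       # hypothetical decision values
w_new = entropic_projection(w, d, y=1.0)
```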
The goal of this paper is to derive new classes of valid convex inequalities for quadratically constrained quadratic programs (QCQPs) through the technique of lifting. Our first main result shows that, for sets described by one bipartite bilinear constraint together with bounds, it is always possible to sequentially lift a seed inequality that is valid for a restriction obtained by fixing variables to their bounds, when the lifting is accomplished using affine functions of the fixed variables. In this setting, sequential lifting involves solving a non-convex nonlinear optimization problem each time a variable is lifted, just as in Mixed Integer Linear Programming. To reduce the computational burden associated with this procedure, we develop a framework based on subadditive approximations of lifting functions that permits sequence-independent lifting of seed inequalities for separable bipartite bilinear sets. In particular, this framework permits the derivation of closed-form valid inequalities. We then study a separable bipartite bilinear set where the coefficients form a minimal cover with respect to the right-hand side. For this set, we introduce a bilinear cover inequality, which is second-order cone representable. We argue that this bilinear cover inequality is strong by showing that it yields a constant-factor approximation of the convex hull of the original set. We study its lifting function and construct a two-slope subadditive upper bound. Using this subadditive approximation, we lift fixed variable pairs in closed-form, thus deriving a lifted bilinear cover inequality that is valid for general separable bipartite bilinear sets with box constraints.
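
For orientation, here is the generic single-variable lifting recipe behind the sequential procedure described above, written in standard notation rather than in the paper's bipartite bilinear setting: suppose the seed inequality $a^\top x \le b$ is valid for the restriction $S \cap \{y = 0\}$. Define the lifting function and lift the fixed variable $y$ via

\[
f(\delta) \;=\; b \;-\; \max\{\, a^\top x \;:\; (x,\delta) \in S \,\},
\qquad
a^\top x + \alpha\, y \le b \ \text{ is valid for } S
\;\Longleftrightarrow\;
\alpha\,\delta \le f(\delta)\ \text{ for every attainable } \delta .
\]

Each lifted coefficient changes the inequality, so $f$ must be recomputed before the next variable is lifted; this is the order-dependent, non-convex subproblem mentioned in the abstract, and the subadditive approximation of the lifting function is what makes the procedure sequence-independent for separable bipartite bilinear sets.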