
Projection Onto A Simplex

Published by: Xiaojing Ye
Publication date: 2011
Language: English





This mini-paper presents a fast and simple algorithm to compute the projection onto the canonical simplex $\triangle^n$. Utilizing Moreau's identity, we show that the problem is essentially a univariate minimization and that the objective function is strictly convex and continuously differentiable. Moreover, it is shown that there are at most $n$ candidates which can be computed explicitly, and the minimizer is the only one that falls into the correct interval.
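For illustration, here is a minimal NumPy sketch of the well-known sort-and-threshold projection onto the simplex. It is not taken from the paper itself, but it realizes the same structure the abstract describes: the $n$ explicit candidates are the thresholds $\theta_k = (\sum_{i \le k} u_i - 1)/k$ over the entries sorted in decreasing order, and exactly one of them falls into the correct interval.

    import numpy as np

    def project_simplex(y):
        """Euclidean projection of y onto {x : x_i >= 0, sum_i x_i = 1},
        via the standard sort-and-threshold method."""
        u = np.sort(y)[::-1]                  # entries in decreasing order
        cs = np.cumsum(u)
        ks = np.arange(1, y.size + 1)
        # candidate thresholds theta_k = (sum of top-k entries - 1) / k;
        # the largest k with u_k > theta_k identifies the correct interval
        k = ks[u > (cs - 1.0) / ks][-1]
        theta = (cs[k - 1] - 1.0) / k
        return np.maximum(y - theta, 0.0)

    # sanity check: the result is nonnegative and sums to 1
    x = project_simplex(np.array([0.5, 1.2, -0.3]))
    print(x, x.sum())   # -> [0.15, 0.85, 0.0], sum 1.0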



Read also

Youwei Liang (2020)
An important method for optimizing a function on the standard simplex is the active set algorithm, which requires the gradient of the function to be projected onto a hyperplane, with sign constraints on the variables that lie on the boundary of the simplex. We propose a new algorithm to efficiently project the gradient for this purpose. Furthermore, we apply the proposed gradient projection method to quadratic programs (QP) with standard simplex constraints, where gradient projection is used to explore the feasible region and, when we believe the optimal active set is identified, we switch to constrained conjugate gradient to accelerate convergence. Specifically, two different directions of gradient projection are used to explore the simplex, namely, the projected gradient and the reduced gradient. We choose one of the two directions according to the angle between them. Moreover, we propose two conditions for heuristically guessing the optimal active set. The first condition is that the working set remains unchanged for many iterations, and the second is that the angle between the projected gradient and the reduced gradient is small enough. Based on these strategies, a new active set algorithm for solving quadratic programs on the standard simplex is proposed.
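For context, here is a sketch of the projection subproblem this abstract refers to, not of Liang's proposed method itself: projecting a direction onto the cone cut out by the hyperplane $\sum_i d_i = 0$ and sign constraints on the boundary variables. One simple clamp-and-reproject loop (function and argument names are illustrative) pins violating boundary coordinates at zero and repeats the equality projection on the rest.

    import numpy as np

    def project_direction(g, on_boundary):
        """Project -g onto {d : sum(d) = 0, d_i >= 0 whenever on_boundary[i]}.
        Clamp-and-reproject sketch; finishes in at most n passes."""
        v = -np.asarray(g, dtype=float)
        free = np.ones(v.size, dtype=bool)   # coordinates still allowed to move
        while True:
            d = np.zeros_like(v)
            d[free] = v[free] - v[free].mean()        # projection onto sum(d) = 0
            bad = free & on_boundary & (d < -1e-12)   # sign violations on boundary
            if not bad.any():
                return d
            free &= ~bad                              # pin violators at zero

    g = np.array([1.0, -2.0, 0.5])
    print(project_direction(g, np.array([False, False, True])))  # -> [-1.5, 1.5, 0.0]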
Yaguang Yang, Fabio Vitor (2021)
A double pivot algorithm that combines features of two recently published papers by these authors is proposed. The proposed algorithm is implemented in MATLAB. The MATLAB code is tested, along with a MATLAB implementation of Dantzig's algorithm, on several test sets, including a set of cycling LP problems, Klee-Minty problems, randomly generated linear programming (LP) problems, and Netlib benchmark problems. The test results show that the proposed algorithm is (a) degeneracy-tolerant, as expected, and (b) more efficient than Dantzig's algorithm for large randomly generated LP problems, but less efficient in terms of CPU time for Netlib benchmark problems and small randomly generated problems.
A.J. Roberts (2001)
Modern dynamical systems theory has previously had little to say about finite difference and finite element approximations of partial differential equations (Archilla, 1998). However, recently I have shown one way that centre manifold theory may be used to create and support the spatial discretisation of PDEs such as Burgers' equation (Roberts, 1998a) and the Kuramoto-Sivashinsky equation (MacKenzie, 2000). In this paper the geometric view of a centre manifold is used to provide correct initial conditions for numerical discretisations (Roberts, 1997). The derived projection of initial conditions follows from the physical processes expressed in the PDEs and so is appropriately conservative. This rational approach increases the accuracy of forecasts made with finite difference models.
The computational complexity of a problem arising in the context of sparse optimization is considered, namely, the projection onto the set of $k$-cosparse vectors w.r.t. some given matrix $\Omega$. It is shown that this projection problem is (strongly) NP-hard, even in the special cases in which the matrix $\Omega$ contains only ternary or bipolar coefficients. Interestingly, this is in contrast to the projection onto the set of $k$-sparse vectors, which is trivially solved by keeping only the $k$ largest coefficients.
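The contrast mentioned at the end is easy to make concrete: projection onto the set of $k$-sparse vectors is hard thresholding, i.e., keep the $k$ largest-magnitude entries and zero the rest. A minimal sketch:

    import numpy as np

    def project_k_sparse(x, k):
        """Euclidean projection onto {x : ||x||_0 <= k}: keep the k
        largest-magnitude coefficients, zero out everything else."""
        out = np.zeros_like(x)
        idx = np.argpartition(np.abs(x), -k)[-k:]   # indices of k largest |x_i|
        out[idx] = x[idx]
        return out

    print(project_k_sparse(np.array([3.0, -1.0, 0.5, -4.0]), 2))  # [3., 0., 0., -4.]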
Two new optimization techniques based on the projection onto convex sets (POCS) framework for solving convex optimization problems are presented. The dimension of the minimization problem is lifted by one, and sets corresponding to the cost function are defined. If the cost function is a convex function in R^N, the corresponding set is also a convex set in R^{N+1}. The iterative optimization approach starts with an arbitrary initial estimate in R^{N+1}, and an orthogonal projection is performed onto one of the sets in a sequential manner at each step. The method provides globally optimal solutions for total-variation (TV), filtered-variation (FV), L_1, and entropic cost functions. A new denoising algorithm using the TV framework is developed. The new algorithm does not require any regularization parameter adjustment. Simulation examples are presented.
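The lifted epigraph sets in this paper are more elaborate, but the underlying machinery is the classic POCS iteration: cyclically apply orthogonal projections onto the convex sets. A generic sketch with a toy pair of sets (a hyperplane and the nonnegative orthant; all names are illustrative, not the paper's formulation):

    import numpy as np

    def pocs(z, projections, iters=200):
        """Cyclically apply orthogonal projections onto closed convex sets;
        with a nonempty intersection, the iterates converge into it."""
        z = np.asarray(z, dtype=float)
        for _ in range(iters):
            for proj in projections:
                z = proj(z)
        return z

    # toy instance: find a nonnegative point on the hyperplane a.x = b
    a, b = np.array([1.0, 2.0, -1.0]), 3.0
    proj_plane = lambda z: z + (b - a @ z) / (a @ a) * a   # onto {x : a.x = b}
    proj_orthant = lambda z: np.maximum(z, 0.0)            # onto {x : x >= 0}
    print(pocs(np.zeros(3), [proj_plane, proj_orthant]))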