
Projections onto sets are used in a wide variety of methods in optimization theory but not every method that uses projections really belongs to the class of projection methods as we mean it here. Here projection methods are iterative algorithms that use projections onto sets while relying on the general principle that when a family of (usually closed and convex) sets is present then projections (or approximate projections) onto the given individual sets are easier to perform than projections onto other sets (intersections, image sets under some transformation, etc.) that are derived from the given family of individual sets. Projection methods employ projections (or approximate projections) onto convex sets in various ways. They may use different kinds of projections and, sometimes, even use different projections within the same algorithm. They serve to solve a variety of problems which are either of the feasibility or the optimization types. They have different algorithmic structures, of which some are particularly suitable for parallel computing, and they demonstrate nice convergence properties and/or good initial behavior patterns. This class of algorithms has witnessed great progress in recent years and its member algorithms have been applied with success to many scientific, technological, and mathematical problems. This annotated bibliography includes books and review papers on, or related to, projection methods that we know about, use, and like. If you know of books or review papers that should be added to this list please contact us.
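To make the general principle concrete, here is a minimal sketch of the classical alternating (cyclic) projections idea in Python, assuming the individual sets are half-spaces in Euclidean space. The function names and the toy data are illustrative and are not taken from any particular work in this bibliography.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the half-space {y : a . y <= b} (an easy individual set)."""
    violation = a @ x - b
    if violation <= 0:
        return x                          # already inside: the projection is x itself
    return x - (violation / (a @ a)) * a

def alternating_projections(x0, halfspaces, iters=100):
    """Cycle through the easy projections onto the individual sets;
    under standard assumptions the iterates approach the intersection."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        for a, b in halfspaces:
            x = project_halfspace(x, np.asarray(a, dtype=float), float(b))
    return x

# Toy example: intersection of {x1 <= 1} and {x2 <= 0.5}, starting from (3, 2)
print(alternating_projections([3.0, 2.0], [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.5)]))
```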
Yair Censor, Daniel Reem (2014)
The convex feasibility problem (CFP) is at the core of the modeling of many problems in various areas of science. Subgradient projection methods are important tools for solving the CFP because they enable the use of subgradient calculations instead of orthogonal projections onto the individual sets of the problem. Working in a real Hilbert space, we show that the sequential subgradient projection method is perturbation resilient. By this we mean that under appropriate conditions the sequence generated by the method converges weakly, and sometimes also strongly, to a point in the intersection of the given subsets of the feasibility problem, despite certain perturbations which are allowed in each iterative step. Unlike previous works on solving the convex feasibility problem, the involved functions, which induce the feasibility problem's subsets, need not be convex. Instead, we allow them to belong to a wider and richer class of functions satisfying a weaker condition, which we call zero-convexity. This class, which is introduced and discussed here, holds promise for solving optimization problems in various areas, especially in non-smooth and non-convex optimization. The relevance of this study to approximate minimization and to the recent superiorization methodology for constrained optimization is explained.
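As a hedged illustration of the kind of iteration discussed above, the sketch below implements the classical sequential subgradient projection sweep for a CFP with convex constraint functions f_i(x) <= 0. The paper itself works with the wider zero-convex class and allows perturbations, neither of which is modeled here; all names and the toy constraints are illustrative.

```python
import numpy as np

def subgradient_projection_step(x, f, subgrad, relaxation=1.0):
    """One subgradient projection step for the set {y : f(y) <= 0}.
    If f(x) <= 0 the constraint is already satisfied and x is left unchanged."""
    fx = f(x)
    if fx <= 0:
        return x
    g = subgrad(x)
    return x - relaxation * (fx / (g @ g)) * g

def sequential_subgradient_projections(x0, constraints, sweeps=200, relaxation=1.0):
    """Sweep cyclically over the constraints, applying one subgradient
    projection step per constraint in each sweep."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for f, subgrad in constraints:
            x = subgradient_projection_step(x, f, subgrad, relaxation)
    return x

# Toy CFP: f1(x) = ||x||^2 - 1 <= 0 (unit ball) and f2(x) = x[0] - 0.5 <= 0
constraints = [
    (lambda x: x @ x - 1.0, lambda x: 2.0 * x),
    (lambda x: x[0] - 0.5,  lambda x: np.array([1.0, 0.0])),
]
print(sequential_subgradient_projections(np.array([2.0, 2.0]), constraints))
```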
We consider sequential iterative processes for the common fixed point problem of families of cutter operators on a Hilbert space. These are operators that have the property that, for any point x ∈ H, the hyperplane through Tx whose normal is x − Tx always cuts the space into two half-spaces, one of which contains the point x while the other contains the (assumed nonempty) fixed point set of T. We define and study generalized relaxations and extrapolation of cutter operators and construct extrapolated cyclic cutter operators. In this framework we investigate the Dos Santos local acceleration method in a unified manner and adapt it to a composition of cutters. For these we conduct convergence analysis of successive iteration algorithms.
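For orientation, the sketch below uses metric projections onto half-spaces, which are standard examples of cutter operators, verifies the defining cutter inequality numerically at one point, and applies a relaxed cyclic composition. It does not reproduce the paper's extrapolated cyclic cutters or the Dos Santos local acceleration; the relaxation parameter and the data are illustrative.

```python
import numpy as np

def halfspace_projection(a, b):
    """Metric projection onto {y : a . y <= b}; such projections are cutters."""
    a = np.asarray(a, dtype=float)
    def T(x):
        v = a @ x - b
        return x if v <= 0 else x - (v / (a @ a)) * a
    return T

def relaxed_cyclic_cutter(x0, cutters, relaxation=1.0, iters=100):
    """Apply the composition T_m ... T_1 and relax the overall step:
    x <- x + relaxation * (T_m ... T_1 x - x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = x.copy()
        for T in cutters:
            y = T(y)
        x = x + relaxation * (y - x)
    return x

# Two half-spaces; a point z in both fixed point sets verifies the cutter inequality
T1 = halfspace_projection([1.0, 0.0], 1.0)
T2 = halfspace_projection([0.0, 1.0], 0.0)
x = np.array([3.0, 2.0])
z = np.array([0.0, -1.0])                      # z lies in Fix(T1) and Fix(T2)
print((z - T1(x)) @ (x - T1(x)) <= 1e-12)      # cutter inequality for T1 at x
print(relaxed_cyclic_cutter(x, [T1, T2], relaxation=1.0))
```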