We present a gradient-based algorithm for unconstrained minimization derived from an iterated linear change of basis. The new method is equivalent to linear conjugate gradient in the case of a quadratic objective function. With exact line search, it is a secant method. In practice, it performs comparably to BFGS and DFP and is sometimes more robust.
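Since the abstract above claims equivalence to linear conjugate gradient on quadratics, a minimal sketch of that baseline may help. This is standard linear CG for minimizing 0.5 xᵀAx − bᵀx with A symmetric positive definite (equivalently, solving Ax = b); it is not the paper's new method, and the test problem is an illustrative assumption.

```python
import numpy as np

def linear_cg(A, b, x0, tol=1e-10, max_iter=100):
    """Linear conjugate gradient for min 0.5 x^T A x - b^T x (A SPD),
    i.e. for solving A x = b."""
    x = x0.astype(float).copy()
    r = b - A @ x          # residual = negative gradient of the quadratic
    p = r.copy()           # first search direction is steepest descent
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)   # exact line search step along p
        x += alpha * p
        r -= alpha * Ap
        beta = (r @ r) / rr     # standard CG update coefficient
        p = r + beta * p        # new A-conjugate direction
    return x

# Illustrative 2x2 SPD system (assumption, not from the paper)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = linear_cg(A, b, np.zeros(2))
```

For an n-dimensional quadratic, CG with exact arithmetic terminates in at most n iterations, which is the behavior the new method is claimed to reproduce.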
This paper describes an extension of the BFGS and L-BFGS methods for the minimization of a nonlinear function subject to errors. This work is motivated by applications that contain computational noise, employ low-precision arithmetic, or are subject
Quadratic Unconstrained Binary Optimization models are useful for solving a diverse range of optimization problems. Constraints can be added by incorporating quadratic penalty terms into the objective, often with the introduction of slack variables n
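The penalty mechanism described above can be sketched on a toy problem. The instance below is an assumption chosen for illustration: minimize a linear cost over binary variables subject to a one-hot equality constraint, which is folded into the objective as a quadratic penalty and solved by brute force.

```python
import itertools

# Hypothetical toy instance: minimize x0 + 2*x1 + 3*x2 over binary x,
# subject to x0 + x1 + x2 == 1. The equality constraint is encoded as
# a quadratic penalty P * (x0 + x1 + x2 - 1)**2 added to the objective,
# yielding an unconstrained QUBO. (An inequality constraint would first
# be converted to an equality by introducing binary slack variables.)
P = 10.0  # penalty weight; must dominate the objective's scale

def qubo_energy(x):
    objective = 1 * x[0] + 2 * x[1] + 3 * x[2]
    penalty = P * (x[0] + x[1] + x[2] - 1) ** 2
    return objective + penalty

# Brute-force enumeration of all 2^3 binary assignments
best = min(itertools.product([0, 1], repeat=3), key=qubo_energy)
# best == (1, 0, 0): the cheapest choice that satisfies the constraint
```

Any assignment violating the constraint pays at least P, so for P large enough the unconstrained minimizer coincides with the constrained one.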
We consider minimization of functions that are compositions of convex or prox-regular functions (possibly extended-valued) with smooth vector functions. A wide variety of important optimization problems fall into this framework. We describe an algorithm
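A minimal sketch of the simplest instance of this composite framework is the sum of a smooth least-squares term and a convex nonsmooth ℓ1 term, handled by proximal gradient iterations. This is an illustrative baseline under those assumptions, not the algorithm the abstract describes; the problem data in the test are hypothetical.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_gradient_lasso(A, b, lam, step, n_iter=500):
    """Proximal gradient on f(x) = 0.5 * ||A x - b||^2 + lam * ||x||_1:
    a gradient step on the smooth part, then the prox of the nonsmooth part."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

With A the identity, the iteration converges to the closed-form solution soft_threshold(b, lam), which makes the scheme easy to sanity-check.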
The travelling salesman problem (TSP) of space trajectory design is complicated by the complex structure of its design space. Graph-based tree search and stochastic-seeding combinatorial approaches are commonly employed to tackle the time-dependent TSP
Although application examples of multilevel optimization have been discussed since the 1990s, the development of solution methods was largely limited to bilevel cases due to the difficulty of the problem. In recent years, in machine learning, Fr