
Finding Zeros of Hölder Metrically Subregular Mappings via Globally Convergent Levenberg-Marquardt Methods

Publication date: 2018
Field: Biology
Language: English





We present two globally convergent Levenberg-Marquardt methods for finding zeros of Hölder metrically subregular mappings that may have non-isolated zeros. The first method combines the Levenberg-Marquardt direction with an Armijo-type line search, while the second incorporates this direction into a nonmonotone trust-region technique. For both methods, we prove global convergence to a first-order stationary point of the associated merit function. Furthermore, the worst-case global complexity of these methods is provided, indicating that an approximate stationary point can be computed in at most $\mathcal{O}(\varepsilon^{-2})$ function and gradient evaluations, for an accuracy parameter $\varepsilon>0$. We also study the conditions under which the proposed methods converge to a zero of the associated mappings. Computing a moiety conserved steady state for biochemical reaction networks can be cast as the problem of finding a zero of a Hölder metrically subregular mapping. We report encouraging numerical results for finding a zero of such mappings derived from real-world biological data, which support our theoretical foundations.
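To make the first method's ingredients concrete, the sketch below combines a damped Levenberg-Marquardt direction with an Armijo-type backtracking line search on the merit function $\psi(x)=\tfrac{1}{2}\|h(x)\|^2$. The names `h` and `jac`, the residual-based damping rule, and all constants are illustrative assumptions, not the exact parameter choices analyzed in the paper.

```python
import numpy as np

def lm_armijo(h, jac, x0, mu0=1e-2, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=200):
    """Sketch: Levenberg-Marquardt direction + Armijo backtracking on
    psi(x) = 0.5 * ||h(x)||^2.  Damping rule and constants are illustrative."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = h(x), jac(x)
        g = J.T @ r                          # gradient of the merit function
        if np.linalg.norm(g) <= tol:         # approximate first-order stationarity
            break
        mu = mu0 * np.linalg.norm(r)         # damping tied to the residual size
        d = np.linalg.solve(J.T @ J + mu * np.eye(x.size), -g)   # LM direction
        # Armijo-type backtracking line search on psi
        t, psi = 1.0, 0.5 * (r @ r)
        while t > 1e-12 and 0.5 * np.linalg.norm(h(x + t * d)) ** 2 > psi + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x
```

Tying the damping parameter to the residual norm is one common heuristic; the second method described in the abstract replaces the line search with a nonmonotone trust-region acceptance test.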




Related research

The phase retrieval problem, where one aims to recover a complex-valued image from far-field intensity measurements, is a classic problem encountered in a range of imaging applications. Modern phase retrieval approaches usually rely on gradient descent methods in a nonlinear minimization framework. Calculating closed-form gradients for use in these methods is tedious work, and formulating second order derivatives is even more laborious. Additionally, second order techniques often require the storage and inversion of large matrices of partial derivatives, with memory requirements that can be prohibitive for data-rich imaging modalities. We use a reverse-mode automatic differentiation (AD) framework to implement an efficient matrix-free version of the Levenberg-Marquardt (LM) algorithm, a longstanding method that finds popular use in nonlinear least-squares minimization problems but which has seen little use in phase retrieval. Furthermore, we extend the basic LM algorithm so that it can be applied for general constrained optimization problems beyond just least-squares applications. Since we use AD, we only need to specify the physics-based forward model for a specific imaging application; the derivative terms are calculated automatically through matrix-vector products, without explicitly forming any large Jacobian or Gauss-Newton matrices. We demonstrate that this algorithm can be used to solve both the unconstrained ptychographic object retrieval problem and the constrained blind ptychographic object and probe retrieval problems, under both the Gaussian and Poisson noise models, and that this method outperforms best-in-class first-order ptychographic reconstruction methods: it provides excellent convergence guarantees with (in many cases) a superlinear rate of convergence, all with a computational cost comparable to, or lower than, the tested first-order algorithms.
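As a rough illustration of the matrix-free idea, the sketch below performs damped Gauss-Newton/LM steps in which the normal equations are solved by conjugate gradients using only Jacobian-vector and vector-Jacobian products. The callables `jvp` and `vjp` are assumed to be supplied (in the AD setting they would come from forward- and reverse-mode differentiation of the forward model); the fixed damping `mu` and the bare inner CG loop are simplifications, not the authors' implementation.

```python
import numpy as np

def lm_matrix_free(residual, jvp, vjp, x0, mu=1e-3, cg_iters=50, max_iter=100, tol=1e-8):
    """Matrix-free LM sketch: jvp(x, v) ~ J(x) @ v and vjp(x, u) ~ J(x).T @ u.
    The system (J^T J + mu I) d = -J^T r is solved by conjugate gradients,
    so no Jacobian or Gauss-Newton matrix is ever formed explicitly."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        g = vjp(x, r)                        # J^T r, gradient of 0.5*||r||^2
        if np.linalg.norm(g) <= tol:
            break
        d, res = np.zeros_like(x), -g        # CG for (J^T J + mu I) d = -g
        p, rs_old = res.copy(), res @ res
        for _ in range(cg_iters):
            Ap = vjp(x, jvp(x, p)) + mu * p
            alpha = rs_old / (p @ Ap)
            d, res = d + alpha * p, res - alpha * Ap
            rs_new = res @ res
            if np.sqrt(rs_new) < 1e-12:
                break
            p, rs_old = res + (rs_new / rs_old) * p, rs_new
        x = x + d    # a full implementation would adapt mu and test for decrease
    return x
```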
The paper proposes and justifies a new algorithm of the proximal Newton type to solve a broad class of nonsmooth composite convex optimization problems without strong convexity assumptions. Based on advanced notions and techniques of variational analysis, we establish implementable results on the global convergence of the proposed algorithm as well as its local convergence with superlinear and quadratic rates. For certain structural problems, the obtained local convergence conditions do not require the local Lipschitz continuity of the corresponding Hessian mappings, which is a crucial assumption used in the literature to ensure superlinear convergence of other algorithms of the proximal Newton type. Numerical experiments on solving the $l_1$-regularized logistic regression model illustrate the applicability of the proposed algorithm to practically important problems.
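For intuition, a minimal proximal Newton iteration for the $l_1$-regularized case looks roughly like the sketch below: each outer step builds a quadratic model from the gradient and Hessian of the smooth part and solves the model-plus-$l_1$ subproblem approximately with proximal-gradient (soft-thresholding) steps. The names `grad`, `hess`, and `lam` and the inner ISTA loop are assumptions for illustration; the paper's globalization and regularization safeguards are omitted.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_newton_l1(grad, hess, lam, x0, inner_iters=100, outer_iters=50, tol=1e-8):
    """Sketch: proximal Newton for f(x) + lam * ||x||_1.  The subproblem
    min_z  grad(x)^T (z - x) + 0.5 (z - x)^T hess(x) (z - x) + lam * ||z||_1
    is solved approximately by proximal-gradient (ISTA) steps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        g, H = grad(x), hess(x)
        L = max(np.linalg.norm(H, 2), 1e-12)     # model step size 1/L
        z = x.copy()
        for _ in range(inner_iters):
            model_grad = g + H @ (z - x)
            z = soft_threshold(z - model_grad / L, lam / L)
        if np.linalg.norm(z - x) <= tol * max(1.0, np.linalg.norm(x)):
            return z
        x = z
    return x
```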
We study the finite convergence of iterative methods for solving convex feasibility problems. Our key assumptions are that the interior of the solution set is nonempty and that certain overrelaxation parameters converge to zero, but with a rate slower than any geometric sequence. Unlike other works in this area, which require divergent series of overrelaxations, our approach allows us to consider some summable series. By employing quasi-Fejérian analysis in the latter case, we obtain additional asymptotic convergence guarantees, even when the interior of the solution set is empty.
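As a hypothetical illustration of this setting, the sketch below cycles through metric projections onto the individual sets and overrelaxes each step by a parameter $\varepsilon_k = 1/(k+1)^2$, which is summable yet decays more slowly than any geometric sequence. The specific relaxation rule used here is an assumption, not the paper's exact method.

```python
import numpy as np

def overrelaxed_projections(projections, x0, eps=lambda k: 1.0 / (k + 1) ** 2,
                            max_iter=1000, tol=1e-10):
    """Hypothetical cyclic projection method with vanishing overrelaxations.
    projections: list of callables, each the metric projection onto one convex set.
    The step x <- x + (2 + eps(k)) * (P(x) - x) pushes slightly past the
    reflection; this rule is an illustrative assumption."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        if all(np.linalg.norm(P(x) - x) <= tol for P in projections):
            break                             # x is (numerically) in every set
        P = projections[k % len(projections)]
        x = x + (2.0 + eps(k)) * (P(x) - x)
    return x
```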
The inverse problem in Acousto-Electric tomography concerns the reconstruction of the electric conductivity in a domain from knowledge of the power density function in the interior of the body. This interior power density results from currents prescribed at boundary electrodes, and it can be obtained through electrostatic boundary measurements together with auxiliary acoustic measurements. In Electrical Impedance Tomography, the complete electrode model is known to be the most accurate model for the forward modelling. In this paper, the reconstruction problem of Acousto-Electric tomography is posed using the (smooth) complete electrode model, and a Levenberg-Marquardt iteration is formulated in appropriate function spaces. This results in a system of partial differential equations to be solved in each iteration. To increase the computational efficiency and stability, a strategy based on both the complete electrode model and the continuum model with Dirichlet boundary condition is proposed. The system of equations is implemented numerically for a two-dimensional scenario and the algorithm is tested on two different numerical phantoms, a heart and lung model and a human brain model. Several numerical experiments are carried out confirming the feasibility, accuracy and stability of the methods.
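Once discretized, one step of such a Levenberg-Marquardt iteration amounts to a regularized linearized least-squares update. The sketch below assumes callables `forward` (the discretized map from conductivity to interior power density) and `jacobian` (its discretized derivative), both standing in for the PDE solves described above; `alpha` is an assumed regularization weight.

```python
import numpy as np

def lm_update(sigma, forward, jacobian, data, alpha=1e-2):
    """One discretized Levenberg-Marquardt update for the reconstruction
    problem: linearize F(sigma) ~ data and solve the regularized normal
    equations.  forward/jacobian are placeholders for the PDE solves."""
    r = forward(sigma) - data                      # data misfit
    J = jacobian(sigma)                            # discretized derivative of F
    delta = np.linalg.solve(J.T @ J + alpha * np.eye(sigma.size), -J.T @ r)
    return sigma + delta
```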
Implementations in R of classical general-purpose algorithms generally have two major limitations which make them unusable in complex problems: too loose convergence criteria and too long calculation time. By relying on a Marquardt-Levenberg algorithm (MLA), a Newton-like method particularly robust for solving local optimization problems, we provide, with the marqLevAlg package, an efficient and general-purpose local optimizer which (i) prevents convergence to saddle points by using a stringent convergence criterion based on the relative distance to the minimum/maximum, in addition to the stability of the parameters and of the objective function; and (ii) reduces the computation time in complex settings by allowing parallel calculations at each iteration. We demonstrate through a variety of cases from the literature that our implementation reliably and consistently reaches the optimum (even when other optimizers fail), and also largely reduces computational time in complex settings, as illustrated by maximum likelihood estimation of several sophisticated statistical models.
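For orientation, a relative-distance-to-minimum style criterion weighs the first derivatives by the inverse Hessian and scales by the number of parameters; a minimal sketch is given below. The exact definition and thresholds used by the package may differ, and `grad`/`hess` are assumed inputs supplied by the user.

```python
import numpy as np

def relative_distance_to_minimum(grad, hess):
    """Sketch of a relative-distance-to-minimum style convergence criterion:
    g^T H^{-1} g / n_parameters, compared against a small threshold."""
    g = np.asarray(grad, dtype=float)
    H = np.asarray(hess, dtype=float)
    return float(g @ np.linalg.solve(H, g)) / g.size
```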