
A Globally Convergent Inexact Newton-Like Cayley Transform Method for Inverse Eigenvalue Problems

Posted by Yonghui Ling
Publication date: 2013
Language: English





We propose an inexact Newton method for solving inverse eigenvalue problems (IEPs). The method is globalized by employing the classical backtracking technique. We provide a global convergence analysis and prove an R-order convergence property under mild assumptions. Numerical examples demonstrate that the proposed method is very effective for solving IEPs with distinct eigenvalues.
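To make the globalization strategy concrete, here is a minimal sketch of an inexact Newton iteration with backtracking for a parameterized IEP. The affine family A(c) = A0 + Σᵢ cᵢ Aᵢ, the eigenvalue residual F(c) = λ(A(c)) − λ*, and all tolerances are illustrative assumptions; the paper's method additionally uses a Cayley transform update, which this sketch omits.

```python
# A minimal sketch of an inexact Newton method with backtracking for an
# inverse eigenvalue problem. The setup A(c) = A0 + sum_i c_i * A_i and the
# constants below are assumptions for illustration, not the paper's method.
import numpy as np

def eig_residual(c, A0, As, lam_target):
    """F(c) = eigenvalues of A(c) (ascending) minus the prescribed ones."""
    A = A0 + sum(ci * Ai for ci, Ai in zip(c, As))
    lam, Q = np.linalg.eigh(A)          # lam_target must also be ascending
    return lam - lam_target, Q

def jacobian(Q, As):
    """J[k, i] = q_k^T A_i q_k, the derivative of a simple eigenvalue."""
    return np.array([[Q[:, k] @ Ai @ Q[:, k] for Ai in As]
                     for k in range(Q.shape[1])])

def inexact_newton_iep(c, A0, As, lam_target, tol=1e-10, max_iter=100):
    F, Q = eig_residual(c, A0, As, lam_target)
    for _ in range(max_iter):
        if np.linalg.norm(F) <= tol:
            break
        J = jacobian(Q, As)
        # For clarity we solve exactly; the inexact variant would stop a
        # Krylov solver once the relative linear residual is below eta_k.
        s = np.linalg.solve(J, -F)
        # Classical backtracking: accept t once ||F|| decreases sufficiently.
        t, normF = 1.0, np.linalg.norm(F)
        while t > 1e-12:
            F_new, Q_new = eig_residual(c + t * s, A0, As, lam_target)
            if np.linalg.norm(F_new) <= (1 - 1e-4 * t) * normF:
                break
            t *= 0.5
        c, F, Q = c + t * s, F_new, Q_new
    return c
```

The Armijo-type acceptance test on ‖F‖ is what the backtracking globalization enforces; everything else is standard Newton machinery.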




Read also

Bahman Kalantari, 2020
Newton's method for polynomial root finding is one of mathematics' most well-known algorithms. The method also has its shortcomings: it is undefined at critical points, it can exhibit chaotic behavior, and it is only guaranteed to converge locally. Based on the Geometric Modulus Principle for a complex polynomial $p(z)$, together with a Modulus Reduction Theorem proved here, we develop the Robust Newton's method (RNM), defined everywhere with a step size that guarantees an a priori reduction in polynomial modulus in each iteration. Furthermore, we prove that RNM iterates converge globally, either to a root or to a critical point. Specifically, given $\varepsilon$ and any seed $z_0$, in $t = O(1/\varepsilon^{2})$ iterations of RNM, independent of the degree of $p(z)$, either $|p(z_t)| \leq \varepsilon$ or $|p'(z_t)\, p''(z_t)| \leq \varepsilon$. By adjusting the iterates at near-critical points, we describe a modified RNM that necessarily converges to a root. In combination with Smale's point estimation, RNM results in a globally convergent Newton's method having a locally quadratic rate. We present sample polynomiographs that demonstrate how, in contrast with Newton's method, RNM smooths out the fractal boundaries of the basins of attraction of roots. RNM also shows potential for computing all roots of arbitrary-degree polynomials. A particular consequence of RNM is a simple algorithm for solving cubic equations.
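As a rough illustration of the modulus-reduction idea, the sketch below takes the Newton direction but shrinks the step until $|p(z)|$ strictly decreases. The actual RNM step size is derived from the Geometric Modulus Principle and is well defined even at critical points; the simple halving loop and the stopping thresholds here are assumptions for illustration only.

```python
# A hedged sketch of damped Newton with guaranteed modulus reduction.
# RNM's true step size comes from the Geometric Modulus Principle; plain
# halving is a simplification, and near-critical points RNM adjusts iterates
# differently than the bailout used here.
import numpy as np

def damped_newton_root(coeffs, z0, eps=1e-12, max_iter=10_000):
    p = np.polynomial.Polynomial(coeffs)   # coefficients, low degree first
    dp = p.deriv()
    z = complex(z0)
    for _ in range(max_iter):
        pz, dpz = p(z), dp(z)
        if abs(pz) <= eps:                 # approximate root found
            return z
        if abs(dpz) <= eps:                # stalled at a near-critical point
            return z
        step = -pz / dpz                   # Newton direction
        t = 1.0
        while abs(p(z + t * step)) >= abs(pz) and t > 1e-16:
            t *= 0.5                       # enforce modulus reduction
        z += t * step
    return z
```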
The paper proposes and justifies a new algorithm of the proximal Newton type to solve a broad class of nonsmooth composite convex optimization problems without strong convexity assumptions. Based on advanced notions and techniques of variational analysis, we establish implementable results on the global convergence of the proposed algorithm as well as its local convergence with superlinear and quadratic rates. For certain structural problems, the obtained local convergence conditions do not require the local Lipschitz continuity of the corresponding Hessian mappings, which is a crucial assumption used in the literature to ensure superlinear convergence of other algorithms of the proximal Newton type. Numerical experiments on the $\ell_1$-regularized logistic regression model illustrate the applicability of the proposed algorithm to practically important problems.
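For a sense of what a single proximal Newton iteration looks like on the $\ell_1$-regularized logistic regression model mentioned above, here is a hedged sketch in which the scaled proximal subproblem is solved inexactly by proximal-gradient (ISTA) passes. The loss scaling, damping term, and inner solver are assumptions; the paper's algorithm and its convergence safeguards are more elaborate than this illustration.

```python
# A minimal sketch of one proximal Newton step for min_x f(x) + mu*||x||_1
# with f the (averaged) logistic loss. Subproblem solved by ISTA; all
# constants are illustrative assumptions.
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: the proximal operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_step(x, A, y, mu, inner_iters=50):
    # f(x) = (1/n) sum_i log(1 + exp(-y_i a_i^T x)), labels y_i in {-1, +1}
    n = len(y)
    sig = 1.0 / (1.0 + np.exp(y * (A @ x)))      # sigma(-y_i a_i^T x)
    grad = -(A.T @ (y * sig)) / n
    w = sig * (1.0 - sig)                        # Hessian weights
    H = (A.T * w) @ A / n + 1e-8 * np.eye(A.shape[1])  # damped Hessian
    # Subproblem: min_s grad^T s + 0.5 s^T H s + mu*||x + s||_1, via ISTA.
    L = np.linalg.norm(H, 2)                     # Lipschitz constant of model
    s = np.zeros_like(x)
    for _ in range(inner_iters):
        g_model = grad + H @ s
        s = prox_l1(x + s - g_model / L, mu / L) - x
    return x + s
```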
A novel orthogonalization-free method, together with two specific algorithms, is proposed to solve extreme eigenvalue problems. On top of gradient-based algorithms, the proposed algorithms modify the multi-column gradient so that earlier columns are decoupled from later ones. Global convergence to eigenvectors, instead of to an eigenspace, is guaranteed almost surely. Locally, the algorithms converge linearly with a convergence rate depending on the eigengaps. Momentum acceleration, exact line search, and column locking are incorporated to further accelerate both algorithms and reduce their computational cost. We demonstrate the efficiency of both algorithms on several random matrices with different spectrum distributions and on matrices from computational chemistry.
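The column-decoupling idea can be illustrated with a triangularized Oja-style update (akin to Sanger's generalized Hebbian rule) in which column i of the iterate only interacts with columns 1 through i, so individual eigenvectors rather than an invariant subspace are recovered without any explicit orthogonalization. The specific update rule and step-size heuristic below are modeled assumptions, not the paper's two algorithms.

```python
# A hedged toy of orthogonalization-free column decoupling: the triu() keeps
# only couplings from earlier columns, so each column converges to its own
# eigenvector. Assumes symmetric A with positive, distinct top eigenvalues
# (shift A by c*I if needed); not the paper's exact algorithms.
import numpy as np

def triangularized_oja(A, k, step=None, iters=5000, seed=0):
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = 0.1 * rng.standard_normal((n, k))
    if step is None:
        step = 0.5 / np.linalg.norm(A, 2)   # crude step-size heuristic
    for _ in range(iters):
        S = np.triu(X.T @ A @ X)            # column i sees columns 1..i only
        X += step * (A @ X - X @ S)         # no explicit orthogonalization
    return X                                # columns ~ top-k eigenvectors
```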
We present a local convergence analysis of inexact Newton-like methods for solving nonlinear equations under majorant conditions. This analysis provides an estimate of the convergence radius and a clear relationship between the majorant function, which relaxes the Lipschitz continuity of the derivative, and the nonlinear operator under consideration. It also allows us to obtain some important special cases.
We prove that, under semi-local assumptions, the inexact Newton method with a fixed relative residual error tolerance converges Q-linearly to a zero of the nonlinear operator under consideration. Using this result, we show that Newton's method for minimizing a self-concordant function, or for finding a zero of an analytic function, can be implemented with a fixed relative residual error tolerance. In the absence of errors, our analysis recovers the classical Kantorovich theorem on Newton's method.
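A minimal sketch of the fixed-relative-tolerance scheme: each Newton system J(x)s = −F(x) is solved only until the linear residual drops below η‖F(x)‖, with the same η at every outer iteration. The inner Landweber-type solver below stands in for any Krylov method, and F, J, and η are illustrative assumptions.

```python
# Inexact Newton with a *fixed* relative residual tolerance eta: stop the
# inner linear solve as soon as ||J s + F|| <= eta * ||F||. The Landweber
# iteration is a stand-in for GMRES/CG; all constants are illustrative.
import numpy as np

def inexact_solve(J, b, eta, max_inner=10_000):
    """Return s with ||J s - b|| <= eta * ||b|| via Landweber iteration."""
    s = np.zeros_like(b)
    beta = 1.0 / np.linalg.norm(J, 2) ** 2   # safe step for s -= beta*J^T r
    for _ in range(max_inner):
        r = J @ s - b
        if np.linalg.norm(r) <= eta * np.linalg.norm(b):
            break
        s -= beta * (J.T @ r)
    return s

def inexact_newton(F, J, x, eta=0.1, tol=1e-8, max_iter=50):
    """F: residual map, J: Jacobian map; eta is fixed across iterations."""
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        x = x + inexact_solve(J(x), -Fx, eta)
    return x
```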