
Toward Efficient and Stable Polynomial Preconditioning for GMRES

Added by Jennifer Loe
Publication date: 2019
Research language: English





We present a polynomial preconditioner for solving large systems of linear equations. The polynomial is derived from the minimum residual polynomial and is straightforward to compute and implement. In this paper, we study the polynomial preconditioner applied to GMRES; however, it could be used with any Krylov solver. Stability control using added roots allows for high-degree polynomials. We discuss the effectiveness and challenges of root-adding and give an additional check for stability. This polynomial preconditioning algorithm can dramatically improve convergence for difficult problems and can reduce dot products by an even greater margin.
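
The sketch below illustrates the general construction rather than the paper's implementation: the degree-d GMRES minimum residual polynomial is represented by its roots (harmonic Ritz values from a short Arnoldi run on a random vector), and p(A) is applied in factored form as a preconditioner for SciPy's GMRES. The helper names, the diagonally dominant test matrix, and the degree d = 10 are illustrative assumptions; the paper's stability controls (root ordering and added roots) are not reproduced here.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, gmres

def arnoldi_hessenberg(A, v, d):
    # d steps of Arnoldi with modified Gram-Schmidt; returns the (d+1) x d Hessenberg matrix.
    n = v.size
    V = np.zeros((n, d + 1))
    H = np.zeros((d + 1, d))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(d):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return H

def residual_poly_roots(A, n, d, seed=0):
    # Harmonic Ritz values from d Arnoldi steps on a random vector: these are the
    # roots of the degree-d GMRES (minimum residual) polynomial for that vector.
    H = arnoldi_hessenberg(A, np.random.default_rng(seed).standard_normal(n), d)
    Hd, h = H[:d, :d], H[d, d - 1]
    ed = np.zeros(d); ed[-1] = 1.0
    f = np.linalg.solve(Hd.T, ed)
    return np.linalg.eigvals(Hd + h**2 * np.outer(f, ed))

def polynomial_preconditioner(A, theta):
    # Apply y = p(A) v, where 1 - z*p(z) = prod_k (1 - z/theta_k) is the residual polynomial.
    def matvec(v):
        y = np.zeros(v.size, dtype=complex)
        prod = v.astype(complex)
        for t in theta:  # no root reordering or added roots here; adequate for modest degrees
            y += prod / t
            prod -= (A @ prod) / t
        return y.real
    return matvec

n, d = 400, 10
A = diags([-1.0, 3.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()  # diagonally dominant test matrix
M = LinearOperator((n, n), matvec=polynomial_preconditioner(A, residual_poly_roots(A, n, d)))
b = np.ones(n)
x, info = gmres(A, b, M=M)  # GMRES with p(A) supplied as the preconditioner operator
print(info, np.linalg.norm(b - A @ x))
```

As the degree grows, the ordering of the roots and the duplication (adding) of selected roots become essential for keeping this product form numerically stable, which is the issue the paper's added-roots strategy and additional stability check address.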



Related Research

Kirk M. Soodhalter (2014)
We present two minimum residual methods for solving sequences of shifted linear systems, the right-preconditioned shifted GMRES and shifted recycled GMRES algorithms which use a seed projection strategy often employed to solve multiple related problems. These methods are compatible with general preconditioning of all systems, and when restricted to right preconditioning, require no extra applications of the operator or preconditioner. These seed projection methods perform a minimum residual iteration for the base system while improving the approximations for the shifted systems at little additional cost. The iteration continues until the base system approximation is of satisfactory quality. The method is then recursively called for the remaining unconverged systems. We present both methods inside of a general framework which allows these techniques to be extended to the setting of flexible preconditioning and inexact Krylov methods. We present some analysis of such methods and numerical experiments demonstrating the effectiveness of the algorithms we have derived.
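
As a rough illustration of the shift invariance these methods exploit, the sketch below (assumed names and test data; no restarting, preconditioning, recycling, or seed projection) builds one Arnoldi basis for K_m(A, b) and reuses it to solve the projected GMRES least-squares problem for several shifts sigma of (A + sigma*I) x = b.

```python
import numpy as np
from scipy.sparse import diags

def arnoldi(A, b, m):
    # m Arnoldi steps: A V_m = V_{m+1} Hbar, with Hbar the (m+1) x m Hessenberg matrix.
    n = b.size
    V = np.zeros((n, m + 1)); Hbar = np.zeros((m + 1, m))
    beta = np.linalg.norm(b); V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            Hbar[i, j] = V[:, i] @ w
            w -= Hbar[i, j] * V[:, i]
        Hbar[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / Hbar[j + 1, j]
    return V, Hbar, beta

n, m = 500, 40
A = diags([-1.0, 3.0, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()
b = np.ones(n)
V, Hbar, beta = arnoldi(A, b, m)                # one Krylov basis serves every shift
Ibar = np.vstack([np.eye(m), np.zeros((1, m))])
e1 = np.zeros(m + 1); e1[0] = beta
for sigma in [0.0, 0.5, 2.0]:
    # GMRES for (A + sigma*I) x = b over the same subspace: only the small problem changes.
    y, *_ = np.linalg.lstsq(Hbar + sigma * Ibar, e1, rcond=None)
    x = V[:, :m] @ y
    print(sigma, np.linalg.norm(b - (A @ x + sigma * x)) / beta)
```

The complications the paper addresses arise precisely when this invariance is broken, for example by restarting or by preconditioning each shifted system differently.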
Polynomial preconditioning with the GMRES minimal residual polynomial has the potential to greatly reduce orthogonalization costs, making it useful for communication reduction. We implement polynomial preconditioning in the Belos package from Trilinos and show how it can be effective in both serial and parallel implementations. We further show that it is a communication-avoiding technique and a viable alternative to CA-GMRES for large-scale parallel computing.
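
A back-of-the-envelope accounting (with assumed, illustrative numbers) of why this reduces communication: the dot products in GMRES come from orthogonalization, whose cost grows with the basis size, while applying a degree-d polynomial preconditioner adds only matrix-vector products, which require no global reductions.

```python
# Rough dot-product count for one GMRES(m) cycle: iteration j orthogonalizes the new
# vector against j basis vectors plus one norm, so a cycle costs about m*(m+3)/2 dots.
def dots_per_cycle(m):
    return m * (m + 3) // 2

matvecs, d, m = 600, 9, 50
plain_cycles = matvecs // m               # ~1 matvec per unpreconditioned iteration
poly_cycles = matvecs // ((d + 1) * m)    # ~d+1 matvecs per polynomial-preconditioned iteration
print(plain_cycles * dots_per_cycle(m), max(poly_cycles, 1) * dots_per_cycle(m))
```

This assumes, optimistically, that the preconditioned and unpreconditioned runs need a comparable total number of matrix-vector products; the reduction actually observed depends on the problem and the polynomial degree.
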
GMRES is one of the most popular iterative methods for the solution of large linear systems of equations that arise from the discretization of linear well-posed problems, such as Dirichlet boundary value problems for elliptic partial differential equations. The method is also applied to iteratively solve linear systems of equations that are obtained by discretizing linear ill-posed problems, such as many inverse problems. However, GMRES does not always perform well when applied to the latter kind of problems. This paper seeks to shed some light on reasons for the poor performance of GMRES in certain situations, and discusses some remedies based on specific kinds of preconditioning. The standard implementation of GMRES is based on the Arnoldi process, which also can be used to define a solution subspace for Tikhonov or TSVD regularization, giving rise to the Arnoldi-Tikhonov and Arnoldi-TSVD methods, respectively. The performance of the GMRES, the Arnoldi-Tikhonov, and the Arnoldi-TSVD methods is discussed. Numerical examples illustrate properties of these methods.
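
A minimal sketch of the Arnoldi-Tikhonov idea mentioned above, on a synthetic Gaussian-kernel test problem (the test problem, subspace dimension m, and regularization parameter lam are illustrative assumptions; in practice lam is chosen by, e.g., the discrepancy principle, and Arnoldi-TSVD would replace the Tikhonov step by a truncated SVD of the projected matrix):

```python
import numpy as np

def arnoldi(A, b, m):
    # Modified Gram-Schmidt Arnoldi; returns V (n x (m+1)), Hbar ((m+1) x m), beta = ||b||.
    n = b.size
    V = np.zeros((n, m + 1)); Hbar = np.zeros((m + 1, m))
    beta = np.linalg.norm(b); V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            Hbar[i, j] = V[:, i] @ w
            w -= Hbar[i, j] * V[:, i]
        Hbar[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / Hbar[j + 1, j]
    return V, Hbar, beta

def arnoldi_tikhonov(A, b, m, lam):
    # Project onto K_m(A, b), then Tikhonov-regularize the small projected problem:
    # min_y ||Hbar y - beta e1||^2 + lam ||y||^2, written as a stacked least-squares solve.
    V, Hbar, beta = arnoldi(A, b, m)
    e1 = np.zeros(m + 1); e1[0] = beta
    K = np.vstack([Hbar, np.sqrt(lam) * np.eye(m)])
    rhs = np.concatenate([e1, np.zeros(m)])
    y, *_ = np.linalg.lstsq(K, rhs, rcond=None)
    return V[:, :m] @ y

# Discretized Gaussian-kernel integral equation: square but severely ill-conditioned.
n = 200
t = np.linspace(0.0, 1.0, n)
A = np.exp(-((t[:, None] - t[None, :]) / 0.1) ** 2) / n
x_true = np.sin(np.pi * t)
b = A @ x_true + 1e-6 * np.random.default_rng(0).standard_normal(n)
x_reg = arnoldi_tikhonov(A, b, m=12, lam=1e-8)
print(np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true))
```
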
It is well-established that any non-increasing convergence curve is possible for GMRES and a family of pairs $(A,b)$ can be constructed for which GMRES exhibits a given convergence curve with $A$ having arbitrary spectrum. No analog of this result has been established for block GMRES, wherein multiple right-hand sides are considered. By reframing the problem as a single linear system over a ring of square matrices, we develop convergence results for block Arnoldi and block GMRES. In particular, we show what convergence behavior is admissible for block GMRES and how the matrices and right-hand sides producing any admissible behavior can be constructed. Moreover, we show that the convergence of the block Arnoldi method for eigenvalue approximation can be almost fully independent of the convergence of block GMRES for the same coefficient matrix and the same starting vectors.
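
For concreteness, a minimal block Arnoldi / block GMRES sketch for p right-hand sides (zero initial guess, no deflation or breakdown handling; the matrix, block size, and dimensions are illustrative assumptions):

```python
import numpy as np

def block_arnoldi(A, B, m):
    # m block Arnoldi steps with block size p: A V_m = V_{m+1} Hbar (block Hessenberg).
    n, p = B.shape
    V = np.zeros((n, (m + 1) * p))
    Hbar = np.zeros(((m + 1) * p, m * p))
    V[:, :p] = np.linalg.qr(B)[0]
    for j in range(m):
        W = A @ V[:, j * p:(j + 1) * p]
        for i in range(j + 1):
            Vi = V[:, i * p:(i + 1) * p]
            Hbar[i * p:(i + 1) * p, j * p:(j + 1) * p] = Vi.T @ W
            W -= Vi @ Hbar[i * p:(i + 1) * p, j * p:(j + 1) * p]
        Q, R = np.linalg.qr(W)
        V[:, (j + 1) * p:(j + 2) * p] = Q
        Hbar[(j + 1) * p:(j + 2) * p, j * p:(j + 1) * p] = R
    return V, Hbar

rng = np.random.default_rng(1)
n, p, m = 300, 3, 25
A = np.diag(np.linspace(1.0, 10.0, n)) + 0.01 * rng.standard_normal((n, n))
B = rng.standard_normal((n, p))                       # p right-hand sides
V, Hbar = block_arnoldi(A, B, m)
S0 = np.linalg.qr(B)[1]                               # B = V[:, :p] @ S0
E1S0 = np.zeros(((m + 1) * p, p)); E1S0[:p, :] = S0
Y, *_ = np.linalg.lstsq(Hbar, E1S0, rcond=None)       # block GMRES: minimize ||B - A V_m Y||_F
X = V[:, :m * p] @ Y
print(np.linalg.norm(B - A @ X) / np.linalg.norm(B))
```

Ritz values for eigenvalue approximation come from the square part of the same block Hessenberg matrix, which is the setting in which the paper compares the two convergence behaviors.
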
Consider using the right-preconditioned generalized minimal residual (AB-GMRES) method, an efficient method for solving underdetermined least squares problems. Morikuni (Ph.D. thesis, 2013) showed that for some inconsistent and ill-conditioned problems, the iterates of the AB-GMRES method may diverge. This is mainly because the Hessenberg matrix in the GMRES method becomes very ill-conditioned, so that the backward substitution of the resulting triangular system becomes numerically unstable. We propose a stabilized GMRES based on solving the normal equations corresponding to the above triangular system using the standard Cholesky decomposition. This has the effect of shifting upwards the tiny singular values of the Hessenberg matrix which lead to an inaccurate solution. Thus, the process becomes numerically stable and the system becomes consistent, yielding better convergence and a more accurate solution. Numerical experiments show that the proposed method is robust and efficient for solving inconsistent and ill-conditioned underdetermined least squares problems. The method can be considered a way of making GMRES stable for highly ill-conditioned inconsistent problems.
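
The sketch below isolates the stabilization step on a synthetic ill-conditioned upper-triangular system R y = g of the kind produced inside GMRES (the test matrix, noise level, and the explicit machine-precision diagonal shift are assumptions; the shift stands in for the implicit lifting of tiny singular values discussed in the paper):

```python
import numpy as np
from scipy.linalg import solve_triangular, cho_factor, cho_solve

rng = np.random.default_rng(0)
m = 60

# Upper-triangular R with singular values decaying from 1 to 1e-16, mimicking the
# ill-conditioned triangular factor that arises in GMRES on such problems.
U = np.linalg.qr(rng.standard_normal((m, m)))[0]
W = np.linalg.qr(rng.standard_normal((m, m)))[0]
R = np.linalg.qr(U @ np.diag(np.logspace(0.0, -16.0, m)) @ W.T)[1]

y_true = rng.standard_normal(m)
g = R @ y_true + 1e-8 * rng.standard_normal(m)   # small inconsistency / rounding-level noise

# Standard GMRES step: back substitution on R y = g (unstable when R is this ill-conditioned).
y_bs = solve_triangular(R, g)

# Stabilized step: solve the normal equations R^T R y = R^T g by Cholesky; a diagonal
# shift at the machine-precision level keeps the factorization positive definite here.
C = R.T @ R
shift = m * np.finfo(float).eps * np.linalg.norm(C, 2)
c_and_lower = cho_factor(C + shift * np.eye(m))
y_ne = cho_solve(c_and_lower, R.T @ g)

for name, y in [("back substitution", y_bs), ("Cholesky on normal equations", y_ne)]:
    err = np.linalg.norm(y - y_true) / np.linalg.norm(y_true)
    print(f"{name:30s} ||y|| = {np.linalg.norm(y):9.2e}   relative error = {err:9.2e}")
```

Back substitution amplifies the perturbation by roughly 1/sigma_min(R) and returns a huge iterate, while the normal-equations solve behaves like a regularized, consistent solve and stays bounded, which is the mechanism behind the improved convergence and accuracy reported for the stabilized method.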