A Krylov-Schur Algorithm for Large Eigenproblems

@article{Stewart2002AKA,
  title={A Krylov-Schur Algorithm for Large Eigenproblems},
  author={G. W. Stewart},
  journal={SIAM J. Matrix Anal. Appl.},
  year={2002},
  volume={23},
  pages={601--614}
}
  • G. Stewart
  • Published 1 March 2001
  • Computer Science
  • SIAM J. Matrix Anal. Appl.
Sorensen's implicitly restarted Arnoldi algorithm is one of the most successful and flexible methods for finding a few eigenpairs of a large matrix. However, the need to preserve the structure of the Arnoldi decomposition on which the algorithm is based restricts the range of transformations that can be performed on the decomposition. In consequence, it is difficult to deflate converged Ritz vectors from the decomposition. Moreover, the potential forward instability of the implicit QR algorithm… 
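As context for the restarting machinery the abstract describes, here is a minimal, illustrative sketch (not code from the paper) of computing a few eigenpairs of a large sparse matrix with SciPy's `eigs`, which wraps ARPACK's implementation of Sorensen's implicitly restarted Arnoldi method. The test matrix, tolerance, and `k` are arbitrary assumptions for illustration.

```python
# Illustrative only: a few eigenpairs of a large sparse nonsymmetric matrix
# via SciPy's eigs, a wrapper around ARPACK's implicitly restarted Arnoldi.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

n = 1000
# Assumed test operator: a sparse nonsymmetric tridiagonal matrix.
A = sp.diags([-1.0, 2.0, -1.5], [-1, 0, 1], shape=(n, n), format="csr")

# Ask for the 6 eigenvalues of largest magnitude; ARPACK restarts the
# Arnoldi factorization internally until the Ritz pairs converge.
vals, vecs = eigs(A, k=6, which="LM", tol=1e-8)

# Residual check: ||A x - lambda x|| should be small for each Ritz pair.
residuals = [np.linalg.norm(A @ vecs[:, i] - vals[i] * vecs[:, i])
             for i in range(6)]
```

The difficulty the paper addresses — deflating converged Ritz vectors while preserving the Arnoldi structure — happens inside such a routine, invisibly to the caller.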
Citations

A periodic Krylov-Schur algorithm for large matrix products
TLDR
A variant of the Krylov-Schur algorithm suitable for addressing eigenvalue problems associated with products of large and sparse matrices is described, capable of achieving qualitatively better approximations to eigenvalues of small magnitude.
Block Krylov–Schur method for large symmetric eigenvalue problems
TLDR
A block version of the Krylov–Schur algorithm for symmetric eigenproblems, including how to handle rank deficient cases and how to use varying block sizes is developed.
A new framework for implicit restarting of the Krylov–Schur algorithm
TLDR
It is shown that restarting with arbitrary polynomial filter is possible by reassigning some of the eigenvalues of the Rayleigh quotient through a rank‐one correction, implemented using only the elementary transformations of the Krylov decomposition.
An implicit filter for rational Krylov using core transformations
A restarted Induced Dimension Reduction method to approximate eigenpairs of large unsymmetric matrices
The Arnoldi Eigenvalue Iteration with Exact Shifts Can Fail
  • M. Embree
  • Computer Science
    SIAM J. Matrix Anal. Appl.
  • 2009
TLDR
The present note describes a class of examples for which the restarted Arnoldi algorithm fails in the strongest possible sense; that is, the polynomial filter used to restart the iteration deflates the eigenspace one is attempting to compute.
A Chebyshev-Davidson Algorithm for Large Symmetric Eigenproblems
TLDR
A polynomial filtered Davidson-type algorithm is proposed for symmetric eigenproblems, in which the correction equation of the Davidson approach is replaced by a polynomial filtering step, which has the effect of reducing both the number of steps required for convergence and the cost in orthogonalizations and restarts.
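The polynomial-filtering idea behind that TLDR can be sketched in a few lines: apply a Chebyshev polynomial of A, scaled so it is small on an unwanted interval [a, b] and large outside it, to amplify the wanted end of the spectrum in a vector. This is a hedged illustration of the filtering step only, not the paper's full Chebyshev-Davidson algorithm; the matrix, degree, and interval are assumptions.

```python
# Illustrative Chebyshev filter: damp spectral components in [a, b],
# amplify those below a. Not the full Chebyshev-Davidson algorithm.
import numpy as np

def cheb_filter(A, v, m, a, b):
    """Return p_m(A) v, with p_m a degree-m Chebyshev polynomial
    mapped so it stays bounded on [a, b] and grows rapidly below a."""
    e = (b - a) / 2.0          # half-width of the damped interval
    c = (b + a) / 2.0          # its center
    y_prev = v                 # T_0 term
    y = (A @ v - c * v) / e    # T_1 term
    for _ in range(2, m + 1):  # three-term Chebyshev recurrence
        y_new = 2.0 * (A @ y - c * y) / e - y_prev
        y_prev, y = y, y_new
    return y

rng = np.random.default_rng(0)
n = 200
A = np.diag(np.linspace(0.0, 10.0, n))   # known symmetric spectrum in [0, 10]
v = rng.standard_normal(n)

# Damp components in [1, 10]; eigenvectors below 1 come to dominate.
w = cheb_filter(A, v, m=40, a=1.0, b=10.0)
w /= np.linalg.norm(w)
rayleigh = w @ A @ w  # Rayleigh quotient moves toward the low end
```

Since |T_m(t)| is bounded by 1 on [-1, 1] but grows like cosh(m·arccosh|t|) outside it, the unwanted components shrink geometrically in the degree m relative to the wanted ones.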
A Restarted Krylov Subspace Method for the Evaluation of Matrix Functions
TLDR
The Arnoldi approximation to a function of a matrix times a vector can be restarted in a manner analogous to restarted Krylov subspace methods for solving linear systems of equations, and for entire functions the restarted method inherits the superlinear convergence property of its unrestarted counterpart.
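The unrestarted building block of that method is the Krylov approximation f(A)v ≈ ‖v‖ V_m f(H_m) e_1, shown below for f = exp. This is a hedged sketch of the projection step only (without the paper's restarting); the test matrix and subspace dimension are assumptions.

```python
# Illustrative Krylov approximation of exp(A) @ v via an Arnoldi
# factorization A V_m = V_m H_m + residual; no restarting.
import numpy as np
from scipy.linalg import expm

def arnoldi_expm(A, v, m):
    """Approximate exp(A) @ v from an m-dimensional Krylov subspace."""
    n = len(v)
    beta = np.linalg.norm(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / beta
    for j in range(m):               # Arnoldi, modified Gram-Schmidt
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    # Project: f(A) v ~= beta * V_m * exp(H_m) * e_1
    return beta * V[:, :m] @ expm(H[:m, :m])[:, 0]

rng = np.random.default_rng(1)
n = 100
A = -np.diag(np.linspace(0.1, 4.0, n))   # assumed stable test matrix
v = rng.standard_normal(n)

approx = arnoldi_expm(A, v, m=30)
exact = expm(A) @ v
err = np.linalg.norm(approx - exact)
```

The restarted variant in the cited paper reuses information from H_m across cycles so the basis V never needs more than a fixed number of columns.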
...

References

SHOWING 1-10 OF 16 REFERENCES
On restarting the Arnoldi method for large nonsymmetric eigenvalue problems
TLDR
It is shown why Sorensen's implicit QR approach is generally far superior to the others and how it combines Ritz vectors in precisely the right way to form an effective new starting vector.
Jacobi-Davidson Style QR and QZ Algorithms for the Reduction of Matrix Pencils
TLDR
Two algorithms are presented, JDQZ for the generalized eigenproblem and JDQR for the standard eigenproblem, both based on the iterative construction of a (generalized) partial Schur form; they are suitable for the efficient computation of several eigenvalues and the corresponding eigenvectors near a user-specified target value in the complex plane.
GMRES with Deflated Restarting
  • R. Morgan
  • Computer Science
    SIAM J. Sci. Comput.
  • 2002
TLDR
The deflation of small eigenvalues can greatly improve the convergence of restarted GMRES and it is demonstrated that using harmonic Ritz vectors is important because then the whole subspace is a Krylov subspace that contains certain important smaller subspaces.
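For contrast with Morgan's deflated restarting, the plain restart cycle it improves on is available directly in SciPy. This is a hedged baseline sketch only: SciPy's `gmres` does not implement deflation, and the test system and parameters are assumptions.

```python
# Illustrative restarted GMRES (no deflation) via SciPy; GMRES-DR would
# additionally carry harmonic Ritz vectors across restart cycles.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import gmres

n = 500
# Assumed diagonally dominant test system, easy for restarted GMRES.
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# restart=20 caps the Krylov basis at 20 vectors per cycle; at each
# restart the subspace is discarded, which is exactly the information
# loss that deflated restarting is designed to avoid.
x, info = gmres(A, b, restart=20, maxiter=200)
residual = np.linalg.norm(b - A @ x)
```

On hard problems the discarded small-eigenvalue information slows each new cycle, which is why retaining approximate eigenvectors across restarts pays off.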
Deflation Techniques for an Implicitly Restarted Arnoldi Iteration
TLDR
A deflation procedure is introduced that is designed to improve the convergence of an implicitly restarted Arnoldi iteration for computing a few eigenvalues of a large matrix and implicitly deflates the converged approximations from the iteration.
Implicit Application of Polynomial Filters in a k-Step Arnoldi Method
  • D. Sorensen
  • Computer Science
    SIAM J. Matrix Anal. Appl.
  • 1992
TLDR
The iterative scheme is shown to be a truncation of the standard implicitly shifted QR iteration for dense problems, and it avoids the need to explicitly restart the Arnoldi sequence.
Thick-Restart Lanczos Method for Large Symmetric Eigenvalue Problems
TLDR
A restarted variant of the Lanczos method for symmetric eigenvalue problems, named the thick-restart Lanczos method, is proposed; it is able to retain an arbitrary number of Ritz vectors from the previous iterations with a minimal restarting cost.
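The recurrence that thick restarting compresses is the plain Lanczos process: after m steps the extreme eigenvalues of the tridiagonal T_m (Ritz values) approximate A's extreme eigenvalues, and thick restarting keeps several Ritz vectors when the basis is truncated. Below is a hedged sketch of the unrestarted process only, with an assumed diagonal test matrix of known spectrum.

```python
# Illustrative m-step Lanczos tridiagonalization (with full
# reorthogonalization); not the thick-restart variant itself.
import numpy as np

def lanczos(A, v0, m):
    """Run m Lanczos steps; return the tridiagonal entries (alpha, beta)."""
    n = len(v0)
    V = np.zeros((n, m + 1))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        # Full reorthogonalization keeps the basis numerically orthogonal.
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        beta[j] = np.linalg.norm(w)
        V[:, j + 1] = w / beta[j]
    return alpha, beta

n = 300
A = np.diag(np.linspace(1.0, 100.0, n))   # symmetric, known spectrum
alpha, beta = lanczos(A, np.ones(n), m=80)

# Assemble T_m; beta[-1] is the residual norm, not an entry of T_m.
T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
ritz = np.linalg.eigvalsh(T)
gap = abs(ritz[-1] - 100.0)  # largest Ritz value vs. true largest eigenvalue
```

Thick restarting would now keep, say, the top few Ritz vectors plus the residual and rebuild a short basis around them, rather than discarding V entirely.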
Dynamic Thick Restarting of the Davidson, and the Implicitly Restarted Arnoldi Methods
TLDR
It is proved that thick-restarted, nonpreconditioned Davidson is equivalent to the implicitly restarted Arnoldi method, which motivates the development of a dynamic thick-restarting scheme for the symmetric case that can be used in both Davidson and implicitly restarted Arnoldi.
The principle of minimized iterations in the solution of the matrix eigenvalue problem
An interpretation of Dr. Cornelius Lanczos' iteration method, which he has named "minimized iterations," is discussed in this article, expounding the method as applied to the solution of the matrix eigenvalue problem.
Forward Stability and Transmission of Shifts in the QR Algorithm
TLDR
This paper shows that tiny subdiagonal entries do not normally cause forward instability or interfere in any way with the convergence of the algorithm, and even in situations where forward instability does occur, the QR step is not normally rendered ineffective.
...