Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems under Limited Memory. Part I: Seeking One Eigenvalue

Abstract

Large, sparse, Hermitian eigenvalue problems remain among the most computationally challenging tasks. Despite the need for a robust, nearly optimal preconditioned iterative method that can operate under severe memory limitations, no such method has emerged as a clear winner. In this paper we approach the eigenproblem from a nonlinear perspective, which helps us develop two nearly optimal methods. The first extends the recent Jacobi-Davidson conjugate gradient (JDCG) method to JDQMR, improving robustness and efficiency. The second method, Generalized Davidson+1 (GD+1), utilizes the locally optimal conjugate gradient recurrence as a restarting technique to achieve almost optimal convergence. We describe both methods within a unifying framework and provide theoretical justification for their near optimality. The more efficient of the two can be selected at runtime. Our extensive experiments confirm the robustness, near optimality, and efficiency of our multimethod approach over other state-of-the-art methods.
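The locally optimal recurrence mentioned in the abstract can be illustrated with a generic LOBPCG-style single-vector iteration, where each new iterate is extracted by Rayleigh-Ritz over span{x_k, preconditioned residual, x_{k-1}}. This is a minimal textbook sketch of that recurrence, not the paper's GD+1 implementation; all function and variable names are illustrative:

```python
import numpy as np

def locally_optimal_min_eig(A, x0, M=None, tol=1e-8, maxit=200):
    """Sketch of a locally optimal (LOBPCG-style) iteration for the
    smallest eigenpair of a Hermitian matrix A.  Each step performs
    Rayleigh-Ritz on the (at most) 3-dimensional trial space
    span{x, M^{-1} r, p}, where p carries previous-iterate information."""
    x = x0 / np.linalg.norm(x0)
    p = None                            # direction retained from the previous step
    for _ in range(maxit):
        theta = np.real(x.conj() @ A @ x)   # Rayleigh quotient
        r = A @ x - theta * x               # eigenresidual
        if np.linalg.norm(r) < tol:
            break
        w = r if M is None else np.linalg.solve(M, r)  # preconditioned residual
        cols = [x, w] if p is None else [x, w, p]
        Q, _ = np.linalg.qr(np.column_stack(cols))     # orthonormal trial basis
        H = Q.conj().T @ (A @ Q)                       # projected matrix
        vals, vecs = np.linalg.eigh(H)
        x_new = Q @ vecs[:, 0]              # Ritz vector for smallest Ritz value
        p = x_new - x * (x.conj() @ x_new)  # component of x_new orthogonal to x
        x = x_new / np.linalg.norm(x_new)
    theta = float(np.real(x.conj() @ A @ x))
    return theta, x
```

The three-term trial space is what gives the iteration its "locally optimal" character: the Ritz extraction cannot do worse than a plain preconditioned power step, and retaining p recovers CG-like acceleration without storing a large basis.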

DOI: 10.1137/050631574


Statistics

60 citations (Semantic Scholar estimate based on the available data).

Cite this paper

@article{Stathopoulos2007NearlyOP,
  title   = {Nearly Optimal Preconditioned Methods for Hermitian Eigenproblems under Limited Memory. Part I: Seeking One Eigenvalue},
  author  = {Andreas T Stathopoulos},
  journal = {SIAM J. Scientific Computing},
  year    = {2007},
  volume  = {29},
  pages   = {481--514}
}