Iterative methods for sparse linear systems

@book{Saad2003IterativeMF,
  title={Iterative methods for sparse linear systems},
  author={Yousef Saad},
  publisher={Society for Industrial and Applied Mathematics (SIAM)},
  year={2003}
}
  • Y. Saad
  • Published 1 May 2003
  • Computer Science
Preface
1. Background in linear algebra
2. Discretization of partial differential equations
3. Sparse matrices
4. Basic iterative methods
5. Projection methods
6. Krylov subspace methods, Part I
7. Krylov subspace methods, Part II
8. Methods related to the normal equations
9. Preconditioned iterations
10. Preconditioning techniques
11. Parallel implementations
12. Parallel preconditioners
13. Multigrid methods
14. Domain decomposition methods
Bibliography
Index

Variable Block Multilevel Iterative Solution of General Sparse Linear Systems

We present numerical results with variable block multilevel incomplete LU factorization preconditioners for solving sparse linear systems arising, e.g., from the discretization of 2D and 3D partial differential equations.
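
The paper's variable block multilevel ILU is not reproduced here, but the basic pattern it builds on, an incomplete LU factorization used as a preconditioner for a Krylov solver on a discretized PDE, can be sketched with SciPy. The 2D Poisson test matrix and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative test problem: 5-point finite-difference Laplacian on an m x m grid.
m = 50
T = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(m, m))
I = sp.identity(m)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()
b = np.ones(A.shape[0])

# Threshold-based incomplete LU factorization, wrapped as a preconditioner M ~ A^{-1}.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, ilu.solve)

x, info = spla.gmres(A, b, M=M)
print("GMRES info:", info, " residual norm:", np.linalg.norm(b - A @ x))
```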

Least Squares Methods in Krylov Subspaces

The paper considers iterative algorithms for solving large systems of linear algebraic equations with sparse nonsymmetric matrices based on solving least squares problems in Krylov subspaces and
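
The authors' own algorithms are not reproduced here; as a point of reference, the core idea of minimizing the residual over a Krylov subspace by solving a small least squares problem is what GMRES does. A minimal dense sketch, assuming x0 = 0 and omitting restarting and preconditioning:

```python
import numpy as np

def krylov_least_squares(A, b, m=30):
    """Build an m-step Arnoldi basis of K_m(A, b) and solve the small least squares
    problem min_y ||beta*e1 - H y||_2; this is the GMRES idea with x0 = 0."""
    n = b.shape[0]
    beta = np.linalg.norm(b)
    Q = np.zeros((n, m + 1))            # orthonormal Krylov basis
    H = np.zeros((m + 1, m))            # upper Hessenberg projection of A
    Q[:, 0] = b / beta
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt orthogonalization
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:         # "happy breakdown": solution lies in K_{j+1}
            m = j + 1
            break
        Q[:, j + 1] = w / H[j + 1, j]
    rhs = np.zeros(m + 1)
    rhs[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], rhs, rcond=None)
    return Q[:, :m] @ y                 # approximate solution of Ax = b
```

Practical codes restart after m steps and apply a preconditioner; both are left out of this sketch.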

Projection methods for nonlinear sparse eigenvalue problems

This paper surveys numerical methods for general sparse nonlinear eigenvalue problems with special emphasis on iterative projection methods like Jacobi–Davidson, Arnoldi or rational Krylov methods
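
The nonlinear solvers surveyed are beyond a short snippet; they build on the linear Arnoldi projection, which SciPy exposes through ARPACK. A hedged usage example on an assumed linear test matrix (not from the paper):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative sparse nonsymmetric matrix; eigs projects onto a Krylov subspace
# (implicitly restarted Arnoldi) and returns a few eigenpairs.
n = 1000
A = sp.diags([np.arange(1, n + 1, dtype=float)], [0]) + 0.01 * sp.random(n, n, density=1e-3)
vals, vecs = spla.eigs(A, k=4, which="LM")
print(vals)
```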

Algebraic Multilevel Methods and Sparse Approximate Inverses

A new approach to algebraic multilevel methods and their use as preconditioners in iterative methods for the solution of symmetric positive definite linear systems is introduced.

Preconditioning techniques for large linear systems: a survey

This article surveys preconditioning techniques for the iterative solution of large linear systems, with a focus on algebraic methods suitable for general sparse matrices. Covered topics include

A new approach to algebraic multilevel methods based on sparse approximate inverses

A new approach to algebraic multilevel methods and their use as preconditioners in iterative methods for the solution of symmetric positive definite linear systems is introduced.
...

References

Showing 1-10 of 202 references

A robust parallel solver for block tridiagonal systems

The method, block symmetric successive over-relaxation with conjugate gradient acceleration (BSSOR), is remarkably robust and, when applied to block tridiagonal systems, allows parallelism in the computations.
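
The block SSOR/CG combination itself is not shown here, but the point (non-block) analogue, an SSOR splitting used as a preconditioner inside conjugate gradients, can be sketched as follows. The symmetric test matrix and relaxation parameter are assumptions for illustration:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def ssor_preconditioner(A, omega=1.0):
    """Point SSOR preconditioner M = (D + wL) D^{-1} (D + wU), applied to a vector by
    one forward and one backward sparse triangular solve (the constant scaling is
    omitted, since it does not change the PCG iterates)."""
    d = A.diagonal()
    D = sp.diags(d)
    lower = (D + omega * sp.tril(A, k=-1)).tocsr()
    upper = (D + omega * sp.triu(A, k=1)).tocsr()
    def apply(r):
        y = spla.spsolve_triangular(lower, r, lower=True)
        return spla.spsolve_triangular(upper, d * y, lower=False)
    return spla.LinearOperator(A.shape, apply)

# Illustrative SPD test system: 1D Laplacian.
n = 500
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x, info = spla.cg(A, b, M=ssor_preconditioner(A))
```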

Parallelizing preconditioned conjugate gradient algorithms

Parallel Implementations of Preconditioned Conjugate Gradient Methods.

This document considers a few different implementations of classical iterative methods on parallel processors with the purpose of studying how multiprocessor architecture affects performance, and focuses on solution methods based on GMRES, a conjugate gradient-like method, combined with well-known preconditionings.

Introduction to Parallel and Vector Solution of Linear Systems

  • J. Ortega
  • Mathematics, Computer Science
    Frontiers of Computer Science
  • 1988
The conjugate gradient algorithm and iterative methods for linear equations are described.

A stability analysis of incomplete LU factorizations

  • H. Elman
  • Computer Science, Mathematics
  • 1986
Abstract: The combination of iterative methods with preconditionings based on incomplete LU factorizations constitutes an effective class of methods for solving the sparse linear systems arising
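
As a point of reference for what an incomplete LU factorization is, here is a small dense-storage sketch of ILU(0), the zero-fill-in variant, assuming a nonzero diagonal. This illustrates the factorization analyzed in the paper rather than reproducing it; production codes work on compressed sparse storage:

```python
import numpy as np

def ilu0(A):
    """ILU(0): Gaussian elimination in which any fill-in outside the sparsity
    pattern of A is simply dropped.  Returns L (unit lower triangular) and U
    packed into one matrix, as is customary."""
    n = A.shape[0]
    pattern = A != 0
    F = np.array(A, dtype=float)
    for i in range(1, n):
        for k in range(i):
            if not pattern[i, k]:
                continue
            F[i, k] /= F[k, k]                      # multiplier l_ik
            for j in range(k + 1, n):
                if pattern[i, j]:                   # update only entries already in the pattern
                    F[i, j] -= F[i, k] * F[k, j]
    return F
```

The resulting factors are applied by forward and backward substitution inside the Krylov iteration.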

Block Preconditioning for the Conjugate Gradient Method

Numerical experiments on two-dimensional test problems indicate that a particularly attractive preconditioning, which uses special properties of tridiagonal matrix inverses, can be computationally more efficient for the same computer storage than other preconditionings, including the popular point incomplete Cholesky factorization.

Matrix Iterative Analysis

Matrix Properties and Concepts.- Nonnegative Matrices.- Basic Iterative Methods and Comparison Theorems.- Successive Overrelaxation Iterative Methods.- Semi-Iterative Methods.- Derivation and
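
One of the classical iterations analyzed in this book, successive over-relaxation, fits in a few lines; a minimal dense sketch (setting omega = 1 gives Gauss-Seidel):

```python
import numpy as np

def sor(A, b, omega=1.5, tol=1e-10, maxiter=10_000):
    """Successive over-relaxation for Ax = b (dense A with nonzero diagonal)."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(maxiter):
        x_old = x.copy()
        for i in range(n):
            # Use already-updated entries x[:i] and old entries x_old[i+1:].
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            break
    return x
```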

Methods of conjugate gradients for solving linear systems

An iterative algorithm is given for solving a system Ax=k of n linear equations in n unknowns and it is shown that this method is a special case of a very general method which also includes Gaussian elimination.
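
Hestenes and Stiefel's method, in the notation Ax = k used in their abstract, fits in a dozen lines; a minimal sketch for symmetric positive definite A:

```python
import numpy as np

def conjugate_gradient(A, k, tol=1e-10, maxiter=None):
    """Conjugate gradient method for A x = k with A symmetric positive definite.
    Only matrix-vector products with A are required."""
    n = len(k)
    maxiter = maxiter or n
    x = np.zeros(n)
    r = k - A @ x                     # residual
    p = r.copy()                      # search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)         # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p     # A-conjugate update of the search direction
        rs = rs_new
    return x
```

In exact arithmetic the iteration terminates in at most n steps, which is the sense in which it relates to direct elimination.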

Practical Use of Polynomial Preconditionings for the Conjugate Gradient Method

A version of the conjugate gradient algorithm is formulated that is more suitable for parallel architectures and the advantages of polynomial preconditioning in the context of these architectures are discussed.
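
A concrete instance of polynomial preconditioning is the truncated Neumann series built from a diagonal scaling: it needs only matrix-vector products, which is what makes the idea attractive on parallel and vector machines. A hedged sketch, not the paper's particular polynomial, with an assumed 1D Laplacian test problem:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def neumann_preconditioner(A, degree=3):
    """Polynomial preconditioner M^{-1} v = sum_{i=0}^{degree} (I - D^{-1}A)^i D^{-1} v,
    a truncated Neumann series based on the diagonal D of A."""
    d_inv = 1.0 / A.diagonal()
    def apply(v):
        z = d_inv * v                 # z_0 = D^{-1} v
        acc = z.copy()
        for _ in range(degree):
            z = z - d_inv * (A @ z)   # z_{i+1} = (I - D^{-1}A) z_i
            acc += z
        return acc
    return spla.LinearOperator(A.shape, apply)

# Illustrative use inside preconditioned conjugate gradients on a 1D Laplacian.
n = 400
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x, info = spla.cg(A, b, M=neumann_preconditioner(A))
```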

Parallel preconditioning and approximation inverses on the Connection Machine

  • M. Grote, H. Simon
  • Computer Science
    Proceedings Scalable High Performance Computing Conference SHPCC-92.
  • 1992
The authors present a new approach to preconditioning for very large, sparse, nonsymmetric linear systems. It explicitly computes an approximate inverse to the original matrix that can be applied
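
The kind of explicitly computed approximate inverse described here can be sketched, in its simplest form, by fixing the sparsity pattern of M to that of A and solving one small least squares problem per column, min ||A m_j - e_j||; the columns are independent, which is the source of the parallelism. A simplified sketch under those assumptions, not the authors' Connection Machine code:

```python
import numpy as np
import scipy.sparse as sp

def sparse_approximate_inverse(A):
    """Right approximate inverse M with the sparsity pattern of A: column j of M
    minimizes ||A m_j - e_j||_2 over that pattern.  Columns are independent, so
    they could be computed in parallel."""
    A = sp.csc_matrix(A)
    n = A.shape[0]
    M = sp.lil_matrix((n, n))
    for j in range(n):
        J = A[:, j].nonzero()[0]              # allowed nonzero positions in m_j
        I = np.unique(A[:, J].nonzero()[0])   # rows of A touched by those columns
        Asub = A[I, :][:, J].toarray()
        e = (I == j).astype(float)            # restriction of e_j to the rows I
        mj, *_ = np.linalg.lstsq(Asub, e, rcond=None)
        for idx, row in enumerate(J):
            M[row, j] = mj[idx]
    return M.tocsr()
```
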
...