Corpus ID: 237940709

A general alternating-direction implicit framework with Gaussian process regression parameter prediction for large sparse linear systems

  • Kai Jiang, Xuehong Su, Juan Zhang
  • Published 25 September 2021
  • Computer Science, Mathematics
  • ArXiv
This paper proposes an efficient general alternating-direction implicit (GADI) framework for solving large sparse linear systems. The convergence properties of the GADI framework are discussed. Most existing ADI methods can be viewed as particular schemes of the developed framework; meanwhile, the GADI framework can derive new ADI methods. Moreover, as algorithm efficiency is sensitive to the splitting parameters, we offer a data-driven approach, the Gaussian process regression (GPR …
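The abstract couples an ADI-type splitting iteration with splitting parameters predicted by Gaussian process regression. As an illustration only (the paper's actual GADI schemes and GPR feature choices are not reproduced here), the sketch below pairs the classical HSS iteration, one scheme such a framework subsumes, with a minimal RBF-kernel GPR mean predictor that could map a problem feature to a shift parameter `alpha`. The function names, the dense solves, and the single scalar feature are all hypothetical simplifications.

```python
import numpy as np

def hss_iteration(A, b, alpha, x0=None, tol=1e-10, max_iter=500):
    """HSS splitting A = H + S (symmetric + skew-symmetric part, real case),
    alternating shifted solves with parameter alpha. Minimal dense sketch."""
    n = A.shape[0]
    H = 0.5 * (A + A.T)                      # symmetric part
    S = 0.5 * (A - A.T)                      # skew-symmetric part
    I = np.eye(n)
    x = np.zeros(n) if x0 is None else x0.copy()
    for k in range(max_iter):
        # first half-step:  (alpha I + H) x_half = (alpha I - S) x + b
        x_half = np.linalg.solve(alpha * I + H, (alpha * I - S) @ x + b)
        # second half-step: (alpha I + S) x_new  = (alpha I - H) x_half + b
        x = np.linalg.solve(alpha * I + S, (alpha * I - H) @ x_half + b)
        if np.linalg.norm(A @ x - b) < tol * np.linalg.norm(b):
            return x, k + 1
    return x, max_iter

def gpr_predict(X_train, y_train, x_star, length=1.0, noise=1e-6):
    """Minimal RBF-kernel GPR posterior-mean prediction of a splitting
    parameter from a scalar feature (hypothetical training data)."""
    def kern(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = kern(X_train, X_train) + noise * np.eye(len(X_train))
    return kern(np.array([x_star]), X_train) @ np.linalg.solve(K, y_train)
```

In a data-driven workflow one would tune `alpha` offline for sample problems, train the GPR on (feature, optimal-alpha) pairs, and call `gpr_predict` before running `hss_iteration` on a new system.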

A Class of Nested Iteration Schemes for Linear Systems with a Coefficient Matrix with a Dominant Positive Definite Symmetric Part
These new schemes are actually inner/outer iterations, which employ the classical conjugate gradient method as inner iteration to approximate each outer iterate, while each outer iteration is induced by a convergent and symmetric positive definite splitting of the coefficient matrix.
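The inner/outer structure described in this entry can be sketched as follows: an outer splitting A = P − Q with P symmetric positive definite, where each outer step P x_{k+1} = Q x_k + b is solved approximately by plain conjugate gradients. This is a minimal illustration, not the paper's scheme; the splitting choice (P = symmetric part of A) and all iteration counts are assumptions.

```python
import numpy as np

def conjugate_gradient(M, rhs, x0, tol=1e-12, max_iter=200):
    """Plain CG for a symmetric positive definite matrix M (inner solver)."""
    x = x0.copy()
    r = rhs - M @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Mp = M @ p
        a = rs / (p @ Mp)
        x += a * p
        r -= a * Mp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def nested_iteration(A, b, outer_iters=60):
    """Outer iteration from the splitting A = P - Q with P the symmetric
    part (assumed positive definite and dominant); each outer step
    P x_{k+1} = Q x_k + b is approximated by inner CG."""
    P = 0.5 * (A + A.T)            # dominant symmetric positive definite part
    Q = P - A                      # so that A = P - Q
    x = np.zeros(len(b))
    for _ in range(outer_iters):
        x = conjugate_gradient(P, Q @ x + b, x)
    return x
```

The outer iteration converges when the spectral radius of P⁻¹Q is below one, which the dominance of the symmetric part is meant to guarantee.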
A Cyclic Low-Rank Smith Method for Large Sparse Lyapunov Equations
  • Thilo Penzl
  • Mathematics, Computer Science
  • SIAM J. Sci. Comput.
  • 1999
The cyclic low-rank Smith method is presented, which is an iterative method for the computation of low-rank approximations to the solution of large, sparse, stable Lyapunov equations, and a heuristic for determining a set of suboptimal alternating direction implicit (ADI) shift parameters is proposed.
Gaussian Processes for Machine Learning (GPML) Toolbox
The GPML toolbox provides a wide range of functionality for Gaussian process (GP) inference and prediction, including exact and variational inference, Expectation Propagation, Laplace's method for dealing with non-Gaussian likelihoods, and FITC for dealing with large regression tasks.
Iterative Solution of Cyclically Reduced Systems Arising from Discretization of the Three-Dimensional Convection-Diffusion Equation
It is shown that performing one step of cyclic reduction, followed by reordering of the unknowns, yields a system of equations for which the block Jacobi method generally converges faster than for the original system, using lexicographic ordering.
A preconditioned nested splitting conjugate gradient iterative method for the large sparse generalized Sylvester equation
A nested splitting conjugate gradient (NSCG) method and a preconditioned NSCG (PNSCG) iterative method are presented for solving the generalized Sylvester equation with large sparse coefficient matrices; numerical results show that the PNSCG method is more accurate, robust, and effective than the NSCG method.
On successive-overrelaxation acceleration of the Hermitian and skew-Hermitian splitting iterations
Theoretical analyses show that the NSS method converges unconditionally to the exact solution of the system of linear equations, and an upper bound of the contraction factor is derived which is dependent solely on the spectrum of the normal splitting matrix, and is independent of the eigenvectors of the matrices involved.
Preconditioned HSS methods for the solution of non-Hermitian positive definite linear systems and applications to the discrete convection-diffusion equation
We study the role of preconditioning strategies recently developed for coercive problems in connection with a two-step iterative method based on the Hermitian and skew-Hermitian splitting (HSS) …
Douglas-Rachford Splitting and ADMM for Nonconvex Optimization: Tight Convergence Results
It is shown how the Douglas-Rachford envelope (DRE), introduced in 2014, can be employed to unify and considerably simplify the theory for devising global convergence guarantees for ADMM, DRS and PRS applied to nonconvex problems under less restrictive conditions, larger prox-stepsizes and over-relaxation parameters than previously known.
Preconditioned modified Hermitian and skew-Hermitian splitting iteration methods for fractional nonlinear Schrödinger equations
A variant of the preconditioned modified Hermitian and skew-Hermitian splitting (PMHSS) iteration method is proposed for a class of Toeplitz-like complex linear equations arising from the space …
On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
This paper shows, by means of an operator called a splitting operator, that the Douglas-Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm, which allows the unification and generalization of a variety of convex programming algorithms.
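The Douglas-Rachford iteration for a zero of the sum of two (sub)differential operators can be illustrated on a tiny scalar problem. The sketch below minimizes f(x) = (x − 3)²/2 plus g(x) = |x|, whose minimizer is x* = 2; the specific objective, step size, and iteration count are assumptions chosen only to make the fixed-point structure visible.

```python
def prox_quadratic(v, t, a=3.0):
    """prox of f(x) = (x - a)^2 / 2 with step t."""
    return (v + t * a) / (1.0 + t)

def prox_abs(v, t):
    """prox of g(x) = |x| with step t (soft-thresholding)."""
    s = max(abs(v) - t, 0.0)
    return s if v >= 0 else -s

def douglas_rachford(t=1.0, iters=100):
    """Douglas-Rachford splitting: fixed-point iteration on y whose
    shadow point x = prox_tf(y) converges to a minimizer of f + g."""
    y = 0.0
    for _ in range(iters):
        x = prox_quadratic(y, t)          # resolvent of f
        z = prox_abs(2.0 * x - y, t)      # resolvent of g at reflected point
        y = y + z - x                     # governing-sequence update
    return prox_quadratic(y, t)
```

Viewing the update y ↦ y + z − x as one application of a firmly nonexpansive map is exactly how the splitting is identified with a proximal point iteration on the "splitting operator" mentioned above.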