Additive Schwarz Methods for Convex Optimization as Gradient Methods

@article{Park2020AdditiveSM,
  title={Additive Schwarz Methods for Convex Optimization as Gradient Methods},
  author={Jongho Park},
  journal={ArXiv},
  year={2020},
  volume={abs/1912.03617}
}
  • Jongho Park
  • Published 8 December 2019
  • Computer Science
  • ArXiv
This paper gives a unified convergence analysis of additive Schwarz methods for general convex optimization problems. Resembling the fact that additive Schwarz methods for linear problems are preconditioned Richardson methods, additive Schwarz methods for general convex optimization are shown to be gradient methods.
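For intuition, here is a minimal numerical sketch of this observation in the linear case: one-level additive Schwarz for $Au = b$ is a preconditioned Richardson iteration, i.e., a gradient method for $F(u) = \frac{1}{2}u^{\top}Au - b^{\top}u$ in the preconditioned metric. The 1D Laplacian, the two overlapping blocks, and the damping $\tau$ below are illustrative choices, not taken from the paper.

```python
import numpy as np

n = 20
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D Laplacian stiffness matrix
b = np.ones(n)

# Overlapping subdomains: R_i restricts to these index sets (illustrative choice).
blocks = [np.arange(0, 12), np.arange(8, 20)]
tau = 0.5  # damping; tau = 1/(number of colors) is a classical safe choice

u = np.zeros(n)
for k in range(100):
    r = b - A @ u                      # residual, equal to -grad F(u)
    correction = np.zeros(n)
    for idx in blocks:
        A_i = A[np.ix_(idx, idx)]      # local stiffness A_i = R_i A R_i^T
        correction[idx] += np.linalg.solve(A_i, r[idx])  # R_i^T A_i^{-1} R_i r
    u += tau * correction              # one preconditioned gradient (Richardson) step

print(np.linalg.norm(b - A @ u))      # residual norm decays linearly in k
```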
Preconditioning for finite element methods with strain smoothing
TLDR
This work analyzes the spectra of the stiffness matrices of the edge-based S-FEM and the SSE method and proposes an improved two-level additive Schwarz preconditioner for the strain smoothing methods by modifying the local solvers appropriately.
Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart
TLDR
The proposed acceleration scheme for additive Schwarz methods requires no a priori information on the levels of smoothness and sharpness of the target energy functional, so it can be applied to a broad range of convex optimization problems.
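For context, the sketch below shows the generic shape of an accelerated gradient method with function-value adaptive restart (in the spirit of O'Donoghue and Candès): momentum is reset whenever the objective increases, and no smoothness or sharpness parameters enter the restart test. The paper's restart rule for additive Schwarz methods differs in its details; everything below is an illustrative assumption.

```python
import numpy as np

def accelerated_descent_with_restart(grad_F, F, x0, step, n_iter=200):
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_new = y - step * grad_F(y)                  # gradient step at extrapolated point
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # Nesterov extrapolation
        if F(x_new) > F(x):                           # restart: momentum hurt progress
            y, t_new = x_new.copy(), 1.0              # reset momentum
        x, t = x_new, t_new
    return x

# Example: ill-conditioned quadratic F(x) = x'Qx/2 with L = 100.
Q = np.diag([1.0, 100.0])
x = accelerated_descent_with_restart(lambda z: Q @ z, lambda z: z @ Q @ z / 2,
                                     np.array([1.0, 1.0]), step=1 / 100.0)
print(x)  # close to the minimizer [0, 0]
```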
Additive Schwarz Methods for Convex Optimization with Backtracking
Fast gradient methods for uniformly convex and weakly smooth problems
TLDR
Unlike existing works, the fast gradient methods proposed in this paper do not use a restarting technique; instead, they use momentum terms that are suitably designed to reflect both the uniform convexity and weak smoothness of the target energy function.
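For reference, the standard notions involved (general definitions, not specific to this paper): a functional $F$ is uniformly convex with exponent $q \geq 2$ and modulus $\mu > 0$ if $F(y) \geq F(x) + \langle \nabla F(x), y - x \rangle + \frac{\mu}{q}\|y - x\|^{q}$ for all $x, y$, and weakly smooth with exponent $p \in (1, 2]$ if its gradient is $(p-1)$-Hölder continuous, which implies $F(y) \leq F(x) + \langle \nabla F(x), y - x \rangle + \frac{M}{p}\|y - x\|^{p}$. Taking $p = q = 2$ recovers the usual smooth and strongly convex setting.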
A dual‐primal finite element tearing and interconnecting method for nonlinear variational inequalities utilizing linear local problems
We propose a novel dual‐primal finite element tearing and interconnecting method for nonlinear variational inequalities. The proposed method is based on a particular Fenchel–Rockafellar dual formulation of the problem, in which the local problems to be solved are linear.

References

Showing 1–10 of 37 references
Convergence Rate of a Schwarz Multilevel Method for the Constrained Minimization of Nonquadratic Functionals
  • L. Badea
  • Mathematics, Computer Science
    SIAM J. Numer. Anal.
  • 2006
TLDR
The convergence of a subspace correction method applied to the constrained minimization of a functional in a general reflexive Banach space is proved, provided that the convex set satisfies a certain assumption.
Global and uniform convergence of subspace correction methods for some convex optimization problems
This paper gives some global and uniform convergence estimates for a class of subspace correction (based on space decomposition) iterative methods applied to some unconstrained convex optimization problems.
Convergence Rate Analysis of a Multiplicative Schwarz Method for Variational Inequalities
TLDR
Linear convergence is derived for the overlapping Schwarz domain decomposition method applied to constrained minimization problems, and numerical results are presented to confirm the derived convergence estimate.
Domain decomposition methods: algorithms and theory
The purpose of this text is to offer a comprehensive and self-contained presentation of some of the most successful and popular domain decomposition preconditioners for finite and spectral element approximations of partial differential equations.
One- and two-level Schwarz methods for variational inequalities of the second kind and their application to frictional contact
TLDR
By using subspace correction methods for the solution of variational inequalities of the second kind, this work overcomes the mesh dependence of some fixed-point schemes that are commonly employed for contact problems with Coulomb friction.
Rate of Convergence for some constraint decomposition methods for nonlinear variational inequalities
  • X. Tai
  • Computer Science, Mathematics
    Numerische Mathematik
  • 2003
TLDR
Some general subspace correction algorithms are proposed for obstacle problems via multilevel domain decomposition and multigrid methods, and a special nonlinear interpolation operator is introduced for decomposing the functions.
Fast Nonoverlapping Block Jacobi Method for the Dual Rudin-Osher-Fatemi Model
TLDR
It is shown that the nonoverlapping relaxed block Jacobi method for a dual formulation of the ROF model attains an $O(1/n)$ convergence rate of the energy functional, where $n$ is the number of iterations, and that the proposed method converges faster than existing domain decomposition methods both theoretically and practically.
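Schematically (a generic relaxed block Jacobi iteration, not necessarily the paper's exact formulation), the unknowns are partitioned into nonoverlapping blocks $p = (p_1, \dots, p_N)$; each iteration solves $\hat{p}_i = \operatorname{arg\,min}_{q_i} F(p_1^{(n)}, \dots, q_i, \dots, p_N^{(n)})$ for every block in parallel and then relaxes, $p^{(n+1)} = (1 - \omega)\, p^{(n)} + \omega\, \hat{p}$, with a relaxation parameter $\omega \in (0, 1]$.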
Convergence Rate of Overlapping Domain Decomposition Methods for the Rudin-Osher-Fatemi Model Based on a Dual Formulation
TLDR
The main objective of this paper is to rigorously analyze the convergence of the SSC and PSC algorithms and to derive the convergence rate $O(n^{-1/2})$, where $n$ is the number of iterations; the explicit dependence of the convergence rate on the subdomain overlap width and other important parameters is characterized.
A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
TLDR
This paper considers regularized block multiconvex optimization, where the feasible set and objective function are generally nonconvex but convex in each block of variables, and proposes a generalized block coordinate descent method.
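A minimal sketch of the idea (an illustrative toy, not the paper's algorithm): for a rank-one nonnegative factorization, the objective $\|M - xy^{\top}\|_F^2$ is nonconvex jointly in $(x, y)$ but convex in each block, so cyclic block coordinate descent solves a closed-form nonnegative least-squares problem per block. The cited paper additionally covers proximal and prox-linear block updates with extrapolation.

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.outer(rng.random(6), rng.random(4))  # data matrix with an exact rank-one factorization
x, y = np.ones(6), np.ones(4)

for _ in range(100):
    # Block 1: minimize over x with y fixed; the problem separates per entry,
    # so projecting the unconstrained minimizer onto x >= 0 is exact.
    x = np.maximum(M @ y / (y @ y), 0.0)
    # Block 2: the same closed-form update for y with x fixed.
    y = np.maximum(M.T @ x / (x @ x), 0.0)

print(np.linalg.norm(M - np.outer(x, y)))  # close to zero at convergence
```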
...