A sequential homotopy method for mathematical programming problems

@article{Potschka2021ASH,
  title={A sequential homotopy method for mathematical programming problems},
  author={Andreas Potschka and Hans Georg Bock},
  journal={Mathematical Programming},
  year={2021},
  volume={187},
  pages={459-486}
}
We propose a sequential homotopy method for the solution of mathematical programming problems formulated in abstract Hilbert spaces under the Guignard constraint qualification. The method is equivalent to performing projected backward Euler timestepping on a projected gradient/antigradient flow of the augmented Lagrangian. The projected backward Euler equations can be interpreted as the necessary optimality conditions of a primal-dual proximal regularization of the original problem. The… 
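As a rough sketch of the construction described in the abstract (using illustrative notation not taken from the paper: equality constraints c(x) = 0, a simple constraint set C with projection P_C, penalty parameter ρ, and step size Δt), the augmented Lagrangian and its gradient/antigradient flow can be written as

\[
  L_\rho(x,\lambda) = f(x) + \langle \lambda, c(x)\rangle + \tfrac{\rho}{2}\,\|c(x)\|^2,
  \qquad
  \dot{x} = -\nabla_x L_\rho(x,\lambda),
  \quad
  \dot{\lambda} = +\nabla_\lambda L_\rho(x,\lambda) = c(x).
\]

One backward Euler step of size \(\Delta t\) from \((x_k,\lambda_k)\), projected onto C in the primal variable,

\[
  x_{k+1} = P_C\bigl(x_k - \Delta t\, \nabla_x L_\rho(x_{k+1},\lambda_{k+1})\bigr),
  \qquad
  \lambda_{k+1} = \lambda_k + \Delta t\, \nabla_\lambda L_\rho(x_{k+1},\lambda_{k+1}),
\]

can then be read as the first-order optimality conditions of the primal-dual proximal regularization

\[
  \min_{x \in C}\ \max_{\lambda}\ L_\rho(x,\lambda)
  + \tfrac{1}{2\Delta t}\,\|x - x_k\|^2
  - \tfrac{1}{2\Delta t}\,\|\lambda - \lambda_k\|^2,
\]

whose proximal terms vanish as \(\Delta t \to \infty\) and dominate as \(\Delta t \to 0\).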
A Preconditioned Inexact Active-Set Method for Large-Scale Nonlinear Optimal Control Problems
TLDR
A global convergence proof of the recently proposed sequential homotopy method, with an inexact Krylov–semismooth-Newton method employed as a local solver, is provided, together with an efficient, parallelizable, symmetric positive definite preconditioner based on a double Schur complement approach.
A Flow Perspective on Nonlinear Least-Squares Problems
Just as the damped Newton method for the numerical solution of nonlinear algebraic problems can be interpreted as a forward Euler timestepping on the Newton flow equations, the damped Gauß–Newton…
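For context, a standard identity (stated here in generic notation, not as a quote of the cited paper): the Newton flow for a root-finding problem F(x) = 0 and its forward Euler discretization with step size \(\Delta t \in (0,1]\) are

\[
  \dot{x}(t) = -F'(x(t))^{-1} F(x(t)),
  \qquad
  x_{k+1} = x_k - \Delta t\, F'(x_k)^{-1} F(x_k),
\]

i.e., exactly the damped Newton method with damping factor \(\Delta t\); the cited paper pursues the analogous flow interpretation for the damped Gauß–Newton method on nonlinear least-squares problems.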
A Note On Symmetric Positive Definite Preconditioners for Multiple Saddle-Point Systems
TLDR
A preconditioner is described for multiple saddle-point systems of block tridiagonal form which can be applied within the MINRES algorithm and which yields a preconditioned system with only two distinct eigenvalues, 1 and −1, when the preconditioner is applied exactly.
Constrained Structured Optimization and Augmented Lagrangian Proximal Methods
TLDR
It is demonstrated how the inner subproblems can be solved by off-the-shelf methods for composite optimization, without introducing slack variables and despite the appearance of set-valued projections.

References

SHOWING 1-10 OF 72 REFERENCES
Constrained Optimization: Projected Gradient Flows
We consider a dynamical system approach to solve finite-dimensional smooth optimization problems with a compact and connected feasible set. In fact, by the well-known technique of equalizing…
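A generic form of such a flow, given here for orientation (the cited paper's precise construction may differ): for a feasible set C with tangent cone \(T_C(x)\) and the associated projection, the projected gradient flow

\[
  \dot{x}(t) = P_{T_C(x(t))}\bigl(-\nabla f(x(t))\bigr),
  \qquad x(0) = x_0 \in C,
\]

remains feasible (under suitable regularity) and is stationary exactly at points where \(-\nabla f(x)\) lies in the normal cone \(N_C(x)\), i.e., at first-order critical points of \(\min_{x \in C} f(x)\).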
Backward step control for Hilbert space problems
TLDR
The results include global convergence to the distinctive solution obtained by propagating the initial guess along a generalized Newton flow, with guaranteed bounds on the decrease of the discrete nonlinear residual norm and an asymptotic linear residual convergence rate that is easily controllable (also numerically).
Newton-Picard Preconditioners for Time-Periodic Parabolic Optimal Control Problems
TLDR
The theory of semigroups, in conjunction with spectral decompositions of their generators, is used to derive detailed representation formulas for shooting operators in function space and their adjoints, and to show that this preconditioner leads to convergence of a fixed-point iteration in function space.
Semismooth Newton Methods for Variational Inequalities and Constrained Optimization Problems in Function Spaces
  • M. Ulbrich, MOS-SIAM Series on Optimization, 2011
TLDR
The author covers adjoint-based derivative computation and the efficient solution of Newton systems by multigrid and preconditioned iterative methods.
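For orientation, a textbook form of the semismooth Newton iteration (generic notation, not specific to the monograph's setting): for a nonsmooth operator equation F(x) = 0 with F semismooth, one solves

\[
  V_k\, d_k = -F(x_k), \qquad V_k \in \partial F(x_k), \qquad x_{k+1} = x_k + d_k,
\]

where \(\partial F\) denotes a generalized (Newton) derivative; locally superlinear convergence holds near a solution at which the elements of \(\partial F\) are uniformly invertible. Complementarity conditions such as \(a \ge 0,\ b \ge 0,\ ab = 0\) become amenable to this iteration after reformulation as the semismooth equation \(\min(a,b) = 0\).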
Lagrange multiplier approach to variational problems and applications
TLDR
This comprehensive monograph analyzes Lagrange multiplier theory and shows its impact on the development of numerical algorithms for problems posed in a function space setting, and develops and analyzes efficient algorithms for constrained optimization and convex optimization problems based on the augmented Lagrangian concept.
Projected gradient methods for linearly constrained problems
TLDR
It is shown that it is possible to develop a finitely terminating quadratic programming algorithm without non-degeneracy assumptions and to apply these results to algorithms for linearly constrained problems.
Global Inexact Newton Multilevel FEM for Nonlinear Elliptic Problems
TLDR
An affine conjugate global convergence theory is given, which covers both the exact Newton method and inexact Newton–Galerkin methods, addressing the crucial issue of accuracy matching between discretization and iteration errors.
Backward Step Control for Global Newton-Type Methods
TLDR
A new damping approach called backward step control is proposed for globalizing the convergence of Newton-type methods for the numerical solution of nonlinear root-finding problems; it can guarantee a transition to full steps in the vicinity of a solution, which implies fast local convergence.
Newton-Picard-Based Preconditioning for Linear-Quadratic Optimization Problems with Time-Periodic Parabolic PDE Constraints
TLDR
This work develops and investigates two preconditioners for a basic linear iterative splitting method for the numerical solution of linear-quadratic optimization problems with time-periodic parabolic PDE constraints, and proves mesh-independent convergence for the classical Newton-Picard preconditioner.
Differential variational inequalities
TLDR
This paper introduces and studies the class of differential variational inequalities (DVIs) in a finite-dimensional Euclidean space, and establishes the convergence of a time-stepping procedure for solving initial-value DVIs.
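For reference, the standard form of a DVI couples an ordinary differential equation with a parametric variational inequality (generic notation):

\[
  \dot{x}(t) = f\bigl(t, x(t), u(t)\bigr),
  \qquad
  u(t) \in \mathrm{SOL}\bigl(K,\ F(t, x(t), \cdot)\bigr) \ \text{for a.e.\ } t,
  \qquad
  x(0) = x_0,
\]

where \(\mathrm{SOL}(K, \Phi)\) denotes the set of \(u \in K\) with \(\langle \Phi(u),\, v - u \rangle \ge 0\) for all \(v \in K\).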