Globally Convergent Primal-Dual Active-Set Methods with Inexact Subproblem Solves

Frank E. Curtis and Zheng Han, SIAM J. Optim.
We propose primal-dual active-set (PDAS) methods for solving large-scale instances of an important class of convex quadratic optimization problems (QPs). The iterates of the algorithms are partitions of the index set of variables, where corresponding to each partition there exist unique primal-dual variables that can be obtained by solving a (reduced) linear system. Algorithms of this type have recently received attention when solving certain QPs and linear complementarity problems since, with… 
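The partition-based iteration described above can be sketched for the special case of a lower-bounded strictly convex QP. This is a minimal illustrative sketch, not code from the paper; the function name and the stopping test are assumptions.

```python
import numpy as np

def pdas_lower_bound_qp(H, c, l, max_iter=100):
    # Illustrative primal-dual active-set (PDAS) sketch for
    #   min 0.5 x^T H x + c^T x   s.t.   x >= l,
    # with H symmetric positive definite.  KKT conditions:
    #   Hx + c - z = 0,  x >= l,  z >= 0,  z_i (x_i - l_i) = 0.
    n = len(c)
    x = np.maximum(np.zeros(n), l)
    z = np.zeros(n)
    for _ in range(max_iter):
        # Predict the partition from the current primal-dual point.
        A = (z - (x - l)) > 0          # guessed active (x_i fixed at l_i)
        I = ~A                         # guessed inactive (z_i = 0)
        x_new = np.empty(n)
        z_new = np.zeros(n)
        x_new[A] = l[A]
        # Reduced linear system for the free variables:
        #   H_II x_I = -c_I - H_IA l_A
        if I.any():
            rhs = -c[I] - H[np.ix_(I, A)] @ l[A]
            x_new[I] = np.linalg.solve(H[np.ix_(I, I)], rhs)
        # Multipliers on the active set from stationarity: z = Hx + c.
        z_new[A] = (H @ x_new + c)[A]
        if np.array_equal(x_new, x) and np.array_equal(z_new, z):
            break                      # partition repeated: KKT point found
        x, z = x_new, z_new
    return x, z
```

Each trial partition determines a unique primal-dual pair via one reduced solve, matching the structure the abstract describes; the paper's contribution concerns inexact versions of that solve.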

A dual gradient-projection method for large-scale strictly convex quadratic problems

The details of a solver for minimizing a strictly convex quadratic objective function subject to general linear constraints are presented, and it is shown how the linear algebra may be arranged to take computational advantage of sparsity in the second-derivative matrix.

Primal-Dual Active-Set Methods for Isotonic Regression and Trend Filtering

It is proved that, like the PAV algorithm, the proposed primal-dual active-set (PDAS) algorithm for isotonic regression (IR) is convergent and has a work complexity of O(n), though numerical experiments suggest that the PDAS algorithm is often faster than PAV.

Primal-Dual Active-Set Methods for Convex Quadratic Optimization with Applications

A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization

This work presents a primal-dual active-set framework for solving large-scale convex quadratic optimization problems (QPs) and explains the relationship between this framework and semi-smooth Newton techniques, finding that this approach is globally convergent for strictly convex QPs.

Interior-Point Solver for Large-Scale Quadratic Programming Problems with Bound Constraints

An interior-point algorithm for large and sparse convex quadratic programming problems with bound constraints is presented, based on the potential reduction method and on iterative techniques for solving the linear system arising at each iteration.

An Infeasible Active Set Method for Quadratic Problems with Simple Bounds

A primal-dual active set method for quadratic problems with bound constraints is presented, based on a guess of the active set; its iterates satisfy the first-order optimality condition and the complementarity condition.

QPSchur: A dual, active-set, Schur-complement method for large-scale and structured convex quadratic programming

An active-set, dual-feasible Schur-complement method for quadratic programming (QP) with positive definite Hessians is presented, and it is shown that the use of fixed-precision iterative refinement helps to dramatically improve the numerical stability of this Schur-complement algorithm.

A numerically stable dual method for solving strictly convex quadratic programs

An efficient and numerically stable dual algorithm for positive definite quadratic programming is described which takes advantage of the fact that the unconstrained minimum of the objective function can be used as a starting point.

The Primal-Dual Active Set Strategy as a Semismooth Newton Method

The notion of slant differentiability is recalled, and it is argued that the $\max$-function is slantly differentiable in $L^p$ spaces when appropriately combined with a two-norm concept; this leads to new local convergence results for the primal-dual active-set strategy.
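The equivalence underlying this result can be illustrated for a bound-constrained problem; the reformulation below is the standard one for such problems, not taken verbatim from the paper. For a constraint $x \ge l$ with multiplier $z$ and any $c > 0$, the complementarity conditions $x \ge l$, $z \ge 0$, $z(x - l) = 0$ are equivalent to the nonsmooth equation

```latex
C(x, z) \;=\; z - \max\{0,\; z + c\,(l - x)\} \;=\; 0, \qquad c > 0,
```

and a semismooth Newton step applied to $C$, using a slanting function of $\max$, reproduces the active-set prediction $\mathcal{A} = \{\, i : z_i + c\,(l_i - x_i) > 0 \,\}$ used by the primal-dual active-set strategy.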

Projected Newton methods for optimization problems with simple constraints

  • D. Bertsekas
  • 1981 20th IEEE Conference on Decision and Control including the Symposium on Adaptive Processes
It is shown that the scaling matrix Dk can be calculated simply on the basis of second derivatives of f, so that the resulting Newton-like algorithm has a typically superlinear rate of convergence.
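The scaled projection step can be sketched as follows. This is a minimal illustration in the spirit of the method, not Bertsekas's exact algorithm: it uses a fixed step size, a crude active-set estimate, and hypothetical function names.

```python
import numpy as np

def projected_newton_box(grad, hess, x0, lo, hi, alpha=1.0, iters=50, eps=1e-4):
    # Illustrative projected Newton-type step for  min f(x)  s.t.  lo <= x <= hi.
    # Variables near a bound with the gradient pushing outward are treated as
    # active; the Newton scaling (the role of Dk) is applied only to the rest,
    # so the projection of the scaled step remains a descent direction.
    x = np.clip(x0, lo, hi)
    for _ in range(iters):
        g = grad(x)
        active = ((x - lo < eps) & (g > 0)) | ((hi - x < eps) & (g < 0))
        d = np.zeros_like(x)
        d[active] = -g[active]         # identity scaling on active variables
        free = ~active
        if free.any():                 # Newton scaling on the free variables
            Hf = hess(x)[np.ix_(free, free)]
            d[free] = -np.linalg.solve(Hf, g[free])
        x = np.clip(x + alpha * d, lo, hi)
    return x
```

A line search on alpha would be needed in general; for a strictly convex quadratic the unit step already gives the superlinear (here exact) behavior the abstract refers to.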

An Infeasible Primal-Dual Algorithm for Total Bounded Variation-Based Inf-Convolution-Type Image Restoration

The globalized primal-dual algorithm introduced in this paper works with generalized derivatives, converges locally at a superlinear rate, and is stable with respect to noise in the data.

Algorithms for bound constrained quadratic programming problems

We present an algorithm which combines standard active-set strategies with the gradient-projection method for the solution of quadratic programming problems subject to bounds.

Superlinear and quadratic convergence of affine-scaling interior-point Newton methods for problems with simple bounds without strict complementarity assumption

A class of affine-scaling interior-point methods for bound-constrained optimization problems is introduced which are locally q-superlinearly or q-quadratically convergent.