A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles

Frank E. Curtis, Tim Mitchell and Michael L. Overton. Optimization Methods and Software, pp. 148-181.
We propose an algorithm for solving nonsmooth, nonconvex, constrained optimization problems as well as a new set of visualization tools for comparing the performance of optimization algorithms. Our algorithm is a sequential quadratic optimization method that employs Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton Hessian approximations and an exact penalty function whose parameter is controlled using a steering strategy. While our method has no convergence guarantees, we have found it to… 
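
To make the role of the exact penalty function concrete, here is a minimal sketch of an l1 penalty of the kind such steering strategies control. This is an illustration only; the function names and the toy problem below are not taken from the paper.

```python
import numpy as np

def exact_penalty(f, cons, x, nu):
    """l1 exact penalty: f(x) plus nu times the total violation of c_i(x) <= 0."""
    viol = np.maximum(cons(x), 0.0).sum()
    return f(x) + nu * viol

# Toy problem: minimize x^2 subject to x >= 1 (written as 1 - x <= 0).
# For nu larger than the optimal multiplier (here 2), the unconstrained
# minimizer of the penalty coincides with the constrained minimizer
# x* = 1 -- the "exact" property that steering strategies try to secure.
f = lambda x: x**2
cons = lambda x: np.array([1.0 - x])
xs = np.linspace(0.0, 2.0, 2001)
phis = np.array([exact_penalty(f, cons, x, nu=4.0) for x in xs])
best_x = xs[np.argmin(phis)]
```

A steering strategy adjusts `nu` during the iteration so that it eventually exceeds this threshold without being set wastefully large at the start.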

A Comparison of Nonsmooth, Nonconvex, Constrained Optimization Solvers for the Design of Time-Delay Compensators

We present a detailed set of performance comparisons of two state-of-the-art solvers for the application of designing time-delay compensators, an important problem in the field of robust control.

An optimization algorithm for nonsmooth nonconvex problems with upper-C^2 objective

The algorithm handles general smooth constraints similarly to sequential quadratic programming (SQP) methods and uses a line search to ensure progress; potential inconsistencies arising from the linearization of the constraints are addressed through a penalty method.

A simplified nonsmooth nonconvex bundle method with applications to security-constrained ACOPF problems

It is shown that the quadratic penalty method for security-constrained alternating current optimal power flow (SCACOPF) contingency problems can make the contingency solution functions upper-C^2.

An SQP method for minimization of locally Lipschitz functions with nonlinear constraints

It is proved that this quadratic model for minimizing problems with nonconvex, nonsmooth objective and constraint functions is globally convergent in the sense that every accumulation point of the generated sequence is a Clarke-stationary point of the penalty function.

Behavior of Limited Memory BFGS When Applied to Nonsmooth Functions and Their Nesterov Smoothings

It is found that when applied to a nonsmooth function directly, L-BFGS, especially its scaled variant, often breaks down with a poor approximation to an optimal solution, in sharp contrast to full BFGS; it is often better to apply L-BFGS to a smooth approximation of a nonsmooth problem than to apply it directly to the nonsmooth problem.
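
To make the smoothing idea concrete (a generic illustration, not the paper's own construction): the Nesterov smoothing of the absolute value is the Huber function, which approximates |x| uniformly to within mu/2, so L-BFGS can be run on a smooth surrogate whose error is controlled by mu.

```python
import numpy as np

def huber(x, mu):
    # Smoothing of |x|: quadratic on [-mu, mu], shifted linear outside.
    return np.where(np.abs(x) <= mu, x**2 / (2 * mu), np.abs(x) - mu / 2)

# Uniform approximation error |huber(x, mu) - |x|| is exactly mu/2 at its worst.
xs = np.linspace(-3.0, 3.0, 1001)
gaps = {mu: np.max(np.abs(np.abs(xs) - huber(xs, mu))) for mu in (1.0, 0.1, 0.01)}
```

Shrinking `mu` tightens the approximation but worsens the conditioning of the surrogate, which is the trade-off the smoothing parameter negotiates.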

Behavior of the Limited-Memory BFGS Method on Nonsmooth Optimization Problems

The limited memory BFGS (Broyden-Fletcher-Goldfarb-Shanno) method, abbreviated L-BFGS, is widely used for large-scale unconstrained optimization, but its behavior on nonsmooth problems has received little attention.

A Nonsmooth Trust-Region Method for Locally Lipschitz Functions with Application to Optimization Problems Constrained by Variational Inequalities

A computable trust-region model is proposed which fulfills the convergence hypotheses of the general algorithm and is able to properly characterize the Bouligand subdifferential of the reduced cost function for variational inequality constrained problems.

Gradient Sampling Methods for Nonsmooth Optimization

The simplicity of gradient sampling as an extension of the steepest descent method for minimizing smooth objectives is emphasized, and various enhancements that have been proposed to improve its practical performance are reviewed.
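
A stripped-down sketch of the gradient sampling iteration follows. This is illustrative, not the Burke-Lewis-Overton algorithm in full: the minimum-norm subproblem over the convex hull of sampled gradients is solved approximately by mirror descent rather than a QP solver, and all names and parameters are placeholders.

```python
import numpy as np

def min_norm_in_hull(G, iters=500, lr=0.05):
    """Approximate the minimum-norm point of conv(rows of G) by
    exponentiated-gradient (mirror) descent on the simplex."""
    lam = np.full(G.shape[0], 1.0 / G.shape[0])
    for _ in range(iters):
        g = 2.0 * G @ (G.T @ lam)              # gradient of ||G^T lam||^2
        lam = lam * np.exp(-lr * (g - g.max()))
        lam /= lam.sum()
    return G.T @ lam

def gradient_sampling(f, grad, x0, eps=0.1, m=20, iters=200, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Sample gradients at x and at m random points in an eps-ball around x.
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_in_hull(G)
        if np.linalg.norm(g) < tol:            # approximately eps-stationary:
            eps *= 0.5                         # shrink the sampling radius
            if eps < tol:
                break
            continue
        d, t = -g, 1.0
        while f(x + t * d) > f(x) - 1e-4 * t * (g @ g):  # backtracking Armijo
            t *= 0.5
            if t < 1e-12:
                break
        x = x + t * d
    return x

# Demo on a hypothetical nonsmooth toy problem: minimize |x1| + 2|x2|,
# whose minimizer is the origin, where the gradient does not exist.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
x_star = gradient_sampling(f, grad, [2.0, 1.0])
```

Sampling gradients in a ball rather than using only the gradient at the iterate is what lets the method detect the kink at the origin, where steepest descent would oscillate.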

Robust Design of a Smart Structure under Manufacturing Uncertainty via Nonsmooth PDE-Constrained Optimization

This work considers the problem of finding the optimal shape of a force-sensing element integrated into a tubular structure and solves it with the BFGS-SQP method for nonsmooth problems recently proposed by Curtis, Mitchell and Overton.

Multi-fidelity robust controller design with gradient sampling

Numerical experiments with controlling the cooling of a steel rail profile and the laminar flow in a cylinder wake demonstrate that the new multi-fidelity gradient sampling methods achieve up to two orders of magnitude speedup over the single-fidelity gradient sampling method that relies on the high-fidelity model alone.

A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization

A line search algorithm is presented for situations in which the objective and constraint functions are locally Lipschitz and continuously differentiable on open dense subsets of $\mathbb{R}^{n}$.

SNOPT: An SQP Algorithm for Large-Scale Constrained Optimization

An SQP algorithm is discussed that uses a smooth augmented Lagrangian merit function and makes explicit provision for infeasibility in the original problem and the QP subproblems, along with a reduced-Hessian semidefinite QP solver (SQOPT).

Robust and Efficient Methods for Approximation and Optimization of Stability Measures

Two new algorithms with practical application to the problem of designing controllers for linear dynamical systems with input and output are considered: a new spectral value set based algorithm called hybrid expansion-contraction intended for approximating the H∞ norm and a new BFGS SQP based optimization method for nonsmooth, nonconvex constrained optimization motivated by multi-objective controller design.

A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization

A practical, robust algorithm based on gradient sampling is presented to locally minimize a continuous function f on $\mathbb{R}^n$ that is continuously differentiable on an open dense subset.

Globally convergent limited memory bundle method for large-scale nonsmooth optimization

A new variant of this method is introduced and its global convergence is proved for locally Lipschitz continuous objective functions, which are not necessarily differentiable or convex.

Convergence of the Gradient Sampling Algorithm for Nonsmooth Nonconvex Optimization

A slightly revised version of the gradient sampling algorithm of Burke, Lewis, and Overton for minimizing a locally Lipschitz function on $\mathbb{R}^n$ that is continuously differentiable on an open dense subset is introduced.

HIFOO - A MATLAB package for fixed-order controller design and H∞ optimization

Numerical experiments on benchmark problem instances indicate that HIFOO could be an efficient and reliable computer-aided control system design (CACSD) tool, with a potential for realistic industrial applications.

A line search exact penalty method using steering rules

An exact penalization approach is described that extends the class of problems that can be solved with line search sequential quadratic programming methods, and it is shown that the algorithm enjoys favorable global convergence properties.

Active Sets, Nonsmoothness, and Sensitivity

It is shown under a natural regularity condition that critical points of partly smooth functions are stable: small perturbations to the function cause small movements of the critical point on the active manifold.