# The BOBYQA algorithm for bound constrained optimization without derivatives

@inproceedings{Powell2009TheBA, title={The BOBYQA algorithm for bound constrained optimization without derivatives}, author={M. J. D. Powell}, year={2009} }

BOBYQA is an iterative algorithm for finding a minimum of a function F(x), x ∈ ℝⁿ, subject to bounds a ≤ x ≤ b on the variables, F being specified by a "black box" that returns the value F(x) for any feasible x. Each iteration employs a quadratic approximation Q to F that satisfies Q(y_j) = F(y_j), j = 1, 2, ..., m, the interpolation points y_j being chosen and adjusted automatically, but m is a prescribed constant, the value m = 2n+1 being typical. These conditions leave much freedom in Q, taken up…
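The interpolation conditions above can be illustrated with a small numpy sketch. With m = 2n+1 points (the base point plus one step along each coordinate in both directions), a quadratic model with a diagonal Hessian is uniquely determined. This is an illustrative toy only, not BOBYQA itself (which allows a full Hessian, updates the model by a least-Frobenius-norm rule, and works inside a trust region); the function names and step size h are my own.

```python
import numpy as np

def build_diag_quadratic_model(F, x0, h=0.1):
    """Fit Q(x) = c + g.(x - x0) + 0.5 (x - x0)^T diag(d) (x - x0)
    so that Q interpolates F at the m = 2n+1 points x0 and x0 +/- h*e_i.
    Toy sketch of the interpolation conditions; not the BOBYQA update."""
    n = len(x0)
    c = F(x0)                              # value at the base point
    g = np.empty(n)
    d = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = F(x0 + e), F(x0 - e)
        g[i] = (fp - fm) / (2 * h)         # central difference -> gradient entry
        d[i] = (fp - 2 * c + fm) / h**2    # second difference -> diagonal Hessian entry

    def Q(x):
        s = np.asarray(x) - x0
        return c + g @ s + 0.5 * np.sum(d * s * s)

    return Q

# The model reproduces F exactly at all 2n+1 interpolation points.
F = lambda x: (x[0] - 1)**2 + 3 * (x[1] + 2)**2
x0 = np.array([0.0, 0.0])
Q = build_diag_quadratic_model(F, x0)
print(abs(Q(x0) - F(x0)))  # exact at the base point, up to rounding
```

Because this toy F happens to be a quadratic with diagonal Hessian, the fitted Q agrees with F everywhere, which makes the interpolation property easy to check numerically.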

## 1,074 Citations

A derivative-free algorithm for linearly constrained optimization problems

- Mathematics, Computer Science · Comput. Optim. Appl.
- 2014

Numerical results are presented which show that LCOBYQA works well and is very competitive with available model-based derivative-free algorithms.

Beyond symmetric Broyden for updating quadratic models in minimization without derivatives

- Mathematics · Math. Program.
- 2013

An extension of this technique that combines changes in first derivatives with changes in second derivatives is considered, which allows very high accuracy to be achieved in practice when F is a homogeneous quadratic function.

A trust-region derivative-free algorithm for constrained optimization

- Computer Science · Optim. Methods Softw.
- 2015

A trust-region algorithm for constrained optimization problems in which the derivatives of the objective function are not available; the objective is approximated by a model obtained by quadratic interpolation, which is then minimized within the intersection of the feasible set and the trust region.

Efficient global unconstrained black box optimization

- Computer Science
- 2018

For the unconstrained optimization of black box functions, this paper presents a new stochastic algorithm called VSBBO. In practice, VSBBO matches the quality of other state-of-the-art algorithms for…

Escaping local minima with derivative-free methods: a numerical investigation

- Computer Science
- 2018

It is found numerically that Py-BOBYQA is competitive with global optimization solvers for all accuracy/budget regimes, in both smooth and noisy settings, and best performing for smooth and multiplicative noise problems in high-accuracy regimes.

Hermite-type modifications of BOBYQA for optimization with some partial derivatives

- Mathematics
- 2022

In this work we propose two Hermite-type optimization methods, Hermite least squares and Hermite BOBYQA, specialized for the case that some partial derivatives of the objective function are available…

UNIPOPT: Univariate projection-based optimization without derivatives

- Mathematics, Computer Science · Comput. Chem. Eng.
- 2019

Global convergence of a derivative-free inexact restoration filter algorithm for nonlinear programming

- Computer Science
- 2017

This work presents an algorithm for solving constrained optimization problems that does not make explicit use of the objective function derivatives, and proves that the full steps are efficient in the sense that, near a feasible nonstationary point, the decrease in the objective function is relatively large, ensuring the global convergence results of the algorithm.

A derivative-free comirror algorithm for convex optimization

- Mathematics, Computer Science · Optim. Methods Softw.
- 2015

It is shown that, if the sampling radii for linear interpolation are properly selected, then the new algorithm has the same convergence rate as the original gradient-based algorithm, providing a novel global rate-of-convergence result for nonsmooth convex DFO with nonsmooth convex constraints.

DIRECT using local search on surrogates

- Computer Science
- 2011

A derivative-free optimization algorithm is developed that combines the advantages of both local and global methods and determines better candidates for sampling than the hypercube center points chosen by DIRECT, especially in the presence of constraints.

## References


The NEWUOA software for unconstrained optimization without derivatives

- Computer Science
- 2006

The NEWUOA software seeks the least value of a function F(x), x ∈ ℝⁿ, when F(x) can be calculated for any vector of variables x; a quadratic model Q ≈ F is required at the beginning of each iteration and is used in a trust region procedure for adjusting the variables.

Developments of NEWUOA for minimization without derivatives

- Computer Science
- 2008

The NEWUOA software is described briefly, with some numerical results that show good efficiency and accuracy in the unconstrained minimization without derivatives of functions of up to 320 variables…

Least Frobenius norm updating of quadratic models that satisfy interpolation conditions

- Mathematics · Math. Program.
- 2004

A method is presented for updating all the coefficients of the quadratic Lagrange functions of the current interpolation problem in O((m+n)²) operations, which allows the model to be updated too, and which has a useful stability property that is investigated in some numerical experiments.
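The variational problem behind this update — among all quadratics interpolating the data, pick the one whose Hessian changes least in the Frobenius norm — can be sketched as a small KKT linear system. The 2-D instance below, with its 5 interpolation points, function, and previous model, is entirely made up for illustration and is not from the paper:

```python
import numpy as np

# Toy 2-D instance: among all quadratics Q(x) = c + g.x + 0.5 x^T H x that
# interpolate F at the m = 5 points below, pick the one minimizing
# ||H - H_old||_F (least Frobenius norm change of the Hessian).
h = 0.5
pts = np.array([[0, 0], [h, 0], [-h, 0], [0, h], [0, -h]], float)
F = lambda x: 1 + x[0] + 2 * x[1] + x[0]**2 + 4 * x[0] * x[1] + 3 * x[1]**2
b = np.array([F(p) for p in pts])

# Unknowns z = (c, g1, g2, H11, H12, H22); interpolation rows give A z = b.
A = np.array([[1, p[0], p[1], 0.5 * p[0]**2, p[0] * p[1], 0.5 * p[1]**2]
              for p in pts])
# ||H - H_old||_F^2 weights: the off-diagonal entry appears twice in H.
D = np.diag([0, 0, 0, 1.0, 2.0, 1.0])
z_old = np.array([0, 0, 0, 0, 7.0, 0])   # previous model had H12 = 7

# KKT system for: minimize 0.5 (z - z_old)^T D (z - z_old)  s.t.  A z = b.
K = np.block([[D, A.T], [A, np.zeros((5, 5))]])
z = np.linalg.solve(K, np.concatenate([D @ z_old, b]))[:6]
print(z)  # c, g, H11, H22 are fixed by the data; H12 keeps its old value 7
```

The points here pin down c, g, and the diagonal of H, while the cross term H12 is unconstrained, so the Frobenius-norm objective carries it over unchanged from the old model — the same mechanism by which Powell's update preserves second-derivative information the new data cannot see.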

Introduction to Derivative-Free Optimization, SIAM Publications (Philadelphia)

- 2009