
- M. J. D. Powell
- Math. Program.
- 1977

We are concerned in this paper with the general problem of finding an unrestricted local minimum of a function f(x₁, x₂, …, xₙ) of several variables x₁, x₂, …, xₙ. We suppose that the function of interest can be calculated at all points. It is convenient to group functions into two main classes according to whether the gradient vector gᵢ = ∂f/∂xᵢ is…

- M. J. D. Powell, Ya-Xiang Yuan
- Math. Program.
- 1991

- M. J. D. Powell, Malcolm A. Sabin
- ACM Trans. Math. Softw.
- 1977

The problem of constructing a function φ(x, y) of two variables on a triangle, such that φ(x, y) and its first derivatives take given values at the vertices, where φ(x, y) is composed of quadratic pieces, is considered. Two methods of constructing piecewise quadratic approximations are described which have the property that, if they are applied on each…

- M. J. D. Powell
- Math. Program.
- 2002

UOBYQA is a new algorithm for general unconstrained optimization calculations that takes account of the curvature of the objective function, F say, by forming quadratic models by interpolation. Therefore, because no first derivatives are required, each model is defined by ½(n+1)(n+2) values of F, where n is the number of variables, and the interpolation…
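The ½(n+1)(n+2) count comes from the number of free coefficients in a full quadratic model. A minimal sketch of that arithmetic (the helper name is ours, not from the paper):

```python
def quadratic_model_conditions(n):
    """Interpolation conditions needed to pin down a full quadratic
    model in n variables: 1 constant term + n linear terms
    + n*(n+1)//2 independent entries of a symmetric Hessian
    = (n+1)*(n+2)//2 function values."""
    return (n + 1) * (n + 2) // 2

# For n = 10 variables the model needs 66 values of F,
# so the interpolation cost grows quadratically with n.
print(quadratic_model_conditions(10))
```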

- M. J. D. Powell
- Math. Program.
- 1978

Lagrangian functions are the basis of many of the more successful methods for nonlinear constraints in optimization calculations. Sometimes they are used in conjunction with linear approximations to the constraints and sometimes penalty terms are included to allow the use of algorithms for unconstrained optimization. Much has been discovered about these…

- M. J. D. Powell
- Math. Program.
- 1984

Many trust region algorithms for unconstrained minimization have excellent global convergence properties if their second derivative approximations are not too large [2]. We consider how large these approximations have to be, if they prevent convergence when the objective function is bounded below and continuously differentiable. Thus we obtain a useful…

- M. J. D. Powell
- Math. Program.
- 2003

We consider some algorithms for unconstrained minimization without derivatives that form linear or quadratic models by interpolation to values of the objective function. Then a new vector of variables is calculated by minimizing the current model within a trust region. Techniques are described for adjusting the trust region radius, and for choosing…
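The radius adjustment mentioned in the abstract follows a standard trust-region pattern: compare the actual reduction in the objective with the reduction predicted by the model, then shrink or grow the radius accordingly. A minimal sketch under that generic scheme (the function name and constants are illustrative, not Powell's specific rules):

```python
def update_radius(rho, step_norm, delta,
                  eta_bad=0.25, eta_good=0.75,
                  shrink=0.5, grow=2.0):
    """Generic trust-region radius rule.

    rho       -- (actual reduction) / (predicted reduction) for the last step
    step_norm -- length of the step taken
    delta     -- current trust region radius

    Shrink when the model predicted poorly; grow only when the model
    predicted well AND the step was restricted by the boundary.
    """
    if rho < eta_bad:
        return shrink * delta
    if rho > eta_good and abs(step_norm - delta) < 1e-12:
        return grow * delta
    return delta

# A poor model fit halves the radius; a good fit at the boundary doubles it.
print(update_radius(0.1, 1.0, 1.0), update_radius(0.9, 1.0, 1.0))
```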

- M. J. D. Powell
- Math. Program.
- 2004

Quadratic models of objective functions are highly useful in many optimization algorithms. They are updated regularly to include new information about the objective function, such as the difference between two gradient vectors. We consider the case, however, when each model interpolates some function values, so an update is required when a new function value…

- M. J. D. Powell
- 1998

Many different procedures have been proposed for optimization calculations when first derivatives are not available. Further, several researchers have contributed to the subject, including some who wish to prove convergence theorems, and some who wish to make any reduction in the least calculated value of the objective function. There is not even a key idea…