Michael J. D. Powell. 29 July 1936—19 April 2015

@article{Buhmann2018MichaelJD,
  title={Michael J. D. Powell. 29 July 1936—19 April 2015},
  author={Martin D. Buhmann and Roger Fletcher and Arieh Iserles and Philippe L. Toint},
  journal={Biographical Memoirs of Fellows of the Royal Society},
  year={2018},
  pages={341--366}
}
Michael James David Powell was a British numerical analyst who was among the pioneers of computational mathematics. During a long and distinguished career, first at the Atomic Energy Research Establishment (AERE) Harwell and subsequently as the John Humphrey Plummer Professor of Applied Numerical Analysis in Cambridge, he contributed decisively towards establishing optimization theory as an effective tool of scientific enquiry, replete with highly effective methods and mathematical… 

References

Showing 1-10 of 62 references

Direct search algorithms for optimization calculations

TLDR
The survey addresses line search methods, the restriction of vectors of variables to discrete grids, the use of geometric simplices, conjugate direction procedures, trust region algorithms that form linear or quadratic approximations to the objective function, and simulated annealing.
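
For flavour, here is a minimal compass-search sketch in Python; the function name compass_search, the quadratic test function, and all parameter values are our own illustrative choices rather than an algorithm taken from the paper.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-8, max_iter=10_000):
    """Minimal compass (coordinate) direct search: poll the 2n axis
    directions and halve the step when no direction improves f."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for sign in (1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                f_trial = f(trial)
                if f_trial < fx:      # accept any improving poll point
                    x, fx = trial, f_trial
                    improved = True
        if not improved:
            step *= 0.5               # shrink the grid after a failed poll
            if step < tol:
                break
    return x, fx

# Illustrative run on a smooth quadratic; the minimizer is (1, -2).
xmin, fmin = compass_search(lambda x: (x[0] - 1.0) ** 2 + 4.0 * (x[1] + 2.0) ** 2,
                            x0=[0.0, 0.0])
print(xmin, fmin)
```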

On the number of iterations of Karmarkar's algorithm for linear programming

TLDR
It is proved that the solution of the example by Karmarkar's original algorithm can require about n/20 iterations, and it is found that the algorithm makes changes to the variables that are closely related to the steps of the simplex method.

Beyond symmetric Broyden for updating quadratic models in minimization without derivatives

TLDR
An extension of this technique that combines changes in first derivatives with changes in second derivatives is considered, which allows very high accuracy to be achieved in practice when F is a homogeneous quadratic function.
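
The "symmetric Broyden" baseline being extended here can be illustrated by the classical PSB least-change update, which makes the smallest symmetric change to the model Hessian in the Frobenius norm subject to a secant condition; this sketch is our own and does not reproduce the paper's extension.

```python
import numpy as np

def psb_update(B, s, y):
    """Powell symmetric Broyden (PSB) update: the symmetric matrix closest
    to B in the Frobenius norm satisfying the secant condition B_new @ s = y."""
    r = y - B @ s                      # residual of the secant condition
    ss = float(s @ s)
    return (B
            + (np.outer(r, s) + np.outer(s, r)) / ss
            - (float(r @ s) / ss ** 2) * np.outer(s, s))

# Illustrative check: the secant condition holds after the update.
rng = np.random.default_rng(0)
s, y = rng.standard_normal(3), rng.standard_normal(3)
print(np.allclose(psb_update(np.eye(3), s, y) @ s, y))   # True
```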

Karmarkar's algorithm: a view from nonlinear programming

TLDR
This work describes and studies Karmarkar's algorithm for linear programming in its usual setting, and finds a way of applying the algorithm directly to linear programming problems in general form, where there is no need to increase the original number of variables even when there are very many constraints.

A View of Algorithms for Optimization without Derivatives

TLDR
It is found that least values of functions of more than 100 variables can be calculated, and attention is given to methods that are suitable for noisy functions and that change the variables in ways that are not random.

On the use of quadratic models in unconstrained minimization without derivatives

M. Powell, Optim. Methods Softw., 2004
TLDR
This work addresses the construction of suitable quadratic models Q by interpolating values of the objective function F; the model is chosen to minimize the Frobenius norm of the change to ∇²Q, subject to the interpolation conditions that have been mentioned.
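
As a rough illustration of this construction, the sketch below solves the special case in which the previous model is zero, so the Frobenius norm of ∇²Q itself is minimized subject to the interpolation conditions, via the KKT system of the equality-constrained problem; the function name and test values are illustrative assumptions.

```python
import numpy as np

def min_frobenius_quadratic(Y, f):
    """Fit Q(x) = c + g.x + 0.5 x'Hx interpolating f at the rows of Y while
    minimizing the Frobenius norm of H (the zero-previous-model case).
    The KKT system uses the kernel 0.5 * (y_i . y_j)^2 with a linear tail."""
    m, n = Y.shape
    A = 0.5 * (Y @ Y.T) ** 2               # A_ij = (y_i . y_j)^2 / 2
    P = np.hstack([np.ones((m, 1)), Y])    # constant and linear tail
    K = np.block([[A, P], [P.T, np.zeros((n + 1, n + 1))]])
    sol = np.linalg.solve(K, np.concatenate([f, np.zeros(n + 1)]))
    lam, c, g = sol[:m], sol[m], sol[m + 1:]
    H = (Y.T * lam) @ Y                    # H = sum_j lam_j y_j y_j'
    return c, g, H

# Illustrative check with 5 points in R^2 (fewer than the 6 that would
# determine a quadratic uniquely, hence the minimum-norm choice matters).
rng = np.random.default_rng(1)
Y = rng.standard_normal((5, 2))
fvals = np.array([1.0 + x[0] + 3.0 * x[0] * x[1] for x in Y])
c, g, H = min_frobenius_quadratic(Y, fvals)
print(np.allclose([c + g @ x + 0.5 * x @ H @ x for x in Y], fvals))  # True
```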

A Direct Search Optimization Method That Models the Objective and Constraint Functions by Linear Interpolation

TLDR
An iterative algorithm is described for nonlinearly constrained optimization calculations when there are no derivatives; on each iteration a new vector of variables is calculated, which may replace one of the current vertices, either to improve the shape of the simplex or because it is the best vector that has been found so far.
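
The linear models in question interpolate function values at the n + 1 vertices of a simplex; here is a minimal sketch of that building block only (the function name is ours, and the full algorithm is not reproduced).

```python
import numpy as np

def linear_model(V, f):
    """Interpolate f at the n + 1 simplex vertices (rows of V) by the unique
    affine model m(x) = c + g.x, the kind used for objective and constraints."""
    m, n = V.shape
    assert m == n + 1, "a simplex in R^n has n + 1 vertices"
    A = np.hstack([np.ones((m, 1)), V])   # rows [1, v_i'], so A @ (c, g) = f
    coef = np.linalg.solve(A, f)
    return coef[0], coef[1:]              # c, g

# Illustrative check: the affine function 2 + x0 - 3*x1 is reproduced.
V = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
c, g = linear_model(V, np.array([2.0, 3.0, -1.0]))
print(c, g)   # 2.0 [ 1. -3.]
```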

On the convergence of trust region algorithms for unconstrained minimization without derivatives

TLDR
It is proved that, if F is bounded below, if ∇²F is also bounded, and if the number of iterations is infinite, then the sequence of gradients ∇F(x_k), k = 1, 2, 3, …, converges to zero, where x_k is the centre of the trust region of the k-th iteration.
...