The convergence of variable metric matrices in unconstrained optimization

@article{Ge1983TheCO,
  title={The convergence of variable metric matrices in unconstrained optimization},
  author={Renpu Ge and M. J. D. Powell},
  journal={Mathematical Programming},
  year={1983},
  volume={27},
  pages={123--143}
}
  • R. Ge, M. Powell
  • Published 1 October 1983
  • Mathematics
  • Mathematical Programming
It is proved that, if the DFP or BFGS algorithm with step-lengths of one is applied to a function F(x) that has a Lipschitz continuous second derivative, and if the calculated vectors of variables converge to a point at which ∇F is zero and ∇²F is positive definite, then the sequence of variable metric matrices also converges. The limit of this sequence is identified in the case when F(x) is a strictly convex quadratic function.
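For reference, the DFP and BFGS updates analysed here are usually written in the following standard textbook form (see, e.g., the Dennis–Moré survey in the references below); the notation s_k, y_k, B_k, H_k is ours, not the paper's:

\[
B_{k+1}^{\mathrm{BFGS}} \;=\; B_k \;-\; \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} \;+\; \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
H_{k+1}^{\mathrm{DFP}} \;=\; H_k \;-\; \frac{H_k y_k y_k^{\top} H_k}{y_k^{\top} H_k y_k} \;+\; \frac{s_k s_k^{\top}}{y_k^{\top} s_k},
\]

where s_k = x_{k+1} − x_k, y_k = ∇F(x_{k+1}) − ∇F(x_k), B_k approximates ∇²F(x_k) and H_k approximates its inverse. With step-lengths of one, the iteration is x_{k+1} = x_k − H_k ∇F(x_k), equivalently x_{k+1} = x_k − B_k⁻¹ ∇F(x_k).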
Convergence of quasi-Newton matrices generated by the symmetric rank one update
TLDR
Conditions under which these approximations can be proved to converge globally to the true Hessian matrix are given, in the case where the Symmetric Rank One update formula is used.
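For context, the Symmetric Rank One (SR1) update referred to above is commonly written as follows (standard form, with s_k = x_{k+1} − x_k and y_k the corresponding gradient difference; not a quotation from the cited paper):

\[
B_{k+1} \;=\; B_k \;+\; \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
\]

which is defined only when the denominator is nonzero, so convergence analyses typically assume it is bounded away from zero or skip the update otherwise.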
The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients
TLDR
The main purpose of this paper is the extension of Powell's (1976) global convergence result to the partitioned BFGS method introduced by Griewank and Toint (1982), and a damping of the BFGS update that becomes inactive if the problem turns out to be regular near x*.
The convergence of matrices generated by rank-2 methods from the restricted β-class of Broyden
Summary: It is shown that the matrices B_k generated by any method from the restricted β-class of Broyden converge if the method is applied to the unconstrained minimization of a function f ∈ C²(ℝⁿ) with …
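One common way to parameterize the Broyden family (in Hessian-approximation form; Broyden's original β-parameterization differs, but the restricted class corresponds, up to reparameterization, to the convex combinations of DFP and BFGS) is

\[
B_{k+1}^{\phi} \;=\; B_{k+1}^{\mathrm{BFGS}} \;+\; \phi\,(s_k^{\top} B_k s_k)\, v_k v_k^{\top},
\qquad
v_k \;=\; \frac{y_k}{y_k^{\top} s_k} \;-\; \frac{B_k s_k}{s_k^{\top} B_k s_k},
\]

with φ = 0 giving BFGS, φ = 1 giving DFP, and the restricted class obtained for φ ∈ [0, 1].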
Rates of convergence for secant methods on nonlinear problems in Hilbert space
The numerical performance of iterative methods applied to discretized operator equations may depend strongly on their theoretical rate of convergence on the underlying problem g(x) = 0 in Hilbert space …
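Common background for the secant (quasi-Newton) methods discussed on this page, stated here for orientation rather than quoted from the cited paper: each method chooses the new approximation to satisfy the secant condition

\[
B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = g(x_{k+1}) - g(x_k),
\]

where g is the gradient ∇F in minimization or the residual map in nonlinear equations g(x) = 0.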
Solving reachability problems by a scalable constrained optimization method
TLDR
This paper investigates the problem of finding an evolution of a dynamical system that originates and terminates in given sets of states and finds a scalable approach for solving it.
A Theoretical and Experimental Study of the Symmetric Rank-One Update
TLDR
A new analysis is presented that shows that the SR1 method with a line search is (n + 1)-step q-superlinearly convergent without the assumption of linearly independent iterates.
Sequential quadratic programming with indefinite Hessian approximations for nonlinear optimum experimental design for parameter estimation in differential–algebraic equations
TLDR
Algorithms are developed for the numerical solution of problems from nonlinear optimum experimental design (OED) for parameter estimation in differential–algebraic equations, together with a filter line-search globalization strategy that accepts indefinite Hessians based on a new criterion derived from the proof of global convergence.
Convergence properties of the Broyden-like method for mixed linear-nonlinear systems of equations
TLDR
This is the first time that convergence of the Broyden-like matrices is proven for n > 1, albeit only for a special case, and the property that the iterates belong to an affine subspace is used.
On the convergence of Broyden's method and some accelerated schemes for singular problems
TLDR
It is shown that the use of a preceding Newton-like step ensures convergence for starting points in a starlike domain with density 1, and it is established that the matrix updates of Broyden's method converge q-linearly with the same asymptotic factor as the iterates.
Greedy and Random Broyden's Methods with Explicit Superlinear Convergence Rates in Nonlinear Equations
TLDR
This work proposes the greedy and random Broyden's methods for solving nonlinear equations, and establishes explicit (local) superlinear convergence rates for both methods if the initial point and approximate Jacobian are close enough to a solution and the corresponding Jacobian.
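The Broyden's-method updates discussed in the three entries above target nonlinear equations g(x) = 0 rather than minimization; the classical ("good") Broyden update of the Jacobian approximation is, in standard textbook notation (not taken from these papers),

\[
B_{k+1} \;=\; B_k \;+\; \frac{(y_k - B_k s_k)\, s_k^{\top}}{s_k^{\top} s_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = g(x_{k+1}) - g(x_k),
\]

i.e. the smallest change to B_k in the Frobenius norm that satisfies the secant condition B_{k+1} s_k = y_k.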

References

On the Convergence of the Variable Metric Algorithm
TLDR
It is proved that successful convergence is obtained provided that the objective function has a strictly positive definite second derivative matrix for all values of its variables.
The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
TLDR
This paper presents a more detailed analysis than has previously appeared of a class of minimization algorithms that includes the DFP (Davidon-Fletcher-Powell) method as a special case, and investigates how the successive errors depend, again for quadratic functions, upon the initial choice of iteration matrix.
Quasi-Newton Methods, Motivation and Theory
This paper is an attempt to motivate and justify quasi-Newton methods as useful modifications of Newton's method for general and gradient nonlinear systems of equations. References are given to …
The algebraic eigenvalue problem
Contents: Theoretical background; Perturbation theory; Error analysis; Solution of linear algebraic equations; Hermitian matrices; Reduction of a general matrix to condensed form; Eigenvalues of matrices of …
A New Approach to Variable Metric Algorithms