• Corpus ID: 238856923

A more direct and better variant of New Q-Newton's method Backtracking for m equations in m variables

  • T. Truong
  • Published 14 October 2021
  • Computer Science, Mathematics
  • ArXiv
In some recent (joint) papers, the authors have developed a new family of modifications of Newton's method for optimization, into which Backtracking line search can be incorporated. The new method, called New Q-Newton's method (and its Backtracking version), has good theoretical guarantees concerning convergence to critical points, avoidance of saddle points, and rate of convergence. This method can be used to solve a system of equations g_1 = … = g_N = 0 by applying it to the function f = g_1^2 + … + g_N^2.
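The reduction described in the abstract — solving a system of m equations in m variables with a Newton-type iteration — can be illustrated with a minimal plain-Newton solver. This is a sketch only: the paper's New Q-Newton's method additionally perturbs the Jacobian when it is (near-)singular and shrinks steps with Backtracking line search, none of which is shown here, and all function names below are illustrative.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x


def newton_system(g, jac, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration for g(x) = 0 with g: R^m -> R^m.

    Sketch only: New Q-Newton's method also regularizes the Jacobian
    and uses Backtracking line search; neither is modeled here.
    """
    x = list(x0)
    for _ in range(max_iter):
        gx = g(x)
        if max(abs(v) for v in gx) < tol:
            break
        step = solve_linear(jac(x), gx)  # Newton step solves J(x) d = g(x)
        x = [xi - si for xi, si in zip(x, step)]
    return x


# example: intersect the unit circle with the line y = x (m = 2)
root = newton_system(
    lambda x: [x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]],
    lambda x: [[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]],
    [1.0, 0.5],
)
```

Starting from (1, 0.5), the iteration converges to (1/√2, 1/√2) in a handful of steps.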


Iterative methods for optimization
  • C. Kelley
  • Mathematics, Computer Science
    Frontiers in applied mathematics
  • 1999
Iterative Methods for Optimization does more than cover traditional gradient-based optimization: it is the first book to treat sampling methods, including the Hooke & Jeeves, implicit filtering, MDS, and Nelder & Mead schemes in a unified way.
On the Quadratic Convergence of the Levenberg-Marquardt Method without Nonsingularity Assumption
If ||F(x)|| provides a local error bound for the system of nonlinear equations F(x) = 0, it is shown that the sequence {x_k} generated by the new method converges to a solution quadratically, which is stronger than dist(x_k, X*) → 0 given by Yamashita and Fukushima.
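For a scalar equation, the Levenberg–Marquardt step this entry analyzes can be sketched in a few lines. The damping choice mu = ||F(x)||^2 below follows the Yamashita–Fukushima setting the snippet refers to; this is a minimal illustration, not the cited paper's full method.

```python
def levenberg_marquardt_1d(F, dF, x0, tol=1e-12, max_iter=100):
    """Scalar Levenberg-Marquardt sketch: take the step d solving
    (J^2 + mu) d = -J * F with damping mu = F(x)^2 (illustrative only).
    """
    x = x0
    for _ in range(max_iter):
        Fx, Jx = F(x), dF(x)
        if abs(Fx) < tol:
            break
        mu = Fx * Fx  # damping vanishes near a root, recovering Newton's rate
        x -= (Jx * Fx) / (Jx * Jx + mu)
    return x


# example: find sqrt(2) as the positive root of F(x) = x^2 - 2
root = levenberg_marquardt_1d(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Because mu → 0 as the residual shrinks, the step approaches the pure Newton step near the solution, which is where the quadratic rate comes from.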
Local convergence of the Levenberg–Marquardt method under Hölder metric subregularity
An adaptive formula for the Levenberg–Marquardt parameter is proposed, and the local convergence of the method is analyzed under Hölder metric subregularity of the function defining the equation and Hölder continuity of its gradient mapping.
Modified inexact Levenberg–Marquardt methods for solving nonlinear least squares problems
Preliminary numerical experiments show that the proposed modified inexact Levenberg–Marquardt method and its global version outperform the classical inexact LMM on nonlinear least squares problems (NLSP), especially in the underdetermined case.
On the Rate of Convergence of the Levenberg-Marquardt Method
We consider a rate of convergence of the Levenberg-Marquardt method (LMM) for solving a system of nonlinear equations F(x) = 0, where F is a mapping from Rn into Rm. It is well-known that LMM has a
Communication: Newton homotopies for sampling stationary points of potential energy landscapes.
This work proposes an efficient implementation of Newton homotopies, which can sample a large number of the stationary points of complicated many-body potentials and demonstrates how the procedure works by applying it to the nearest-neighbor ϕ(4) model and atomic clusters.
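The Newton-homotopy idea this entry describes — deform an easy problem into F(x) = 0 and track the solution along the way — can be sketched for a scalar function as follows. This is an illustrative toy under simple assumptions (one variable, fixed step count), not the cited paper's implementation for potential energy landscapes.

```python
def newton_homotopy_1d(F, dF, x0, steps=20, corrections=5):
    """Track the root of H(x, t) = F(x) - (1 - t) * F(x0) from t = 0
    (solved by x0 by construction) to t = 1 (a root of F), applying a
    few Newton corrections at each continuation step.
    """
    F0 = F(x0)
    x = x0
    for k in range(1, steps + 1):
        t = k / steps
        for _ in range(corrections):
            # Newton correction on the deformed equation H(x, t) = 0
            x -= (F(x) - (1.0 - t) * F0) / dF(x)
    return x


# example: continue from x0 = 3 down to the root sqrt(2) of x^2 - 2
root = newton_homotopy_1d(lambda x: x * x - 2.0, lambda x: 2.0 * x, 3.0)
```

At t = 0 the start point is a root by construction, so each continuation step only needs a short Newton correction from a good initial guess — the property that makes homotopies effective for sampling many stationary points.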
On a problem posed by Steve Smale
The 17th of the problems proposed by Steve Smale for the 21st century asks for the existence of a deterministic algorithm computing an approximate solution of a system of n complex polynomials in n unknowns.
Every algebraic set in n-space is the intersection of n hypersurfaces
Smale’s 17th problem: Average polynomial time to compute affine and projective solutions
Smale’s 17th Problem asks: “Can a zero of n complex polynomial equations in n unknowns be found approximately, on the average, in polynomial time with a uniform algorithm?”. We give a positive answer
Some convergent results for Backtracking Gradient Descent method on Banach spaces
The main result concerns the following condition: whenever $\{x_n\}$ weakly converges to $x$ and $\lim_{n\rightarrow\infty}\|\nabla f(x_n)\|=0$, then $\nabla f(x)=0$.