Wolfe conditions

Also known as: Wolfe condition. (The related Goldstein conditions are a distinct pair of inequalities sometimes listed alongside them.)
In unconstrained minimization, the Wolfe conditions are a set of inequalities for performing an inexact line search, especially in quasi-Newton methods…
Wikipedia
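To make the summary above concrete, here is a minimal sketch of an inexact line search that returns a step satisfying the (weak) Wolfe conditions via a standard bisection scheme. The function names, default constants, and the quadratic used in the usage note are illustrative assumptions, not taken from any of the papers listed below.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Return a step t along descent direction d satisfying the weak Wolfe
    conditions: sufficient decrease (Armijo) and the curvature condition,
    with 0 < c1 < c2 < 1. Bisection sketch; illustrative, not production code."""
    fx = f(x)
    g0 = grad(x) @ d              # directional derivative at t = 0 (must be < 0)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * g0:
            hi = t                # sufficient decrease fails: step too long
        elif grad(x + t * d) @ d < c2 * g0:
            lo = t                # curvature condition fails: step too short
        else:
            return t              # both Wolfe conditions hold
        t = (lo + hi) / 2.0 if np.isfinite(hi) else 2.0 * lo
    return t
```

For instance, on the quadratic f(x) = ½‖x‖² from x = (1, 1) along d = −∇f(x), the unit step already satisfies both inequalities, so the search returns t = 1.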

Topic mentions per year: 1997–2018 [chart omitted]

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2016
This article describes a new Riemannian conjugate gradient method and presents a global convergence analysis. The existing…
  • figure 1
  • figure 2
  • table 1
  • table 2
  • table 3
2015
In deterministic optimization problems, line search routines are a standard tool ensuring stability and efficiency. In the…
  • figure 1
  • figure 2
  • figure 3
  • figure 4
  • figure 5
Highly Cited
2013
We investigate the behavior of quasi-Newton algorithms applied to minimize a nonsmooth function f, not necessarily convex. We…
  • figure 1
  • figure 2
  • figure 3
  • figure 4
2009
It is well known that global convergence has not been established for the Polak–Ribière–Polyak (PRP) conjugate gradient method…
  • table 1
  • table 2
  • table 3
  • figure 1
2007
This paper proposes a line search technique to satisfy a relaxed form of the strong Wolfe conditions in order to guarantee the…
  • table 1
  • table 2
  • table 3
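For reference, the strong Wolfe conditions this entry refers to (in their standard textbook form, not the paper's relaxed variant) pair the sufficient-decrease inequality with an absolute-value bound on the directional derivative:

```latex
% Strong Wolfe conditions, with constants 0 < c_1 < c_2 < 1:
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\mathsf T} d_k,
\qquad
\bigl|\nabla f(x_k + \alpha_k d_k)^{\mathsf T} d_k\bigr|
  \le c_2 \bigl|\nabla f(x_k)^{\mathsf T} d_k\bigr|.
```

The absolute value rules out steps where the slope is still strongly positive, which the weak curvature condition alone would accept.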
Highly Cited
2006
Recently, a new nonlinear conjugate gradient scheme was developed which satisfies the descent condition gᵀ…
  • figure 1
  • table II
  • figure 3
  • table III
  • figure 4
Highly Cited
2005
A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and…
  • figure 4.1
  • figure 4.2
  • figure 4.3
  • figure 4.4
  • figure 5.1
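For context on entries like this one (this is not the authors' proposed method, which the teaser only sketches), the classical Fletcher–Reeves nonlinear conjugate gradient iteration with a simple Armijo backtracking line search looks as follows; the test objective and all names are illustrative assumptions.

```python
import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=200):
    """Classical Fletcher-Reeves nonlinear CG with Armijo backtracking.
    Illustrative sketch only; modern CG variants use stronger line searches."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        slope = g @ d
        if slope >= 0.0:          # safeguard: restart along steepest descent
            d = -g
            slope = g @ d
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5              # Armijo backtracking
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

With only Armijo backtracking the search direction is not guaranteed to be a descent direction, hence the steepest-descent restart safeguard; a Wolfe line search removes the need for it.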
2002
In this work, an efficient training algorithm for feedforward neural networks is presented. It is based on a scaled version of…
  • table 1
  • table 2
  • table 3
  • table 4
Highly Cited
1999
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. However, the strong…
1997
This paper describes a reduced quasi-Newton method for solving equality constrained optimization problems. A major difficulty…
  • figure 3.1
  • figure 3.2
  • figure 3.3
  • figure 3.4
  • table 5.1