Gradient method

In optimization, a gradient method is an algorithm to solve problems of the form min_{x ∈ ℝⁿ} f(x), with the search directions defined by the gradient of the function at the current point.
Wikipedia
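
To make the definition concrete, here is a minimal gradient-descent sketch in Python/NumPy; the quadratic objective, fixed step size, and function names are illustrative choices, not part of the article.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Gradient method: repeatedly step against the gradient at the current point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)  # search direction is -grad f(x)
    return x

# Example: minimize f(x) = 0.5 * ||x - b||^2, whose gradient is x - b.
b = np.array([1.0, -2.0])
print(gradient_descent(lambda x: x - b, x0=np.zeros(2)))  # converges toward b
```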

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2014
In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently…
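
A rough sketch of a SAGA-style update in Python/NumPy, in the usual finite-sum setting min (1/n) Σ f_j(x). The least-squares example, step size, and helper names are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def saga(grad_i, n, x0, step, iters, seed=0):
    """SAGA-style loop: variance-reduced steps using a table of stored gradients."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    table = np.array([grad_i(j, x) for j in range(n)])  # one stored gradient per f_j
    avg = table.mean(axis=0)
    for _ in range(iters):
        j = rng.integers(n)
        g = grad_i(j, x)
        x = x - step * (g - table[j] + avg)  # unbiased, variance-reduced direction
        avg += (g - table[j]) / n            # keep the running average consistent
        table[j] = g
    return x

# Example: least squares with f_j(x) = 0.5 * (a_j @ x - b_j)^2.
rng = np.random.default_rng(1)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
x = saga(lambda j, v: A[j] * (A[j] @ v - b[j]), n=50, x0=np.zeros(5),
         step=0.02, iters=5000)
print(np.linalg.norm(A.T @ (A @ x - b)))  # gradient norm, near zero at the optimum
```

The stored table is what separates SAGA-type methods from plain stochastic gradient: the correction g - table[j] + avg keeps the step unbiased while its variance shrinks as the iterates converge.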
Highly Cited
2014
We consider the problem of minimizing the sum of two convex functions: one is the average of a large number of smooth component…
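
The setting described here (an average of many smooth components plus a second convex term) is the one handled by proximal variance-reduced methods. Below is a hedged sketch of a prox-SVRG-style loop in Python/NumPy, assuming an l1 regularizer so the proximal map is a simple soft-threshold; all names and constants are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal map of t * ||.||_1 (the assumed simple nonsmooth term)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_svrg(grad_i, n, x0, step, lam, epochs, m, seed=0):
    """Proximal SVRG-style loop for (1/n) sum_i f_i(x) + lam * ||x||_1."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        snap = x.copy()
        mu = np.mean([grad_i(i, snap) for i in range(n)], axis=0)  # full gradient
        for _ in range(m):
            i = rng.integers(n)
            v = grad_i(i, x) - grad_i(i, snap) + mu  # variance-reduced estimate
            x = soft_threshold(x - step * v, step * lam)  # proximal step
    return x

# Example: sparse least squares.
rng = np.random.default_rng(2)
A, b = rng.normal(size=(100, 10)), rng.normal(size=100)
x = prox_svrg(lambda i, v: A[i] * (A[i] @ v - b[i]), n=100, x0=np.zeros(10),
              step=0.01, lam=0.1, epochs=20, m=100)
print(x)  # many coordinates driven exactly to zero by the l1 prox
```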
Highly Cited
2013
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two…
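
For this composite setting (a smooth term plus a simple convex term), a standard approach is the accelerated proximal gradient method. The FISTA-style sketch below is one such method, not necessarily the exact scheme analyzed in the paper; the lasso example and constants are assumptions.

```python
import numpy as np

def fista(grad_g, prox_h, x0, step, iters):
    """Accelerated proximal gradient for min g(x) + h(x): g smooth, h simple."""
    x = np.asarray(x0, dtype=float)
    y, t = x.copy(), 1.0
    for _ in range(iters):
        x_next = prox_h(y - step * grad_g(y), step)        # forward-backward step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation
        x, t = x_next, t_next
    return x

# Example: lasso, g(x) = 0.5 * ||Ax - b||^2 and h(x) = lam * ||x||_1.
rng = np.random.default_rng(3)
A, b, lam = rng.normal(size=(40, 8)), rng.normal(size=40), 0.5
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, with L = ||A||_2^2
prox_l1 = lambda z, t: np.sign(z) * np.maximum(np.abs(z) - lam * t, 0.0)
print(fista(lambda v: A.T @ (A @ v - b), prox_l1, np.zeros(8), step, iters=300))
```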
Highly Cited
2012
We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly…
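
A sketch of a SAG-style iteration in Python/NumPy. Unlike SAGA above, the step here averages the most recently seen gradient of every component, which makes the direction biased but yields fast convergence on strongly convex finite sums; the example problem and step size are illustrative assumptions.

```python
import numpy as np

def sag(grad_i, n, x0, step, iters, seed=0):
    """SAG-style loop: step along the average of the last-seen gradient of each f_j."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    table = np.array([grad_i(j, x) for j in range(n)])  # last-seen gradient per f_j
    total = table.sum(axis=0)
    for _ in range(iters):
        j = rng.integers(n)
        g = grad_i(j, x)
        total += g - table[j]        # refresh one entry, keep the sum current
        table[j] = g
        x = x - (step / n) * total   # averaged direction: biased, but converges fast
    return x

# Example: the same finite-sum least-squares setting as above.
rng = np.random.default_rng(4)
A, b = rng.normal(size=(50, 5)), rng.normal(size=50)
x = sag(lambda j, v: A[j] * (A[j] @ v - b[j]), n=50, x0=np.zeros(5),
        step=0.05, iters=5000)
print(np.linalg.norm(A.T @ (A @ x - b)))
```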
Highly Cited
2012
Nonnegative matrix factorization (NMF) is a powerful matrix decomposition technique that approximates a nonnegative matrix by the…
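
The paper develops an optimal (Nesterov-type) gradient scheme for NMF; the sketch below shows only the simpler alternating projected-gradient updates that such schemes accelerate, with 1/L step sizes per factor. The dimensions and random data are illustrative.

```python
import numpy as np

def nmf_projected_gradient(V, k, iters=200, seed=0):
    """Alternate projected-gradient steps on W and H so that V ≈ W @ H, W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W, H = rng.random((m, k)), rng.random((k, n))
    for _ in range(iters):
        GW = (W @ H - V) @ H.T  # gradient of 0.5 * ||V - WH||_F^2 in W
        W = np.maximum(W - GW / np.linalg.norm(H @ H.T, 2), 0.0)  # 1/L step, project
        GH = W.T @ (W @ H - V)  # gradient in H
        H = np.maximum(H - GH / np.linalg.norm(W.T @ W, 2), 0.0)
    return W, H

V = np.abs(np.random.default_rng(5).normal(size=(20, 15)))
W, H = nmf_projected_gradient(V, k=4)
print(np.linalg.norm(V - W @ H))  # reconstruction error after the alternating steps
```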
Highly Cited
2009
We consider the minimization of a smooth loss function regularized by the trace norm of the matrix variable. Such formulation…
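
The key ingredient in gradient methods for trace-norm problems is that the proximal map of the trace norm soft-thresholds singular values. The sketch below uses that prox inside a basic (non-accelerated) proximal gradient loop; the matrix-recovery loss and constants are assumptions, and the paper's method is an accelerated variant of this step.

```python
import numpy as np

def svt(Z, t):
    """Proximal map of t * ||.||_* : soft-threshold the singular values of Z."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def trace_norm_prox_gradient(grad_loss, X0, step, lam, iters):
    """Basic proximal gradient loop for min loss(X) + lam * ||X||_*."""
    X = np.array(X0, dtype=float)
    for _ in range(iters):
        X = svt(X - step * grad_loss(X), step * lam)  # gradient step, then SVT
    return X

# Example: denoising toward a low-rank matrix, loss(X) = 0.5 * ||X - M||_F^2.
rng = np.random.default_rng(6)
M = rng.normal(size=(10, 3)) @ rng.normal(size=(3, 8))  # rank-3 target
X = trace_norm_prox_gradient(lambda X: X - M, np.zeros_like(M),
                             step=1.0, lam=0.5, iters=100)
print(np.linalg.matrix_rank(X, tol=1e-6))  # trace-norm prox keeps the rank low
```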
Highly Cited
2007
In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two…
Highly Cited
2001
We provide a natural gradient method that represents the steepest descent direction based on the underlying structure of the…
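
A one-step sketch of the natural-gradient idea: precondition the ordinary gradient by the inverse of a metric matrix F(x), such as a Fisher information matrix. The quadratic example, where the metric is taken to be the Hessian, is an illustrative assumption.

```python
import numpy as np

def natural_gradient_step(grad, metric, x, step):
    """One natural-gradient step: precondition the gradient by the metric F(x)."""
    return x - step * np.linalg.solve(metric(x), grad(x))

# Example: f(x) = 0.5 * x.T @ A @ x, with the metric taken to be A itself.
A = np.array([[4.0, 1.0], [1.0, 2.0]])
x = np.array([3.0, -1.0])
print(natural_gradient_step(lambda v: A @ v, lambda v: A, x, step=1.0))
# With this metric the step reaches the minimizer [0, 0] in one move.
```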
Highly Cited
2000
We consider the gradient method x_{t+1} = x_t + γ_t(s_t + w_t), where s_t is a descent direction of a function f : ℝⁿ → ℝ and w_t is a…
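
A small simulation of the iteration x_{t+1} = x_t + γ_t(s_t + w_t) in Python/NumPy, with s_t the negative gradient, w_t additive noise, and diminishing steps γ_t = 1/t; the objective and noise level are illustrative assumptions.

```python
import numpy as np

# Simulate x_{t+1} = x_t + gamma_t * (s_t + w_t) on f(x) = 0.5 * ||x||^2.
rng = np.random.default_rng(7)
x = np.array([5.0, -3.0])
for t in range(1, 5001):
    s = -x                          # descent direction s_t = -grad f(x_t)
    w = 0.1 * rng.normal(size=2)    # additive gradient error w_t
    x = x + (1.0 / t) * (s + w)     # diminishing step gamma_t = 1/t
print(x)  # close to the minimizer 0 despite the persistent noise
```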
Highly Cited
2000
Despite many decades of research into mobile robot control, reliable, high-speed motion in complicated, uncertain environments…