
Gradient descent

Known as: Descent, Gradient descent optimization, Gradient descent method 
Gradient descent is a first-order iterative optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps… 
Wikipedia
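The definition above can be sketched in a few lines of Python. The objective, step size, and iteration count below are illustrative choices, not part of the definition:

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2,
# whose gradient is f'(x) = 2 * (x - 3). Starting point, learning
# rate, and step count are illustrative.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient, starting from x0."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges toward the minimizer x = 3
```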

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited
2019
One of the mysteries in the success of neural networks is that randomly initialized first-order methods like gradient descent can… 
Highly Cited
2018
We examine gradient descent on unregularized logistic regression problems, with homogeneous linear predictors on linearly… 
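A hedged sketch of the setup this paper studies: gradient descent on an unregularized logistic loss over linearly separable data, where the weight norm grows without bound while the loss decays toward zero. The 1-D data, step size, and iteration count here are invented for illustration:

```python
import math

# Linearly separable 1-D data: (feature, label), separated by sign(x).
data = [(1.0, 1), (2.0, 1), (-1.0, -1), (-2.0, -1)]

def loss_grad(w):
    """Gradient of the mean logistic loss (1/n) * sum log(1 + exp(-y*w*x))."""
    g = 0.0
    for x, y in data:
        g += -y * x / (1.0 + math.exp(y * w * x))
    return g / len(data)

w = 0.0
for _ in range(5000):
    w -= 0.5 * loss_grad(w)
print(w)  # keeps growing (roughly like log t): no finite minimizer exists
```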
Review
2016
Gradient descent optimization algorithms, while increasingly popular, are often used as black-box optimizers, as practical… 
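The family of variants this overview surveys can be illustrated by three common update rules: plain gradient descent, momentum, and Adam. The hyperparameters below are conventional defaults, not values from the paper, and the quadratic objective is chosen only to make the updates easy to follow:

```python
# Three update rules on f(x) = x^2, whose gradient is 2x.

def sgd(x, lr=0.1, steps=200):
    for _ in range(steps):
        x -= lr * 2 * x
    return x

def momentum(x, lr=0.1, beta=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = beta * v + 2 * x                # accumulate a velocity term
        x -= lr * v
    return x

def adam(x, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, steps=200):
    m = v = 0.0
    for t in range(1, steps + 1):
        g = 2 * x
        m = b1 * m + (1 - b1) * g           # first-moment estimate
        v = b2 * v + (1 - b2) * g * g       # second-moment estimate
        m_hat = m / (1 - b1 ** t)           # bias correction
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
    return x

print(round(sgd(5.0), 6), round(momentum(5.0), 6), round(adam(5.0), 3))
```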
Highly Cited
2016
The move from hand-designed features to learned features in machine learning has been wildly successful. In spite of this… 
Highly Cited
2016
We propose a general purpose variational inference algorithm that forms a natural counterpart of gradient descent for… 
Highly Cited
2009
We consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as… 
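A classic instance of this problem class (smooth plus separable convex) is the 1-D lasso. A proximal-gradient sketch handles the smooth part with a gradient step and the separable |x| term with soft-thresholding; the coefficients below are illustrative, not from the paper:

```python
# Minimize 0.5 * (a*x - b)^2 + lam * |x| by proximal gradient (ISTA-style).

def soft_threshold(z, t):
    """Proximal operator of t * |x|."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def ista(a, b, lam, lr, steps=500):
    x = 0.0
    for _ in range(steps):
        grad_smooth = a * (a * x - b)            # gradient of the smooth term
        x = soft_threshold(x - lr * grad_smooth, lr * lam)
    return x

# With a=1, b=2, lam=0.5 the closed-form solution is x = b - lam = 1.5.
print(round(ista(1.0, 2.0, 0.5, 0.1), 4))  # -> 1.5
```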
Highly Cited
2005
We investigate using gradient descent methods for learning ranking functions; we propose a simple probabilistic cost function… 
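A hedged sketch of a pairwise probabilistic ranking cost of the kind this paper proposes: model P(item i outranks item j) as sigmoid(s_i - s_j) and run gradient descent on the cross-entropy against observed preferences. The preference data and learning rate are invented for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Observed preferences: item 0 should outrank item 1, item 1 outrank item 2.
pairs = [(0, 1), (1, 2)]
scores = [0.0, 0.0, 0.0]

for _ in range(500):
    grads = [0.0] * len(scores)
    for i, j in pairs:
        # Gradient of -log sigmoid(s_i - s_j): -(1 - p) w.r.t. s_i, +(1 - p) w.r.t. s_j.
        p = sigmoid(scores[i] - scores[j])
        grads[i] -= (1.0 - p)
        grads[j] += (1.0 - p)
    scores = [s - 0.1 * g for s, g in zip(scores, grads)]

print(scores[0] > scores[1] > scores[2])  # learned scores respect the preferences
```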
Highly Cited
2005
We study a general online convex optimization problem. We have a convex set S and an unknown sequence of cost functions… 
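A hedged sketch of online gradient descent for this setting: at each round pick a point in a convex set S, pay an adversarially chosen convex cost, then step along the observed gradient and project back into S. Here S = [0, 1], the cost sequence is invented, and the step-size schedule 1/sqrt(t) is the standard choice for sublinear regret:

```python
def project(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the interval [lo, hi]."""
    return min(max(x, lo), hi)

targets = [0.2, 0.8, 0.5, 0.3, 0.7] * 40       # cost_t(x) = (x - target_t)^2
x = 0.0
total_cost = 0.0
for t, c in enumerate(targets, start=1):
    total_cost += (x - c) ** 2                 # pay the cost at the current point
    grad = 2 * (x - c)                         # then observe its gradient
    x = project(x - grad / t ** 0.5)           # step and project back into S

# Regret vs the best fixed point in S (grid-searched here for simplicity).
best_fixed = min(sum((u / 100 - c) ** 2 for c in targets) for u in range(101))
print(total_cost - best_fixed < len(targets))  # regret grows sublinearly in T
```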
Highly Cited
1999
We provide an abstract characterization of boosting algorithms as gradient descent on cost-functionals in an inner-product… 
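A hedged sketch of this view of boosting as gradient descent in function space: each round, fit a weak learner to the negative functional gradient of the cost (for squared error, the residuals) and add a shrunken multiple of it to the ensemble. The data, the depth-1 threshold stump, and the shrinkage factor are all illustrative:

```python
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.0, 1.0, 1.0]

def fit_stump(xs, residuals):
    """Pick the threshold split minimizing squared error on the residuals."""
    best = None
    for thr in [0.5, 1.5, 2.5]:
        left = [r for x, r in zip(xs, residuals) if x < thr]
        right = [r for x, r in zip(xs, residuals) if x >= thr]
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, thr, lmean, rmean)
    _, thr, lmean, rmean = best
    return lambda x: lmean if x < thr else rmean

preds = [0.0] * len(xs)
for _ in range(50):
    residuals = [y - p for y, p in zip(ys, preds)]   # negative gradient of squared error
    stump = fit_stump(xs, residuals)
    preds = [p + 0.5 * stump(x) for p, x in zip(preds, xs)]  # shrinkage step

print([round(p, 3) for p in preds])  # approaches [0.0, 0.0, 1.0, 1.0]
```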
Highly Cited
1997
We consider two algorithms for on-line prediction based on a linear model. The algorithms are the well-known Gradient Descent (GD…
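A hedged sketch contrasting the two update rules this paper compares for on-line linear prediction under squared loss: Gradient Descent with an additive update, and Exponentiated Gradient with a multiplicative update on the probability simplex. The data stream and learning rates are invented for illustration:

```python
import math

stream = [([1.0, 0.0], 1.0), ([0.0, 1.0], 0.0)] * 50   # (features, target)

w_gd = [0.0, 0.0]
w_eg = [0.5, 0.5]                       # EG keeps its weights on the simplex
for x, y in stream:
    # Gradient Descent: additive step along d/dw (w.x - y)^2.
    err_gd = sum(wi * xi for wi, xi in zip(w_gd, x)) - y
    w_gd = [wi - 0.1 * 2 * err_gd * xi for wi, xi in zip(w_gd, x)]

    # Exponentiated Gradient: multiplicative step, then renormalize.
    err_eg = sum(wi * xi for wi, xi in zip(w_eg, x)) - y
    w_eg = [wi * math.exp(-0.5 * 2 * err_eg * xi) for wi, xi in zip(w_eg, x)]
    z = sum(w_eg)
    w_eg = [wi / z for wi in w_eg]

print([round(v, 2) for v in w_gd], [round(v, 2) for v in w_eg])
```

Both learners recover the target weighting [1, 0] on this stream; the paper's interest is in how their regret differs with the dimensionality and sparsity of the problem.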