Stochastic gradient descent

Known as: gradient descent in machine learning; SGD; AdaGrad
Stochastic gradient descent (often abbreviated SGD), also known as incremental gradient descent, is a stochastic approximation of the gradient…
— Wikipedia
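The definition above can be illustrated with a minimal sketch: instead of averaging the gradient over the whole dataset, SGD updates the parameters from one sample at a time. The function names (`sgd`, `grad`), the toy 1-D linear model, and all hyperparameter values here are illustrative assumptions, not part of the source.

```python
import random

def sgd(grad_fn, w, data, lr=0.01, epochs=50, seed=0):
    """Minimal stochastic gradient descent for a scalar parameter w.

    grad_fn(w, x, y) returns the gradient of the per-sample loss.
    """
    rng = random.Random(seed)
    for _ in range(epochs):
        rng.shuffle(data)          # visit samples in a random order each epoch
        for x, y in data:
            g = grad_fn(w, x, y)   # gradient estimated from a single sample
            w = w - lr * g         # step against that noisy gradient
    return w

# Toy example (assumed): fit y = 3x with squared loss.
# d/dw (w*x - y)^2 = 2*(w*x - y)*x
def grad(w, x, y):
    return 2.0 * (w * x - y) * x

data = [(x, 3.0 * x) for x in (0.5, 1.0, 1.5, 2.0)]
w_hat = sgd(grad, w=0.0, data=data, lr=0.05, epochs=50)
```

Because each update uses only one data point, the per-step cost is independent of the dataset size, which is why SGD scales to the massive datasets mentioned in the papers below.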

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
In recent decades, the amount of data available has grown immensely. A lot of this data may be private or sensitive. Privacy of… 
2015
Irregular algorithms such as Stochastic Gradient Descent (SGD) can benefit from the massive parallelism available on GPUs… 
2014
Stochastic variational inference (SVI) lets us scale up Bayesian computation to massive data. It uses stochastic optimization to… 
2012
We show how to optimize a Support Vector Machine and a predictor for Collaborative Filtering with Stochastic Gradient Descent on… 
Highly Cited
2003
1998
We present a stochastic clustering algorithm based on pairwise similarity of datapoints. Our method extends existing… 
1992
It has since been shown that learning vector quantization (LVQ) is a special case of a more general method, generalized…
Highly Cited
1990
The authors describe several procedures that simplify the search in stochastic coders, but do not put constraints on the…