
Stochastic gradient descent

Known as: SGD, gradient descent in machine learning, AdaGrad
Stochastic gradient descent (often abbreviated SGD), also known as incremental gradient descent, is a stochastic approximation of the gradient…
Wikipedia
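To make the definition above concrete, here is a minimal, illustrative sketch of SGD applied to a toy least-squares linear regression problem. The function name, hyperparameters, and the toy objective are assumptions chosen for illustration; they are not drawn from the definition or from any of the papers listed below.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=50, seed=0):
    """Minimal SGD sketch (illustrative, not from the source): fit weights w
    minimizing squared error, updating on one randomly drawn example at a time."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Gradient of 0.5 * (x_i . w - y_i)^2 with respect to w:
            # a noisy single-sample estimate of the full-batch gradient.
            grad = (X[i] @ w - y[i]) * X[i]
            w -= lr * grad  # step against the stochastic gradient
    return w

# Toy usage: recover known weights from noisy observations.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=200)
print(sgd_linear_regression(X, y))  # approximately [2.0, -1.0, 0.5]
```

The single-sample update is what distinguishes SGD from full-batch gradient descent: each step uses an unbiased but noisy gradient estimate, trading per-step accuracy for much cheaper iterations.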

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2018
In recent decades, the amount of data available has grown immensely. A lot of this data may be private or sensitive. Privacy of… 
2018
Understanding the generalization of deep learning has raised lots of concerns recently, where the learning algorithms play an… 
2015
Irregular algorithms such as Stochastic Gradient Descent (SGD) can benefit from the massive parallelism available on GPUs… 
2014
Stochastic gradient algorithms have been the main focus of large-scale learning problems and have led to important successes in…
2005
A stochastic projection method (SPM) is developed for quantitative propagation of uncertainty in compressible zero-Mach-number… 
Highly Cited
2001
A stochastic MIMO radio channel considering (i) polarization diversity and (ii) unbalanced branch power ratio (BPR) is being… 
1998
We present a stochastic clustering algorithm based on pairwise similarity of datapoints. Our method extends existing… 
Highly Cited
1993
The problem of image decompression is cast as an ill-posed inverse problem, and a stochastic regularization technique is used to… 
Highly Cited
1990
The authors describe several procedures that simplify the search in stochastic coders, but do not put constraints on the… 
Highly Cited
1965