Gradient descent
Known as: Descent, Gradient descent optimization, Gradient descent method
Gradient descent is a first-order iterative optimization algorithm. To find a local minimum of a function using gradient descent, one takes steps… (Wikipedia)
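The definition above can be illustrated with a minimal sketch. The objective function, starting point, step size, and iteration count below are illustrative choices, not part of the topic page:

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# All numeric choices here (start, learning rate, iterations) are assumptions
# made for illustration.

def grad(x):
    # Analytic gradient of f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

x = 0.0    # starting point
lr = 0.1   # step size (learning rate)
for _ in range(100):
    # Take a step in the direction of steepest descent (negative gradient)
    x -= lr * grad(x)

print(round(x, 4))  # approaches the minimizer x = 3
```

Each update moves `x` opposite the gradient; for this convex quadratic the iterates contract toward the minimizer geometrically, which is the behavior the convergence papers listed below analyze in far more general settings.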
Related topics (50 relations): AdaBoost, Backpropagation, Boltzmann machine, Conditional random field, …
Papers overview
Semantic Scholar uses AI to extract papers important to this topic.
Highly Cited · 2018
Convergence of Gradient Descent on Separable Data
M. S. Nacson, J. Lee, Suriya Gunasekar, N. Srebro, Daniel Soudry
International Conference on Artificial… · Corpus ID: 3692345
We provide a detailed study on the implicit bias of gradient descent when optimizing loss functions with strictly monotone tails…
Highly Cited · 2017
Globally Optimal Gradient Descent for a ConvNet with Gaussian Inputs
Alon Brutzkus, A. Globerson
International Conference on Machine Learning · Corpus ID: 13000960
Deep learning models are often successfully trained using gradient descent, despite the worst case hardness of the underlying non…
Highly Cited · 2016
Gradient Descent Converges to Minimizers
J. Lee, Max Simchowitz, Michael I. Jordan, B. Recht
arXiv.org · Corpus ID: 17905941
We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by…
Highly Cited · 2015
Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees
Yudong Chen, M. Wainwright
arXiv.org · Corpus ID: 7325349
Optimization problems with rank constraints arise in many applications, including matrix regression, structured PCA, matrix…
Highly Cited · 2014
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
Zeyuan Allen-Zhu, L. Orecchia
Information Technology Convergence and Services · Corpus ID: 3249321
First-order methods play a central role in large-scale machine learning. Even though many variations exist, each suited to a…
Highly Cited · 2011
Distributed algorithms via gradient descent for Fisher markets
Benjamin E. Birnbaum, Nikhil R. Devanur, Lin Xiao
ACM Conference on Economics and Computation · Corpus ID: 1397955
Designing distributed algorithms that converge quickly to an equilibrium is one of the foremost research goals in algorithmic…
2010
Accelerated gradient descent methods with line search
P. Stanimirović, M. Miladinovic
Numerical Algorithms · Corpus ID: 28495830
We introduced an algorithm for unconstrained optimization based on the transformation of the Newton method with the line search…
Highly Cited · 2009
Gradient descent with sparsification: an iterative algorithm for sparse recovery with restricted isometry property
R. Garg, R. Khandekar
International Conference on Machine Learning · Corpus ID: 5947825
We present an algorithm for finding an s-sparse vector x that minimizes the square-error ‖y − Φx…
Highly Cited · 2007
Boosting Algorithms as Gradient Descent in Function Space
Llew Mason, Jonathan Baxter, P. Bartlett, Marcus Frean
2007 · Corpus ID: 56563221
Much recent attention, both experimental and theoretical, has been focussed on classification algorithms which produce voted…
Highly Cited · 2004
A generalized normalized gradient descent algorithm
D. Mandic
IEEE Signal Processing Letters · Corpus ID: 11833367
A generalized normalized gradient descent (GNGD) algorithm for linear finite-impulse response (FIR) adaptive filters is…