Adam: A Method for Stochastic Optimization

@article{Kingma2014AdamAM,
  title={Adam: A Method for Stochastic Optimization},
  author={Diederik P. Kingma and Jimmy Ba},
  journal={CoRR},
  year={2014},
  volume={abs/1412.6980}
}
We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has little memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients.
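
The update rule the abstract summarizes (Algorithm 1 in the paper) is compact enough to sketch directly. Below is a minimal NumPy version; the function name adam_step and the toy quadratic objective are illustrative choices, while the defaults (lr = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-8) follow the hyperparameter settings suggested in the paper.

import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and its elementwise square.
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment (uncentered variance) estimate
    # Correct the bias toward zero caused by initializing m and v at zero.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Parameter update; the effective per-step size is roughly bounded by lr.
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Usage sketch: minimize a toy quadratic with minimum at (3, -1).
theta = np.zeros(2)
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 1001):                        # timestep t is 1-based
    grad = 2.0 * (theta - np.array([3.0, -1.0]))
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)

Because the update divides by the square root of the second-moment estimate, rescaling a gradient coordinate by a constant cancels out (up to eps), which is the diagonal-rescaling invariance the abstract claims.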
Highly Influential
This paper has highly influenced 2,566 other papers.
Highly Cited
This paper has 20,935 citations.

Citations

Publications citing this paper.
Showing 3 of 12,611 extracted citations

A Comparison of Evolutionary Algorithms and Gradient-based Methods for the Optimal Control Problem

2018 5th International Conference on Control, Decision and Information Technologies (CoDIT), 2018
Highly Influenced

A Deep Convolutional Neural Network Based Classification Of Multi-Class Motor Imagery With Improved Generalization

2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2018
Highly Influenced

A Generic Approach for Escaping Saddle points

Highly Influenced

20,935 Citations

[Chart: Citations per Year, 2014–2018; y-axis 0–10,000 citations]
Semantic Scholar estimates that this publication has 20,935 citations based on the available data.


References

Publications referenced by this paper.
Showing 1 of 5 references

Adam: a Method for Stochastic Optimization
Diederik P. Kingma and Jimmy Lei Ba. International Conference on Learning Representations, 2015.
