Escaping From Saddle Points - Online Stochastic Gradient for Tensor Decomposition

@inproceedings{Ge2015EscapingFS,
  title={Escaping From Saddle Points - Online Stochastic Gradient for Tensor Decomposition},
  author={Rong Ge and Furong Huang and Chi Jin and Yang Yuan},
  booktitle={COLT},
  year={2015}
}
We analyze stochastic gradient descent for optimizing non-convex functions. In many cases of non-convex optimization the goal is to find a reasonable local minimum, and the main concern is that gradient updates get trapped at saddle points. In this paper we identify the strict saddle property for non-convex problems, which allows for efficient optimization. Using this property we show that, from an arbitrary starting point, stochastic gradient descent converges to a local minimum in a polynomial number…
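The escape mechanism described in the abstract can be illustrated with a minimal sketch (an illustrative toy, not the paper's actual algorithm or analysis): the function f(x, y) = (x² − 1)² + y² has a strict saddle at the origin, where plain gradient descent started at the saddle would stall, while gradient noise pushes the iterate along the negative-curvature direction toward a local minimum at (±1, 0). The specific function, step size, and noise scale below are assumptions chosen for illustration.

```python
import random

def grad(x, y):
    # f(x, y) = (x**2 - 1)**2 + y**2 has a strict saddle at (0, 0):
    # the Hessian there is diag(-4, 2), so x is a strictly
    # negative-curvature direction. Local minima sit at (+/-1, 0).
    return 4 * x * (x ** 2 - 1), 2 * y

def noisy_sgd(steps=2000, eta=0.01, sigma=0.1, seed=0):
    rng = random.Random(seed)
    x, y = 0.0, 0.0  # start exactly at the saddle point
    for _ in range(steps):
        gx, gy = grad(x, y)
        # Gaussian perturbations stand in for stochastic mini-batch noise.
        x -= eta * (gx + rng.gauss(0.0, sigma))
        y -= eta * (gy + rng.gauss(0.0, sigma))
    return x, y

x, y = noisy_sgd()
# Plain gradient descent started at (0, 0) never moves (the gradient
# is exactly zero there); with noise, the iterate drifts off the saddle
# and settles near one of the local minima (+/-1, 0).
```

Because the negative curvature at the saddle amplifies any perturbation geometrically, even small gradient noise suffices to escape in a modest number of steps, which is the intuition behind the paper's polynomial-time guarantee.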
This paper has highly influenced 40 other papers, has 376 citations, and has been referenced on Twitter 18 times over the past 90 days.

Citations

Publications citing this paper: 254 extracted citations.

[Figure: Citations per Year, 2014–2018]
Semantic Scholar estimates that this publication has 377 citations based on the available data.


