Online Learning, Stability, and Stochastic Gradient Descent

Tomaso A. Poggio, Stephen Voinea, Lorenzo Rosasco
In batch learning, stability, together with existence and uniqueness of the solution, corresponds to well-posedness of Empirical Risk Minimization (ERM) methods; recently, it was proved that CVloo stability is necessary and sufficient for generalization and consistency of ERM [9]. In this note, we introduce CVon stability, which plays a similar role in online learning. We show that stochastic gradient descent (SGD) under the usual hypotheses is CVon stable, and we then discuss the implications…
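For concreteness, the SGD procedure the abstract refers to can be sketched as below: a minimal least-squares example in NumPy, where the decaying step size stands in for the "usual hypotheses"; the specific loss, schedule, and function `sgd_least_squares` are illustrative assumptions, not the exact setting analyzed in the paper.

```python
import numpy as np

def sgd_least_squares(X, y, n_epochs=50, lr0=0.1, seed=0):
    """Minimal SGD sketch: one example per step on the squared loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            t += 1
            lr = lr0 / np.sqrt(t)            # decaying step size
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5*(x.w - y)^2
            w -= lr * grad
    return w

# Usage: recover a known linear model from noise-free data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = sgd_least_squares(X, y)
```

On this noise-free problem the iterates approach `w_true`; the stability question studied in the note concerns how much `w_hat` changes when a single training example is replaced.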