Corpus ID: 214612494

A new regret analysis for Adam-type algorithms

@article{Alacaoglu2020ANR,
  title={A new regret analysis for Adam-type algorithms},
  author={Ahmet Alacaoglu and Yura Malitsky and Panayotis Mertikopoulos and Volkan Cevher},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.09729}
}
  • Ahmet Alacaoglu, Yura Malitsky, Panayotis Mertikopoulos, Volkan Cevher
  • Published 2020
  • Mathematics, Computer Science
  • ArXiv
  • In this paper, we focus on a theory-practice gap for Adam and its variants (AMSgrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter $\beta_{1}$ (typically between $0.9$ and $0.99$). In theory, regret guarantees for online convex optimization require a rapidly decaying $\beta_{1}\to0$ schedule. We show that this is an artifact of the standard analysis and propose a novel framework that allows us to derive optimal, data-dependent regret bounds…
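For context, the update these results concern is the standard Adam iteration of Kingma & Ba. The sketch below is a minimal NumPy rendering of that textbook update, not the paper's new analysis; the hyperparameter defaults (lr, beta1, beta2, eps) are the usual illustrative choices. The theory-practice gap sits in beta1: the code keeps it constant across steps, as practitioners do, whereas the earlier regret analyses mentioned in the abstract require substituting a decaying schedule $\beta_{1,t}\to0$.

    import numpy as np

    def adam_step(x, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        # Constant beta1, as used in practice (typically 0.9-0.99).
        # Older analyses would instead substitute a decaying schedule,
        # e.g. beta1_t = beta1 / t, driving the momentum weight to 0.
        m = beta1 * m + (1 - beta1) * grad        # first-moment (momentum) estimate
        v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
        m_hat = m / (1 - beta1 ** t)              # bias corrections (t starts at 1)
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
        return x, m, v

AMSgrad, one of the variants named in the abstract, differs only in replacing v_hat with a running elementwise maximum of the second-moment estimates, so the same constant-beta1 question applies to it.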

    References

    Publications referenced by this paper (showing 1-10 of 18 references):

    • SAdam: A Variant of Adam for Strongly Convex Functions (highly influential)
    • On the Convergence of Adam and Beyond (highly influential)
    • Adam: A Method for Stochastic Optimization (highly influential)
    • A Sufficient Condition for Convergences of Adam and RMSProp
    • On the Convergence of AdaBound and its Connection to SGD
    • ZO-AdaMM: Zeroth-Order Adaptive Momentum Method for Black-Box Optimization (highly influential)
    • Logarithmic regret algorithms for online convex optimization (highly influential)