Nonconvex Variance Reduced Optimization with Arbitrary Sampling

@inproceedings{Horvath2018NonconvexVR,
  title={Nonconvex Variance Reduced Optimization with Arbitrary Sampling},
  author={Samuel Horv{\'a}th and Peter Richt{\'a}rik},
  year={2018}
}
We provide the first importance sampling variants of variance-reduced algorithms for empirical risk minimization with non-convex loss functions. In particular, we analyze non-convex versions of SVRG, SAGA and SARAH. Our methods have the capacity to speed up the training process by an order of magnitude compared to the state of the art on real datasets. Moreover, we also improve upon the current mini-batch analysis of these methods by proposing importance sampling for mini-batches in this setting…
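The core idea the abstract describes, combining a variance-reduced gradient estimator with importance sampling, can be illustrated with a minimal sketch. This is not the paper's exact method or analysis: it is an SVRG-style loop on a toy least-squares problem, where component i is drawn with probability proportional to an assumed per-component smoothness constant L_i and the estimator is reweighted by 1/(n·p_i) to remain unbiased. All function and variable names here are illustrative.

```python
import numpy as np

def svrg_importance(A, b, step=0.02, outer=15, inner=100, seed=0):
    """SVRG-style loop with importance sampling (illustrative sketch only).

    Minimizes f(x) = (1/n) * sum_i (a_i @ x - b_i)^2. Component i is sampled
    with probability p_i proportional to L_i = 2*||a_i||^2, and the
    variance-reduced gradient is reweighted by 1/(n*p_i) to stay unbiased.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    L = 2.0 * np.sum(A * A, axis=1)   # per-component smoothness constants
    p = L / L.sum()                    # importance-sampling distribution
    x = np.zeros(d)
    for _ in range(outer):
        x_ref = x.copy()
        full_grad = 2.0 * A.T @ (A @ x_ref - b) / n  # full gradient at snapshot
        for _ in range(inner):
            i = rng.choice(n, p=p)
            gi = 2.0 * A[i] * (A[i] @ x - b[i])          # grad of component i
            gi_ref = 2.0 * A[i] * (A[i] @ x_ref - b[i])  # same, at snapshot
            # Unbiased variance-reduced estimator with importance weight
            g = (gi - gi_ref) / (n * p[i]) + full_grad
            x = x - step * g
    return x

def loss(A, b, x):
    r = A @ x - b
    return float(r @ r) / len(b)
```

Sampling proportionally to L_i (rather than uniformly) is what lets the complexity depend on the average smoothness instead of the maximum, which is the source of the speed-ups the abstract claims.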
