Automated Inference with Adaptive Batches

@inproceedings{De2017AutomatedIW,
  title={Automated Inference with Adaptive Batches},
  author={Soham De and Abhay Kumar Yadav and David W. Jacobs and Tom Goldstein},
  booktitle={AISTATS},
  year={2017}
}
Classical stochastic gradient methods for optimization rely on noisy gradient approximations that become progressively less accurate as iterates approach a solution. The large noise and small signal in the resulting gradients make it difficult to use them for adaptive stepsize selection and automatic stopping. We propose alternative “big batch” SGD schemes that adaptively grow the batch size over time to maintain a nearly constant signal-to-noise ratio in the gradient approximation. The…
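The growth rule described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the batch grows whenever the estimated gradient variance (the noise) exceeds the squared norm of the mean gradient (the signal), keeping the signal-to-noise ratio roughly constant. The function name, `theta` threshold, and `growth` factor are illustrative choices.

```python
import numpy as np

def big_batch_sgd(grad_fn, x0, n_samples, lr=0.1, batch0=8,
                  theta=1.0, growth=2.0, max_iters=200):
    """Sketch of "big batch" SGD: grow the batch whenever the sample
    variance of the gradient dominates the squared norm of the mean
    gradient, i.e. when the signal-to-noise ratio drops too low."""
    x = x0.astype(float)
    batch = batch0
    rng = np.random.default_rng(0)
    for _ in range(max_iters):
        idx = rng.choice(n_samples, size=min(batch, n_samples), replace=False)
        grads = np.stack([grad_fn(x, i) for i in idx])  # per-sample gradients
        g = grads.mean(axis=0)                          # mean gradient (signal)
        var = grads.var(axis=0, ddof=1).sum()           # per-sample variance
        # Grow the batch if the noise of the mean estimate, var / |B|,
        # exceeds theta times the signal ||g||^2.
        if var / len(idx) > theta * np.dot(g, g) and batch < n_samples:
            batch = int(min(growth * batch, n_samples))
        x -= lr * g
    return x, batch
```

Because the batch only grows when the gradient signal has shrunk to the noise level, early iterations stay cheap while late iterations approach full-gradient accuracy, which is what makes the gradient reliable enough for automatic stepsize and stopping rules.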
