Corpus ID: 225040271

In Search of Robust Measures of Generalization

@article{Dziugaite2020InSO,
  title={In Search of Robust Measures of Generalization},
  author={G. Dziugaite and Alexandre Drouin and Brady Neal and Nitarshan Rajkumar and Ethan Caballero and Linbo Wang and Ioannis Mitliagkas and Daniel M. Roy},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.11924}
}
One of the principal scientific challenges in deep learning is explaining generalization, i.e., why the particular way the community now trains networks to achieve small training error also leads to small error on held-out data from the same population. It is widely appreciated that some worst-case theories -- such as those based on the VC dimension of the class of predictors induced by modern neural network architectures -- are unable to explain empirical performance. A large volume of work…
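For context on the claim about worst-case theories (this is background, not part of the paper's abstract), one standard form of a VC-dimension bound states that, with probability at least 1 - \delta over an i.i.d. sample of size n, every hypothesis h in a class \mathcal{H} of VC dimension d satisfies

\[
R(h) \;\le\; \hat{R}_n(h) \;+\; \sqrt{\frac{d\left(\ln\frac{2n}{d} + 1\right) + \ln\frac{4}{\delta}}{n}},
\]

where R(h) is the population risk and \hat{R}_n(h) the empirical risk. When d is on the order of the parameter count of a modern over-parameterized network and exceeds n, the square-root term exceeds 1 and the bound is vacuous, which is the sense in which such worst-case bounds fail to explain observed generalization.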

