# Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation

```bibtex
@article{Belkin2021FitWF,
  title   = {Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation},
  author  = {Mikhail Belkin},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2105.14368}
}
```

In the past decade, the mathematical theory of machine learning has lagged far behind the triumphs of deep neural networks on practical challenges. However, the gap between theory and practice is gradually starting to close. In this paper I will attempt to assemble some pieces of the remarkable and still incomplete mathematical mosaic emerging from the efforts to understand the foundations of deep learning. The two key themes will be interpolation, and its sibling, over-parameterization…
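The interpolation phenomenon the abstract refers to can be seen in a few lines of linear algebra: when a model has more parameters than training points, the minimum-norm least-squares solution fits the noisy training data exactly. The sketch below is a minimal illustration with NumPy; the random ReLU features and all numerical choices are illustrative assumptions, not the paper's specific construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples of a 1-D nonlinear target, with label noise.
n = 20
x = rng.uniform(-1, 1, size=n)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=n)

def random_feature_design(x, num_features, rng):
    """Random ReLU features: phi_j(x) = max(0, w_j * x + b_j)."""
    w = rng.normal(size=num_features)
    b = rng.normal(size=num_features)
    return np.maximum(0.0, np.outer(x, w) + b)

# Over-parameterized regime: many more features than samples.
num_features = 200
Phi = random_feature_design(x, num_features, rng)

# Minimum-norm least-squares solution via the pseudoinverse; with
# num_features > n the linear system is underdetermined, so the fit
# interpolates: training residuals are (numerically) zero despite the noise.
theta = np.linalg.pinv(Phi) @ y
train_error = np.mean((Phi @ theta - y) ** 2)
print(f"training MSE: {train_error:.2e}")
```

Whether such interpolating solutions also generalize well, and how test risk behaves as the feature count crosses the interpolation threshold (the "double descent" curve), is exactly the kind of question the paper surveys.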

