Risk-Monotonicity in Statistical Learning
@inproceedings{Mhammedi2020RiskMonotonicityIS,
  title  = {Risk-Monotonicity in Statistical Learning},
  author = {Zakaria Mhammedi and H. Husain},
  year   = {2020}
}
Acquisition of data is a difficult task in many applications of machine learning, and it is only natural that one hopes and expects the population risk to decrease (better performance) monotonically as more data points become available. It turns out, somewhat surprisingly, that this is not the case even for the most standard algorithms, such as empirical risk minimization. Non-monotonic behaviour of the risk and instability in training have also appeared in the popular deep learning paradigm…
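The non-monotonicity the abstract alludes to is easy to reproduce numerically. Below is a minimal sketch, not taken from the paper: it assumes a linear model with Gaussian data, an illustrative dimension d = 20 and noise level, and uses minimum-norm least-squares as the empirical risk minimizer, whose test risk is known to peak near n = d rather than decrease monotonically with the sample size.

```python
# Minimal sketch (illustrative assumptions, not the paper's construction):
# minimum-norm least-squares ERM on a d-dimensional linear model. The
# averaged test risk spikes near n = d instead of decreasing with n.
import numpy as np

rng = np.random.default_rng(0)
d, sigma = 20, 0.5                        # feature dimension, label noise
w_star = rng.normal(size=d) / np.sqrt(d)  # ground-truth weights

def erm_risk(n, trials=200, n_test=1000):
    """Average squared-error test risk of the min-norm ERM fit on n samples."""
    risks = []
    for _ in range(trials):
        X = rng.normal(size=(n, d))
        y = X @ w_star + sigma * rng.normal(size=n)
        w_hat = np.linalg.pinv(X) @ y     # minimum-norm empirical risk minimizer
        X_te = rng.normal(size=(n_test, d))
        y_te = X_te @ w_star + sigma * rng.normal(size=n_test)
        risks.append(np.mean((X_te @ w_hat - y_te) ** 2))
    return np.mean(risks)

for n in [5, 10, 15, 20, 25, 40, 80]:
    print(f"n={n:3d}  risk={erm_risk(n):.3f}")  # expect a peak around n = d = 20
```

Running this prints a risk curve that rises as n approaches d and only then falls, a simple concrete instance of the risk failing to be monotone in the number of training points.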