Corpus ID: 229376930

Risk-Monotonicity in Statistical Learning

@inproceedings{Mhammedi2020RiskMonotonicityIS,
  title={Risk-Monotonicity in Statistical Learning},
  author={Zakaria Mhammedi and H. Husain},
  year={2020}
}
Acquisition of data is a difficult task in many applications of machine learning, and it is only natural that one hopes and expects the population risk to decrease (better performance) monotonically with more data points. It turns out, somewhat surprisingly, that this is not the case even for the most standard algorithms, such as empirical risk minimization. Nonmonotonic behaviour of the risk and instability in training have been observed in the popular deep learning paradigm…
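The nonmonotonicity the abstract describes can be reproduced in a toy setting. The sketch below (an illustration, not the paper's construction; the dimensions, noise level, and trial counts are arbitrary choices) trains min-norm least squares, a standard ERM procedure, on increasing sample sizes and estimates the population risk on a large held-out set. Near the interpolation threshold `n == d`, the risk typically spikes, so adding data points can temporarily hurt.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 30                                   # feature dimension (hypothetical toy choice)
w_true = rng.normal(size=d) / np.sqrt(d)
X_test = rng.normal(size=(10_000, d))
y_test = X_test @ w_true                 # noiseless labels for risk estimation

def population_risk(n, sigma=0.5, trials=20):
    """Average test MSE of the min-norm ERM solution trained on n noisy points."""
    risks = []
    for _ in range(trials):
        X = rng.normal(size=(n, d))
        y = X @ w_true + sigma * rng.normal(size=n)
        w_hat = np.linalg.pinv(X) @ y    # min-norm least-squares fit (ERM)
        risks.append(np.mean((X_test @ w_hat - y_test) ** 2))
    return float(np.mean(risks))

# Estimated population risk at several sample sizes; note the peak near n == d,
# where the design matrix is square and typically ill-conditioned.
risks = {n: population_risk(n) for n in (10, 25, 30, 60)}
```

Plotting `risks` against `n` gives the familiar double-descent-style curve: the risk is not monotone in the sample size, which is precisely the behaviour the paper studies.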
Citations (1)
The Shape of Learning Curves: a Review
