Corpus ID: 232076092

Asymptotic Risk of Overparameterized Likelihood Models: Double Descent Theory for Deep Neural Networks

@article{Nakada2021AsymptoticRO,
  title={Asymptotic Risk of Overparameterized Likelihood Models: Double Descent Theory for Deep Neural Networks},
  author={Ryumei Nakada and M. Imaizumi},
  journal={arXiv preprint arXiv:2103.00500},
  year={2021}
}

We investigate the asymptotic risk of a general class of overparameterized likelihood models, including deep models. The recent empirical success of large-scale models has motivated several theoretical studies of a scenario in which both the number of samples, n, and the number of parameters, p, diverge to infinity, deriving an asymptotic risk in this limit. However, these results are valid only for linear-in-feature models, such as generalized linear regression, kernel regression, and shallow…
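The double descent shape referred to in the title can be reproduced with a minimal sketch. This is our own illustrative setup, not the paper's likelihood framework: minimum-norm least squares on p random tanh features, where test risk typically peaks near the interpolation threshold p ≈ n and falls again as p grows well beyond n.

```python
import numpy as np

def min_norm_test_risk(n, p, n_test=2000, seed=0):
    """Test risk of the minimum-norm least-squares fit with p random features.

    Illustrative only: a random-feature regression model chosen to exhibit
    double descent, not the overparameterized likelihood models of the paper.
    """
    rng = np.random.default_rng(seed)
    d = 40                                      # ambient input dimension
    w = rng.normal(size=d) / np.sqrt(d)         # true linear signal

    # Fixed random feature map x -> tanh(W^T x)
    W = rng.normal(size=(d, p)) / np.sqrt(d)

    X = rng.normal(size=(n, d))
    y = X @ w + 0.5 * rng.normal(size=n)        # noisy training labels
    X_test = rng.normal(size=(n_test, d))
    y_test = X_test @ w                          # noiseless test targets

    F = np.tanh(X @ W)
    F_test = np.tanh(X_test @ W)

    # pinv gives the minimum-norm interpolator once p > n
    beta = np.linalg.pinv(F) @ y
    return float(np.mean((F_test @ beta - y_test) ** 2))

n = 50
risks = {p: min_norm_test_risk(n, p) for p in (10, 50, 400)}
```

With n = 50, the risk at p = 50 (the interpolation threshold) dominates both the underparameterized (p = 10) and heavily overparameterized (p = 400) regimes, tracing the two descents.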