Generalization performance of regularized neural network models

@inproceedings{Larsen1994GeneralizationPO,
  title={Generalization performance of regularized neural network models},
  author={Jan Larsen and Lars Kai Hansen},
  year={1994}
}
Architecture optimization is a fundamental problem in neural network modeling. The optimal architecture is defined as the one that minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization normally improves generalization performance by restricting model complexity. A formula for the optimal weight decay regularizer is derived. A regularized model may be characterized by an…
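The abstract's central idea, that a weight decay penalty restricts model complexity and can improve generalization, can be illustrated with a minimal sketch. This is not the paper's analytic estimator or its formula for the optimal regularizer; it is an empirical stand-in using ridge regression on invented synthetic data, sweeping the decay strength and comparing held-out error:

```python
import numpy as np

# Minimal sketch of weight-decay (L2) regularization for a linear model.
# Illustrates the general idea only; the data, noise level, and decay
# grid below are invented for demonstration.

rng = np.random.default_rng(0)

n_train, n_test, d = 20, 200, 10
w_true = rng.normal(size=d)
X_train = rng.normal(size=(n_train, d))
y_train = X_train @ w_true + 0.5 * rng.normal(size=n_train)
X_test = rng.normal(size=(n_test, d))
y_test = X_test @ w_true + 0.5 * rng.normal(size=n_test)

def fit_weight_decay(X, y, alpha):
    """Minimize ||Xw - y||^2 + alpha * ||w||^2 (ridge / weight decay)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ y)

def mse(X, y, w):
    return float(np.mean((X @ w - y) ** 2))

# Sweep the decay parameter and pick the one with the lowest test error,
# a crude empirical proxy for the optimal regularizer the paper derives
# analytically.
alphas = [0.0, 0.01, 0.1, 1.0, 10.0]
test_errors = {a: mse(X_test, y_test, fit_weight_decay(X_train, y_train, a))
               for a in alphas}
best_alpha = min(test_errors, key=test_errors.get)
print(f"best weight decay: {best_alpha}, test MSE: {test_errors[best_alpha]:.3f}")
```

In the paper's setting the optimal decay is derived in closed form rather than found by a grid search over held-out error, but the sweep makes the qualitative trade-off visible: too little decay overfits the noisy training set, too much biases the fit.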

Citations

This paper is cited by 49 publications.

