Overtraining, Regularization, and Searching for Minimum in Neural Networks

  title={Overtraining, Regularization, and Searching for Minimum in Neural Networks},
  author={Sjöberg and Lennart Ljung},
Neural network models for dynamical systems have been the subject of considerable interest lately. They are often characterized by the fact that they use a fairly large number of parameters. Here we address the question of why this can be done without the usual penalty in terms of a large variance error. We show that regularization is a key explanation, and that terminating a gradient search ("backpropagation") before the true criterion minimum is found is a way of achieving regularization. This, among …
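The abstract's central claim, that stopping a gradient search before the training criterion is minimized acts as a form of regularization, can be illustrated with a minimal sketch. This is a hypothetical setup, not the paper's experiment: an overparameterized linear least-squares model fit by plain gradient descent in numpy, stopped early when a held-out validation error stops improving.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overparameterized setup: 20 parameters, 30 noisy samples,
# but only a few parameter directions actually matter.
n_train, n_val, n_params = 30, 30, 20
X_train = rng.normal(size=(n_train, n_params))
X_val = rng.normal(size=(n_val, n_params))
true_theta = np.zeros(n_params)
true_theta[:3] = [1.0, -2.0, 0.5]
y_train = X_train @ true_theta + 0.5 * rng.normal(size=n_train)
y_val = X_val @ true_theta + 0.5 * rng.normal(size=n_val)

def mse(X, y, theta):
    return np.mean((X @ theta - y) ** 2)

# Gradient descent on the training criterion; terminate early when the
# validation error has not improved for `patience_limit` steps.
theta = np.zeros(n_params)
lr = 0.01
best_val, best_theta, patience, patience_limit = np.inf, theta.copy(), 0, 50
for step in range(5000):
    grad = 2 * X_train.T @ (X_train @ theta - y_train) / n_train
    theta -= lr * grad
    val = mse(X_val, y_val, theta)
    if val < best_val - 1e-6:
        best_val, best_theta, patience = val, theta.copy(), 0
    else:
        patience += 1
        if patience >= patience_limit:  # stop before the training minimum
            break

print("validation MSE at early stop:", round(best_val, 3))
print("validation MSE at final iterate:", round(mse(X_val, y_val, theta), 3))
```

The early-stopped parameters `best_theta` are kept instead of the final iterate; the number of iterations before stopping plays a role analogous to the regularization strength in an explicit penalty.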



Semantic Scholar estimates that this publication has 74 citations based on the available data.
