Discovering Neural Nets with Low Kolmogorov Complexity and High Generalization Capability

@inproceedings{SchmidhuberIDSIA1997DiscoveringNN,
  title={Discovering Neural Nets with Low Kolmogorov Complexity and High Generalization Capability},
  author={J{\"u}rgen Schmidhuber},
  year={1997}
}
  • Jürgen Schmidhuber (IDSIA, Corso Elvezia, Lugano)
  • Published 1997
Many neural net learning algorithms aim at finding "simple" nets to explain training data. The expectation is: the "simpler" the networks, the better the generalization on test data (→ Occam's razor). Previous implementations, however, use measures for "simplicity" that lack the power, universality and elegance of those based on Kolmogorov complexity and Solomonoff's algorithmic probability. Likewise, most previous approaches (especially those of the "Bayesian" kind) suffer from the problem of…
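As a brief aside (not part of the original page), the quantities the abstract invokes have standard definitions; in the usual notation, with $U$ a universal (prefix) Turing machine, $p$ ranging over programs, $\ell(p)$ the length of $p$ in bits, and $t(p)$ its runtime:

\[
K(x) \;=\; \min_{p}\bigl\{\ell(p) : U(p) = x\bigr\}, \qquad
P_U(x) \;=\; \sum_{p\,:\,U(p)=x} 2^{-\ell(p)}, \qquad
Kt(x) \;=\; \min_{p}\bigl\{\ell(p) + \log_2 t(p) : U(p) = x\bigr\}.
\]

Under the algorithmic prior $P_U$, weight matrices computable by short programs receive exponentially more probability mass than incompressible ones; this is the sense in which preferring nets of low (time-bounded) Kolmogorov complexity formalizes Occam's razor, and Levin's $Kt$ measure underlies the kind of universal search the paper builds on.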
Highly Influential: this paper has highly influenced 10 other papers.
Highly Cited: this paper has 112 citations.

Citations

Publications citing this paper.

113 Citations

Citations per Year (chart, 1996–2013): Semantic Scholar estimates that this publication has 113 citations based on the available data.


References

Publications referenced by this paper.
Showing a subset of the paper's 33 references

A general method for incremental self-improvement and multi-agent learning in unrestricted environments

  • J. Schmidhuber, J. Zhao, M. Wiering
  • 1996

Incremental self-improvement for life-time multi-agent reinforcement learning

  • P. Maes, M. Mataric, +4 authors L. A. Levin
  • 1996

Simple principles of metalearning

  • J. Schmidhuber, J. Zhao, M. Wiering (IDSIA)
  • 1996

Overfitting avoidance as bias

  • C. Schaffer
  • Machine Learning
  • 1993

Kolmogorov complexity and computational complexity

  • O. Watanabe
  • EATCS Monographs on Theoretical Computer Science
  • 1992
