Improving generalization by using genetic algorithms to determine the neural network size

@inproceedings{Bebis1995ImprovingGB,
  title={Improving generalization by using genetic algorithms to determine the neural network size},
  author={George Bebis and Michael Georgiopoulos},
  booktitle={Proceedings of Southcon '95},
  year={1995},
  pages={392--397}
}
Recent theoretical results support that decreasing the number of free parameters in a neural network (i.e., weights) can improve generalization. The importance of these results has triggered the development of many approaches which try to determine an "appropriate" network size for a given problem. Although it has been demonstrated that most of the approaches manage to find small size networks which solve the problem at hand, it is quite remarkable that the generalization capabilities of these…
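The abstract describes searching over network sizes with a genetic algorithm so that the chosen architecture has few free parameters yet still solves the task. As a minimal illustrative sketch (not the authors' exact algorithm), the following evolves a single integer genome, the hidden-layer size, under a *toy* fitness function; in the paper's setting the fitness would instead come from training and validating a real network. The target size of 5, the 0.1 complexity penalty, and the averaging crossover are all assumptions made purely for illustration.

```python
import random

def toy_fitness(hidden_units, target=5):
    # Hypothetical stand-in for validation performance: penalize both a
    # poor fit (distance from an assumed "adequate" size) and excess free
    # parameters, mirroring the size/generalization trade-off.
    error = abs(hidden_units - target)   # proxy for validation error
    complexity = 0.1 * hidden_units      # proxy for number of weights
    return -(error + complexity)         # higher is better

def evolve(pop_size=20, generations=30, max_units=32, seed=0):
    rng = random.Random(seed)
    pop = [rng.randint(1, max_units) for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # Binary tournament selection.
            a, b = rng.sample(pop, 2)
            return a if toy_fitness(a) >= toy_fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            child = (p1 + p2) // 2                   # crossover: average the sizes
            if rng.random() < 0.2:                   # mutation: shift by one unit
                child = min(max_units, max(1, child + rng.choice([-1, 1])))
            nxt.append(child)
        pop = nxt
    return max(pop, key=toy_fitness)

best = evolve()
print(best)
```

The complexity term is what pushes the search toward smaller networks: of two sizes with equal error, the one with fewer units scores higher, which is the intuition behind size-aware architecture search that the abstract summarizes.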
