Pruning Using Parameter and Neuronal Metrics

Piërre van de Laar and Tom Heskes · Neural Computation
In this article, we introduce a measure of optimality for architecture selection algorithms for neural networks: the distance from the original network to the new network in a metric defined by the probability distributions of all possible networks. We derive two pruning algorithms, one based on a metric in parameter space and the other based on a metric in neuron space, which are closely related to well-known architecture selection algorithms, such as GOBS. Our framework extends the…
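The abstract relates the derived pruning algorithms to Optimal Brain Surgeon-style methods such as GOBS. As background only (this is not the paper's own algorithm), here is a minimal sketch of the standard OBS step: rank weights by saliency L_q = w_q² / (2 [H⁻¹]_qq), delete the least salient one, and compensate the survivors. The toy weight vector and inverse Hessian below are illustrative assumptions, not data from the paper.

```python
import numpy as np

def obs_prune_one(w, H_inv):
    """Remove the single weight with lowest OBS saliency.

    Saliency of weight q:  L_q = w[q]**2 / (2 * H_inv[q, q]).
    After deleting weight q, the remaining weights are adjusted by
    delta_w = -(w[q] / H_inv[q, q]) * H_inv[:, q], the second-order
    correction that minimizes the increase in training error.
    """
    saliency = w ** 2 / (2.0 * np.diag(H_inv))
    q = int(np.argmin(saliency))                  # least important weight
    delta = -(w[q] / H_inv[q, q]) * H_inv[:, q]   # compensating update
    w_new = w + delta
    w_new[q] = 0.0                                # exact zero for pruned weight
    return q, w_new

# Toy example: 3 weights, illustrative inverse Hessian.
w = np.array([0.5, -2.0, 1.5])
H_inv = np.array([[1.0, 0.1, 0.0],
                  [0.1, 0.5, 0.0],
                  [0.0, 0.0, 0.2]])
q, w_new = obs_prune_one(w, H_inv)
print(q, w_new)  # prunes the small weight w[0]; prints 0 [ 0.   -2.05  1.5 ]
```

The paper's contribution, per the abstract, is to recast such selections as minimizing a distance between network probability distributions, in either parameter space or neuron space.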

Publications citing this paper (2 of 15 extracted citations shown):

Pruning hidden Markov models with optimal brain surgeon. IEEE Transactions on Speech and Audio Processing, 2005. (Highly influenced)

Ranking the parameters of deep neural networks using the Fisher information. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016.


Publications referenced by this paper (3 of 27 references shown):

The MONK's problems: A performance comparison of different learning algorithms. S. B. Thrun, J. Bala, +21 authors, J. Zhang. (Highly influenced)

Specification and assessment of methods supporting the development of neural networks in medicine. M. Egmont-Petersen.

Structural learning with forgetting. Neural Networks, 1996.
