Reducing Duplicate Filters in Deep Neural Networks
@inproceedings{RoyChowdhury2018ReducingDF,
  title  = {Reducing Duplicate Filters in Deep Neural Networks},
  author = {Aruni RoyChowdhury and Prakhar Sharma and E. Learned-Miller},
  year   = {2018}
}
This paper investigates the presence of duplicate neurons and filters in neural networks. We find that such duplication is prevalent and increases with the number of filters in a layer. We track the emergence of duplicate filters over training iterations, study the factors that affect their concentration, and compare existing network-reduction operations. We validate our findings using convolutional and fully connected networks on the CIFAR-10 dataset.
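A common way to quantify this kind of duplication is to measure pairwise cosine similarity between flattened filter weights and count pairs whose similarity exceeds a threshold. The sketch below illustrates that idea; the `0.95` threshold and the `count_duplicate_filters` helper are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def count_duplicate_filters(weights, threshold=0.95):
    """Count near-duplicate filter pairs in a layer.

    weights: array of shape (num_filters, ...), one filter per row
             after flattening (e.g. a conv layer's kernel tensor).
    threshold: absolute cosine similarity above which two filters
               are treated as duplicates (illustrative choice).
    """
    # Flatten each filter to a vector and normalize to unit length.
    flat = weights.reshape(weights.shape[0], -1).astype(float)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)

    # Pairwise cosine similarities; keep only the upper triangle
    # so each unordered pair is counted once.
    sim = unit @ unit.T
    i, j = np.triu_indices(sim.shape[0], k=1)
    return int(np.sum(np.abs(sim[i, j]) > threshold))

# Toy example: filters 0 and 1 point in the same direction,
# filter 2 is orthogonal to both, so exactly one duplicate pair.
toy = np.array([[1.0, 0.0], [2.0, 0.0], [0.0, 1.0]])
print(count_duplicate_filters(toy))
```

Taking the absolute value of the similarity also flags sign-flipped copies of a filter, which compute the same feature up to negation.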