Corpus ID: 231632891

KCP: Kernel Cluster Pruning for Dense Labeling Neural Networks

@article{Yu2021KCPKC,
  title={KCP: Kernel Cluster Pruning for Dense Labeling Neural Networks},
  author={Po-Hsiang Yu and Sih-Sian Wu and L. Chen},
  journal={ArXiv},
  year={2021},
  volume={abs/2101.06686}
}
Pruning has become a promising technique for compressing and accelerating neural networks. Existing methods are mainly evaluated on sparse labeling applications. However, dense labeling applications are closer to real-world problems and require real-time processing on resource-constrained mobile devices. Pruning for dense labeling applications is still a largely unexplored field. The prevailing filter channel pruning method removes the entire filter channel. Accordingly, the interaction…
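The abstract contrasts KCP with the prevailing filter channel pruning approach, which removes entire filter channels, typically by ranking filters by a magnitude criterion. A minimal sketch of that baseline (L1-norm filter pruning) is shown below; the tensor shapes, keep ratio, and function name are illustrative assumptions, not details from the paper.

```python
import numpy as np

def prune_filters_l1(weight: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Keep the output-channel filters with the largest L1 norms.

    weight: conv weight of shape (out_channels, in_channels, kH, kW)
    Returns the weight tensor restricted to the strongest filters.
    Hypothetical illustration of magnitude-based channel pruning,
    the baseline the abstract refers to -- not KCP itself.
    """
    # L1 norm of each filter (sum of absolute weights per output channel)
    norms = np.abs(weight).reshape(weight.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * weight.shape[0])))
    # Indices of the n_keep largest-norm filters, in original order
    keep = np.sort(np.argsort(norms)[::-1][:n_keep])
    return weight[keep]

# Example: prune a 64-filter 3x3 conv layer down to half its filters.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 16, 3, 3))
w_pruned = prune_filters_l1(w, keep_ratio=0.5)
print(w_pruned.shape)  # (32, 16, 3, 3)
```

Removing a whole filter also removes the corresponding input channel of the next layer, which is why the abstract notes that channel-level pruning affects inter-filter interactions.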

References

Showing 1-10 of 42 references
Pruning Filters for Efficient ConvNets
HRank: Filter Pruning Using High-Rank Feature Map
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration
Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Importance Estimation for Neural Network Pruning
Neural Network Pruning With Residual-Connections and Limited-Data
  • Jian-Hao Luo, Jianxin Wu — 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020
Structured Pruning of Neural Networks With Budget-Aware Regularization
Rethinking the Value of Network Pruning
ThiNet: A Filter Level Pruning Method for Deep Neural Network Compression