Corpus ID: 1091965

Efficient Sparse-Winograd Convolutional Neural Networks

@article{Liu2017EfficientSC,
  title={Efficient Sparse-Winograd Convolutional Neural Networks},
  author={Xingyu Liu and Jeff Pool and Song Han and W. Dally},
  journal={ArXiv},
  year={2017},
  volume={abs/1802.06367}
}
Convolutional Neural Networks (CNNs) are computationally intensive, which limits their application on mobile devices. [...] Key Method: We propose two modifications to Winograd-based CNNs to enable these methods to exploit sparsity. First, we move the ReLU operation into the Winograd domain to increase the sparsity of the transformed activations. Second, we prune the weights in the Winograd domain to exploit static weight sparsity. For models on CIFAR-10, CIFAR-100 and ImageNet datasets, our method reduces the…
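The key method described above amounts to applying ReLU to the Winograd-transformed activations and keeping a static zero pattern on the Winograd-transformed weights, so the element-wise multiplications in the transform domain can skip zero operands. Below is a minimal NumPy sketch of one F(2x2, 3x3) Winograd tile with both modifications, using the standard transform matrices from Lavin & Gray; the function name, variable names, and the magnitude-based pruning mask are illustrative assumptions, not the authors' released implementation.

import numpy as np

# Winograd F(2x2, 3x3) transform matrices (Lavin & Gray, 2016).
B_T = np.array([[1,  0, -1,  0],
                [0,  1,  1,  0],
                [0, -1,  1,  0],
                [0,  1,  0, -1]], dtype=np.float32)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]], dtype=np.float32)
A_T = np.array([[1, 1,  1,  0],
                [0, 1, -1, -1]], dtype=np.float32)

def sparse_winograd_tile(d, U, mask):
    """Compute one 2x2 output tile from a 4x4 input tile (illustrative sketch).

    d    : 4x4 spatial input tile
    U    : 4x4 filter already transformed into the Winograd domain
    mask : 4x4 binary pruning mask on U (static weight sparsity)
    """
    V = B_T @ d @ B_T.T        # transform the activation tile
    V = np.maximum(V, 0.0)     # ReLU moved into the Winograd domain -> dynamic sparsity in V
    M = (U * mask) * V         # element-wise product; zeros in U*mask and V could be skipped
    return A_T @ M @ A_T.T     # inverse transform back to the 2x2 spatial output

# Illustrative usage: transform a dense 3x3 filter once, then prune it in the Winograd domain.
g = np.random.randn(3, 3).astype(np.float32)
U = G @ g @ G.T
mask = (np.abs(U) > np.percentile(np.abs(U), 60)).astype(np.float32)  # keep ~40% of entries
out = sparse_winograd_tile(np.random.randn(4, 4).astype(np.float32), U, mask)

In a full layer the pruned transformed filters are stored once, so each tile costs only the two activation transforms plus the now-sparse element-wise products.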
68 Citations
Jointly Sparse Convolutional Neural Networks in Dual Spatial-Winograd Domains
  • Y. Choi, Mostafa El-Khamy, J. Lee
  • ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019
  • 2 citations · Highly Influenced
Enabling Sparse Winograd Convolution by Native Pruning
  • 33 citations
A Low-latency Sparse-Winograd Accelerator for Convolutional Neural Networks
  • 5 citations
SpWMM: A High-Performance Sparse-Winograd Matrix-Matrix Multiplication Accelerator for CNNs
  • Di Wu, Wei Cao, L. Wang
  • 2019 International Conference on Field-Programmable Technology (ICFPT), 2019
  • 2 citations · Highly Influenced
Compression of Deep Convolutional Neural Networks under Joint Sparsity Constraints
  • 5 citations · Highly Influenced
Efficient Residue Number System Based Winograd Convolution
DWM: A Decomposable Winograd Method for Convolution Acceleration
  • 3 citations
Skipping CNN Convolutions Through Efficient Memoization
  • 1 citation
SWM: A High-Performance Sparse-Winograd Matrix Multiplication CNN Accelerator
  • Di Wu, Xitian Fan, Wei Cao, Lingli Wang
  • IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 2021
  • Highly Influenced

References

Showing 1-10 of 28 references
Enabling Sparse Winograd Convolution by Native Pruning
  • 33 citations · Highly Influential
XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
  • 1,813 citations
Striving for Simplicity: The All Convolutional Net
  • 2,570 citations · Highly Influential
Fast Algorithms for Convolutional Neural Networks
  • Andrew Lavin, Scott Gray
  • 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016
  • 502 citations · Highly Influential
ImageNet classification with deep convolutional neural networks
  • 62,892 citations · Highly Influential
Very Deep Convolutional Networks for Large-Scale Image Recognition
  • 47,817 citations · Highly Influential
Learning both Weights and Connections for Efficient Neural Network
  • 3,050 citations
Neural Networks with Few Multiplications
  • 245 citations
Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding
  • 4,455 citations
Going deeper with convolutions
  • 23,575 citations · Highly Influential