Corpus ID: 1091965

Efficient Sparse-Winograd Convolutional Neural Networks

@article{Liu2017EfficientSC,
  title={Efficient Sparse-Winograd Convolutional Neural Networks},
  author={Xingyu Liu and Jeff Pool and Song Han and William J. Dally},
  journal={ArXiv},
  year={2018},
  volume={abs/1802.06367}
}
Convolutional Neural Networks (CNNs) are computationally intensive, which limits their application on mobile devices. [...] We propose two modifications to Winograd-based CNNs to enable these methods to exploit sparsity. First, we move the ReLU operation into the Winograd domain to increase the sparsity of the transformed activations. Second, we prune the weights in the Winograd domain to exploit static weight sparsity. For models on the CIFAR-10, CIFAR-100 and ImageNet datasets, our method reduces the…
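The two modifications described in the abstract can be sketched for a single Winograd F(2x2, 3x3) tile. This is a minimal illustration, not the paper's implementation: the function name `winograd_tile` and the magnitude threshold `prune_thresh` are assumptions for the example, and the transform matrices are the standard F(2x2, 3x3) matrices from Lavin and Gray's fast-convolution formulation.

```python
import numpy as np

# Standard Winograd F(2x2, 3x3) transform matrices (Lavin & Gray).
B_T = np.array([[1, 0, -1, 0],
                [0, 1,  1, 0],
                [0, -1, 1, 0],
                [0, 1,  0, -1]], dtype=float)
G = np.array([[1.0, 0.0, 0.0],
              [0.5, 0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0, 0.0, 1.0]])
A_T = np.array([[1, 1,  1,  0],
                [0, 1, -1, -1]], dtype=float)

def winograd_tile(d, g, prune_thresh=0.0):
    """Compute one 2x2 output tile from a 4x4 input tile `d` and a 3x3
    kernel `g`, with both of the paper's modifications applied:
      1. weights are pruned (thresholded) in the Winograd domain, and
      2. ReLU is applied to the *transformed* activations rather than
         the spatial ones.
    `prune_thresh` is a hypothetical magnitude threshold for this sketch.
    """
    U = G @ g @ G.T                                   # 4x4 transformed weights
    U = np.where(np.abs(U) >= prune_thresh, U, 0.0)   # Winograd-domain pruning
    V = B_T @ d @ B_T.T                               # 4x4 transformed activations
    V = np.maximum(V, 0.0)                            # Winograd-domain ReLU
    return A_T @ (U * V) @ A_T.T                      # inverse transform: 2x2 output
```

Note that moving ReLU into the Winograd domain changes the network's function relative to a spatial-domain ReLU, so a network using this operation must be trained (or retrained) with it, as the paper does.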
Jointly Sparse Convolutional Neural Networks in Dual Spatial-winograd Domains
Enabling Sparse Winograd Convolution by Native Pruning
A Low-latency Sparse-Winograd Accelerator for Convolutional Neural Networks
SpWMM: A High-Performance Sparse-Winograd Matrix-Matrix Multiplication Accelerator for CNNs
  • Di Wu, Wei Cao, L. Wang
  • Computer Science
  • 2019 International Conference on Field-Programmable Technology (ICFPT)
  • 2019
Compression of Deep Convolutional Neural Networks under Joint Sparsity Constraints
Efficient Residue Number System Based Winograd Convolution
DWM: A Decomposable Winograd Method for Convolution Acceleration
Skipping CNN Convolutions Through Efficient Memoization
SWM: A High-Performance Sparse-Winograd Matrix Multiplication CNN Accelerator
  • Di Wu, Xitian Fan, Wei Cao, Lingli Wang
  • Computer Science
  • IEEE Transactions on Very Large Scale Integration (VLSI) Systems
  • 2021

References

SHOWING 1-10 OF 28 REFERENCES
Enabling Sparse Winograd Convolution by Native Pruning
XNOR-Net: ImageNet Classification Using Binary Convolutional Neural Networks
Striving for Simplicity: The All Convolutional Net
Fast Algorithms for Convolutional Neural Networks
  • Andrew Lavin, Scott Gray
  • Computer Science
  • 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2016
ImageNet classification with deep convolutional neural networks
Very Deep Convolutional Networks for Large-Scale Image Recognition
Learning both Weights and Connections for Efficient Neural Network
Neural Networks with Few Multiplications
Deep Compression: Compressing Deep Neural Network with Pruning, Trained Quantization and Huffman Coding
Going deeper with convolutions