Corpus ID: 173188712

DeepShift: Towards Multiplication-Less Neural Networks

@article{Elhoushi2019DeepShiftTM,
  title={DeepShift: Towards Multiplication-Less Neural Networks},
  author={Mostafa Elhoushi and Farhan Shafiq and Y. Tian and Joey Yiwei Li and Zihao Chen},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.13298}
}
Deployment of convolutional neural networks (CNNs) in mobile environments is hampered by their high computation and power budgets, which prove to be a major bottleneck. Convolution layers and fully connected layers, because of their intensive use of multiplications, are the dominant contributors to this computation budget. This paper proposes to tackle this problem by introducing two new operations, convolutional shifts and fully-connected shifts, that replace multiplications altogether with bitwise shift and sign…
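The core idea described in the abstract, constraining each weight to a signed power of two so that every multiplication becomes a sign flip plus a bitwise shift, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the helper names (`quantize_to_shift`, `shift_linear`) are made up for this example, and the shift is emulated with `np.ldexp` (x * 2**p) so it also works for floating-point inputs and negative exponents.

```python
import numpy as np

def quantize_to_shift(w):
    """Round each weight to the nearest signed power of two: w ~= sign * 2**p."""
    sign = np.sign(w)
    p = np.round(np.log2(np.abs(w) + 1e-12)).astype(int)
    return sign, p

def shift_linear(x, sign, p):
    """Compute x @ W where W[i, j] = sign[i, j] * 2**p[i, j].

    For integer x and p >= 0 this would be a left shift (x << p);
    np.ldexp(x, p) == x * 2**p emulates it for any real x and integer p.
    """
    shifted = np.ldexp(x[:, :, None], p[None, :, :])  # broadcast over outputs
    return (sign[None, :, :] * shifted).sum(axis=1)

# Toy example: a 3-input, 2-output "fully-connected shift" layer.
x = np.array([[1.0, 2.0, 4.0]])
w = np.array([[0.5, -1.9],
              [0.26, 1.1],
              [-2.2, 0.13]])
sign, p = quantize_to_shift(w)   # weights become ±2**p, e.g. -1.9 -> -2
y = shift_linear(x, sign, p)
```

Here `quantize_to_shift` plays the role of the rounding step that the paper's training schemes would apply to the weights; the forward pass then needs no multiplier, only shifters and sign logic.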
Citations

• ShiftAddNet: A Hardware-Inspired Deep Network
• Bipolar morphological U-Net for document binarization
• Pruning and Quantization for Deep Neural Network Acceleration: A Survey
• AdderNet and its Minimalist Hardware Design for Energy-Efficient Artificial Intelligence
• Design Considerations for Edge Neural Network Accelerators: An Industry Perspective
  Arnab Raha, Sang Kyun Kim, +5 authors G. Chinya. 2021 34th International Conference on VLSI Design and 2021 20th International Conference on Embedded Systems (VLSID), 2021.
• Multiplierless MP-Kernel Machine For Energy-efficient Edge Devices
• ENOS: Energy-Aware Network Operator Search for Hybrid Digital and Compute-in-Memory DNN Accelerators
• ResNet-like Architecture with Low Hardware Requirements

References

Showing 1-10 of 39 references:

• Towards Accurate Binary Convolutional Neural Network
• BinaryConnect: Training Deep Neural Networks with binary weights during propagations
• Trained Ternary Quantization