Corpus ID: 15604580

Bitwise Neural Networks

@article{Kim2016BitwiseNN,
  title={Bitwise Neural Networks},
  author={Minje Kim and P. Smaragdis},
  journal={ArXiv},
  year={2016},
  volume={abs/1601.06071}
}
Based on the assumption that there exists a neural network that efficiently represents a set of Boolean functions between all binary inputs and outputs, we propose a process for developing and deploying neural networks whose weight parameters, bias terms, input, and intermediate hidden layer output signals are all binary-valued, and require only basic bit logic for the feedforward pass. [...] In order to design such networks, we propose to add a few training schemes, such as weight compression and …
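
As an illustration of the abstract's claim that the feedforward pass needs only basic bit logic: when weights, inputs, and activations are all in {-1, +1}, a neuron's dot product reduces to XNOR followed by a popcount. The sketch below is a minimal, hypothetical rendering of that idea, not the paper's reference implementation; the function names (bitwise_neuron, bitwise_layer) and the integer-valued bias are assumptions made for readability.

```python
# Minimal sketch of a bitwise feedforward pass, assuming the usual encoding of
# bipolar values {-1, +1} as bits {0, 1}. Helper names are hypothetical.

def bitwise_neuron(x_bits: int, w_bits: int, b: int, n: int) -> int:
    """One binary neuron over n-bit inputs.

    x_bits, w_bits: n-bit integers encoding {-1, +1} vectors (bit 1 -> +1, bit 0 -> -1).
    b: bias term (an integer here for clarity; binary-valued in the paper's setting).
    Returns 1 if the bipolar pre-activation is >= 0, else 0 (sign activation).
    """
    mask = (1 << n) - 1
    agree = ~(x_bits ^ w_bits) & mask       # XNOR: bit set wherever x_i == w_i
    matches = bin(agree).count("1")         # popcount of agreeing positions
    # Bipolar dot product: (+1)*matches + (-1)*(n - matches) = 2*matches - n
    pre_activation = 2 * matches - n + b
    return 1 if pre_activation >= 0 else 0


def bitwise_layer(x_bits: int, weights: list, biases: list, n: int) -> int:
    """Feedforward through one layer; returns the packed binary output vector."""
    out = 0
    for j, (w, b) in enumerate(zip(weights, biases)):
        out |= bitwise_neuron(x_bits, w, b, n) << j
    return out
```

For example, with n = 4, the inputs x = 0b1010 and weights w = 0b1011 agree on three bit positions, so the bipolar pre-activation before the bias is 2*3 - 4 = 2 and the neuron outputs 1.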

Citations
Binary Ensemble Neural Network: More Bits per Network or More Networks per Bit?
  • Shilin Zhu, Xin Dong, Hao Su
  • Computer Science, Mathematics
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2019
Bitwise Neural Networks for Efficient Single-Channel Source Separation
  • Minje Kim, P. Smaragdis
  • Computer Science
  • 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2018
Low-Complexity Approximate Convolutional Neural Networks
Balanced Quantization: An Effective and Efficient Approach to Quantized Neural Networks
A Survey on Methods and Theories of Quantized Neural Networks
...

References

SHOWING 1-10 OF 22 REFERENCES
Fixed-point feedforward deep neural network design using weights +1, 0, and −1
Training deep neural networks with low precision multiplications
Low precision arithmetic for deep learning
Dropout: a simple way to prevent neural networks from overfitting
On Learning µ-Perceptron Networks with Binary Weights
A Fast Learning Algorithm for Deep Belief Nets
Weight discretization paradigm for optical neural networks
Gradient-based learning applied to document recognition
Learning Deep Architectures for AI
...