BinaryNet: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1

@article{Courbariaux2016BinaryNetTD,
  title={BinaryNet: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1},
  author={Matthieu Courbariaux and Yoshua Bengio},
  journal={CoRR},
  year={2016},
  volume={abs/1602.02830}
}
We introduce BinaryNet, a method which trains DNNs with binary weights and activations when computing the parameters' gradients. We show that it is possible to train a Multi-Layer Perceptron (MLP) on MNIST and ConvNets on CIFAR-10 and SVHN with BinaryNet and achieve nearly state-of-the-art results. At run-time, BinaryNet drastically reduces memory usage and replaces most multiplications with 1-bit exclusive-not-or (XNOR) operations, which might have a big impact on both general-purpose and dedicated…
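The run-time trick the abstract alludes to can be sketched in a few lines: once weights and activations are constrained to +1/-1, a dot product reduces to an XNOR over packed bit vectors followed by a population count. The sketch below is illustrative, not the paper's implementation; all names (`binarize`, `dot_xnor`, `pack`) are ours.

```python
def binarize(x):
    """Map a real value to +1/-1 via the sign function (0 maps to +1)."""
    return 1 if x >= 0 else -1

def dot_float(a, b):
    """Ordinary dot product, for comparison."""
    return sum(x * y for x, y in zip(a, b))

def dot_xnor(a_bits, b_bits, n):
    """Dot product of two +1/-1 vectors packed as n-bit masks
    (bit i set means element i is +1). XNOR marks positions where
    the signs match, so the dot product equals
    (#matches - #mismatches) = 2 * popcount(xnor) - n."""
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
    return 2 * bin(xnor).count("1") - n

# Pack a +1/-1 vector into an integer bit mask: +1 -> bit set.
pack = lambda v: sum(1 << i for i, s in enumerate(v) if s == 1)

a = [0.3, -1.2, 0.7, -0.5]
b = [-0.8, -0.1, 0.4, 0.9]
ab = [binarize(x) for x in a]   # [+1, -1, +1, -1]
bb = [binarize(x) for x in b]   # [-1, -1, +1, +1]

# The XNOR/popcount result matches the ordinary dot product of the
# binarized vectors.
assert dot_xnor(pack(ab), pack(bb), len(a)) == dot_float(ab, bb)
```

On hardware, the XNOR and popcount each cover a whole machine word per instruction, which is where the claimed speed and memory savings come from.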
Highly Influential: this paper has highly influenced 17 other papers.
Highly Cited: this paper has 270 citations.
Related Discussions: this paper has been referenced on Twitter 374 times.

Citations

Publications citing this paper.
Showing 2 of 180 extracted citations

Multi-precision convolutional neural networks on heterogeneous hardware

2018 Design, Automation & Test in Europe Conference & Exhibition (DATE) • 2018
Highly Influenced

FxpNet: Training a deep convolutional neural network in fixed-point representation

2017 International Joint Conference on Neural Networks (IJCNN) • 2017
Highly Influenced

Citations per Year
Semantic Scholar estimates that this publication has 270 citations (2016–2019), based on the available data.

References

Publications referenced by this paper.
This paper references 44 publications.
