BinaryNet: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1

@article{Courbariaux2016BinaryNetTD,
  title={BinaryNet: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1},
  author={Matthieu Courbariaux and Yoshua Bengio},
  journal={ArXiv},
  year={2016},
  volume={abs/1602.02830}
}
We introduce BinaryNet, a method that trains DNNs with binary weights and activations, which are used when computing the parameters' gradients. We show that it is possible to train a Multi-Layer Perceptron (MLP) on MNIST and ConvNets on CIFAR-10 and SVHN with BinaryNet and achieve nearly state-of-the-art results. At run-time, BinaryNet drastically reduces memory usage and replaces most multiplications with 1-bit exclusive-NOR (XNOR) operations, which might have a large impact on both general-purpose and dedicated…
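The abstract's claim that multiplications become 1-bit XNOR operations follows from a simple identity: for vectors with entries in {−1, +1}, encoded as bits (+1 → 1, −1 → 0), the dot product equals `2 * popcount(xnor(a, b)) - n`. A minimal sketch (not the authors' code; `binarize` and `xnor_dot` are illustrative names):

```python
def binarize(vec):
    """Encode a {-1, +1} vector as an integer bit mask (+1 -> 1, -1 -> 0)."""
    bits = 0
    for i, v in enumerate(vec):
        if v == 1:
            bits |= 1 << i
    return bits

def xnor_dot(a_bits, b_bits, n):
    """Dot product of two {-1, +1} vectors of length n via XNOR + popcount.

    XNOR yields a 1 exactly where the two vectors agree; each agreement
    contributes +1 to the dot product and each disagreement -1, hence
    dot = matches - (n - matches) = 2 * matches - n.
    """
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)  # 1 where entries agree
    matches = bin(xnor).count("1")              # population count
    return 2 * matches - n

a = [1, -1, 1, 1]
b = [1, 1, -1, 1]
assert xnor_dot(binarize(a), binarize(b), 4) == sum(x * y for x, y in zip(a, b))
```

On dedicated hardware, the XNOR and popcount operate on whole machine words at once, which is the source of the speed and memory savings the abstract refers to.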

Citations

Publications citing this paper.
Showing 1–10 of 285 citations (estimated 98% coverage).

A Targeted Acceleration and Compression Framework for Low bit Neural Networks

Biao Qian, Yang Wang
  • ArXiv
  • 2019
  • Cites background & methods (highly influenced)

DeltaNet: Differential Binary Neural Network

Yuka Oba, Kota Ando, Tetsuya Asai, Masato Motomura, Shinya Takamaeda-Yamazaki
  • 2019 IEEE 30th International Conference on Application-specific Systems, Architectures and Processors (ASAP)
  • 2019
  • Cites background (highly influenced)

Scaling Deep Spiking Neural Networks with Binary Stochastic Activations

  • 2019 IEEE International Conference on Cognitive Computing (ICCC)
  • 2019
  • Cites background & methods (highly influenced)

Stochastic Quantization for Learning Accurate Low-Bit Deep Neural Networks

  • International Journal of Computer Vision
  • 2019
  • Cites methods & background (highly influenced)

Development of an Autonomous Driving Robot Car Using FPGA

  • 2018 International Conference on Field-Programmable Technology (FPT)
  • 2018
  • Cites methods (highly influenced)

Efficient Processing of Deep Neural Networks: A Tutorial and Survey

  • Proceedings of the IEEE
  • 2017
  • Cites methods & background (highly influenced)

A Review of Binarized Neural Networks

  • Cites background, results & methods (highly influenced)

HadaNets: Flexible Quantization Strategies for Neural Networks

  • CVPR Workshops
  • 2019
  • Cites background & methods (highly influenced)

Design Flow of Accelerating Hybrid Extremely Low Bit-Width Neural Network in Embedded FPGA

  • 2018 28th International Conference on Field Programmable Logic and Applications (FPL)
  • 2018
  • Cites methods & background (highly influenced)

Dither NN: An Accurate Neural Network with Dithering for Low Bit-Precision Hardware

  • 2018 International Conference on Field-Programmable Technology (FPT)
  • 2018
  • Cites background & methods (highly influenced)

CITATION STATISTICS

  • 35 highly influenced citations

  • Averaged 82 citations per year from 2017 through 2019