• Published 2016

SQUEEZENET: ALEXNET-LEVEL ACCURACY WITH 50X FEWER PARAMETERS AND <0.5MB MODEL SIZE

@inproceedings{Iandola2016LA,
  title={SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size},
  author={Forrest N. Iandola and Song Han and Matthew W. Moskewicz and Khalid Ashraf and William J. Dally and Kurt Keutzer},
  year={2016}
}
Recent research on deep convolutional neural networks (CNNs) has focused primarily on improving accuracy. For a given accuracy level, it is typically possible to identify multiple CNN architectures that achieve that accuracy level. With equivalent accuracy, smaller CNN architectures offer at least three advantages: (1) Smaller CNNs require less communication across servers during distributed training. (2) Smaller CNNs require less bandwidth to export a new model from the cloud to an autonomous…
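To make the headline claim concrete, the short Python sketch below compares approximate weight-storage sizes. The parameter counts (roughly 60M for AlexNet, roughly 1.25M for SqueezeNet) and the pruning-plus-quantization figures are approximate values recalled from the paper rather than numbers stated on this page, so treat them as illustrative assumptions.

# Back-of-the-envelope comparison of the headline claim. The parameter counts
# are approximate figures recalled from the SqueezeNet paper (assumptions, not
# values quoted on this page).

ALEXNET_PARAMS = 60_000_000      # ~60M parameters (approximate)
SQUEEZENET_PARAMS = 1_250_000    # ~1.25M parameters (approximate)


def model_size_mb(num_params: float, bits_per_param: float) -> float:
    """Storage needed for the weights alone, in megabytes."""
    return num_params * bits_per_param / 8 / 1e6


# Uncompressed 32-bit floating-point weights.
print(f"AlexNet    @ fp32: {model_size_mb(ALEXNET_PARAMS, 32):6.1f} MB")
print(f"SqueezeNet @ fp32: {model_size_mb(SQUEEZENET_PARAMS, 32):6.1f} MB")
print(f"Parameter ratio:   {ALEXNET_PARAMS / SQUEEZENET_PARAMS:.0f}x")

# Deep Compression (a technique the paper combines with SqueezeNet) prunes and
# quantizes the weights; modelling it very roughly as keeping ~1/3 of the
# weights at ~6 bits each (sparse-index overhead ignored) lands under 0.5 MB.
print(f"SqueezeNet, pruned + ~6-bit: {model_size_mb(SQUEEZENET_PARAMS / 3, 6):.2f} MB")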


Citations

Publications citing this paper (showing 5 of 659).

Benchmarking TPU, GPU, and CPU Platforms for Deep Learning
  • ArXiv, 2019
  • Cites methods & background; highly influenced

Presentation Attack Detection Using a Tiny Fully Convolutional Network
  • IEEE Transactions on Information Forensics and Security, 2019
  • Cites methods; highly influenced

Recent progresses on object detection: a brief review
  • Multimedia Tools and Applications, 2019
  • Cites methods; highly influenced

Reflecting After Learning for Understanding
  • Cites methods & background; highly influenced

Spoofing and Anti-Spoofing with Wax Figure Faces
  • ArXiv, 2019
  • Cites background; highly influenced


CITATION STATISTICS

  • 144 highly influenced citations

  • Averaged 218 citations per year from 2017 through 2019
