Learning both Weights and Connections for Efficient Neural Networks

@inproceedings{Han2015LearningBW,
  title={Learning both Weights and Connections for Efficient Neural Networks},
  author={Song Han and Jeff Pool and John Tran and William J. Dally},
  booktitle={NIPS},
  year={2015}
}
Neural networks are both computationally intensive and memory intensive, making them difficult to deploy on embedded systems. Also, conventional networks fix the architecture before training starts; as a result, training cannot improve the architecture. To address these limitations, we describe a method to reduce the storage and computation required by neural networks by an order of magnitude without affecting their accuracy by learning only the important connections. Our method prunes redundant connections using a three-step method: first, we train the network to learn which connections are important; next, we prune the unimportant connections; finally, we retrain the network to fine-tune the weights of the remaining connections. On the ImageNet dataset, the method reduced the number of parameters of AlexNet by 9×, from 61 million to 6.7 million, without incurring accuracy loss, and reduced the parameters of VGG-16 by 13×, from 138 million to 10.3 million, again with no loss of accuracy.
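
The three-step pipeline described in the abstract (train, prune low-magnitude connections, retrain the survivors) can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a single fully connected PyTorch layer standing in for a trained network, a made-up regression task, and illustrative helper names (prune_by_magnitude, retrain_step). The paper sets the pruning threshold per layer from a quality parameter and iterates pruning and retraining; here a single pass with a target sparsity chosen via a quantile is used as a simplification.

import torch
import torch.nn as nn

def prune_by_magnitude(layer: nn.Linear, sparsity: float) -> torch.Tensor:
    """Zero the smallest-magnitude weights of `layer` and return the keep-mask."""
    w = layer.weight.data
    threshold = torch.quantile(w.abs(), sparsity)   # magnitude cutoff (simplification)
    mask = (w.abs() > threshold).float()            # 1 = keep connection, 0 = prune
    w.mul_(mask)                                    # remove unimportant connections
    return mask

def retrain_step(layer: nn.Linear, mask: torch.Tensor,
                 x: torch.Tensor, y: torch.Tensor, lr: float = 1e-2) -> None:
    """One fine-tuning step that keeps pruned connections fixed at zero."""
    layer.zero_grad()
    loss = nn.functional.mse_loss(layer(x), y)
    loss.backward()
    with torch.no_grad():
        layer.weight -= lr * layer.weight.grad * mask   # update surviving weights only
        layer.bias -= lr * layer.bias.grad

if __name__ == "__main__":
    torch.manual_seed(0)
    layer = nn.Linear(64, 10)                        # stand-in for a trained layer
    x, y = torch.randn(32, 64), torch.randn(32, 10)  # toy regression data
    mask = prune_by_magnitude(layer, sparsity=0.9)   # drop ~90% of connections
    for _ in range(200):                             # retrain the remaining weights
        retrain_step(layer, mask, x, y)
    print(f"nonzero weights kept: {int(mask.sum())}/{mask.numel()}")

Masking the gradient update is what distinguishes retraining from ordinary fine-tuning: pruned connections stay at zero, so the learned sparsity pattern is preserved while the surviving weights recover the accuracy lost to pruning.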