PerforatedCNNs: Acceleration through Elimination of Redundant Convolutions


We propose a novel approach to reduce the computational cost of evaluating convolutional neural networks, a factor that has hindered their deployment on low-power devices such as mobile phones. Inspired by the loop perforation technique from source code optimization, we speed up the bottleneck convolutional layers by skipping their evaluation in some of…
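The perforation idea summarized in the abstract — evaluating a convolution only at a subset of output positions and filling in the skipped positions by interpolation — can be illustrated with a minimal NumPy sketch. The uniform-grid mask, the `stride` parameter, and the nearest-neighbor fill below are illustrative assumptions for this sketch, not the paper's exact perforation masks:

```python
import numpy as np

def conv2d_full(x, w):
    """Dense 'valid' 2D convolution (cross-correlation), for reference."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def conv2d_perforated(x, w, stride=2):
    """Perforated convolution sketch: evaluate the convolution only on a
    regular grid of output positions (a hypothetical uniform perforation
    mask), then fill the skipped positions from the nearest evaluated
    neighbor instead of computing them."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    # Evaluate the expensive convolution only at mask positions.
    rows = np.arange(0, oh, stride)
    cols = np.arange(0, ow, stride)
    for i in rows:
        for j in cols:
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    # Fill every output position from its nearest evaluated position
    # (nearest-neighbor interpolation of the sparse result).
    nearest_r = np.clip(np.round(np.arange(oh) / stride).astype(int) * stride,
                        0, rows[-1])
    nearest_c = np.clip(np.round(np.arange(ow) / stride).astype(int) * stride,
                        0, cols[-1])
    return out[np.ix_(nearest_r, nearest_c)]
```

With `stride=2` only roughly a quarter of the output positions are actually convolved, so the inner loop does about 4x less work; the returned map agrees with the dense convolution exactly at the evaluated positions and approximates it elsewhere.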


9 Figures and Tables


Citations per Year

Citation velocity: 97 citations per year, averaged over the last 3 years.


Cite this paper

@inproceedings{Figurnov2016PerforatedCNNsAT,
  title     = {PerforatedCNNs: Acceleration through Elimination of Redundant Convolutions},
  author    = {Mikhail Figurnov and Aizhan Ibraimova and Dmitry P. Vetrov and Pushmeet Kohli},
  booktitle = {NIPS},
  year      = {2016}
}