Corpus ID: 162184071

Network Pruning via Transformable Architecture Search

@inproceedings{Dong2019NetworkPV,
  title={Network Pruning via Transformable Architecture Search},
  author={Xuanyi Dong and Yi Yang},
  booktitle={NeurIPS},
  year={2019}
}
    Abstract

    Network pruning reduces the computation costs of an over-parameterized network without performance damage. [...]

    Key method: the number of channels/layers is learned by minimizing the loss of the pruned networks. The feature map of the pruned network is an aggregation of K feature map fragments (generated by K networks of different sizes), which are sampled based on a learnable probability distribution. The loss can therefore be back-propagated not only to the network weights, but also to the parameterized distribution to…
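    The aggregation mechanism described above lends itself to a short illustration. Below is a minimal sketch in PyTorch (not the authors' code) of a convolution whose output is a weighted sum of K fragments of different channel widths, with the weights drawn from a learnable distribution via Gumbel-softmax, so that the loss trains both the convolution weights and the channel-count distribution. The class name, the candidate counts, and the zero-padding used to align fragment widths are illustrative assumptions; the paper itself aligns fragments of different sizes via channel-wise interpolation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SearchableConv(nn.Module):
    """Conv layer whose effective output width is searched, not fixed (sketch).

    The forward pass computes the max-width feature map once, slices K
    candidate-width fragments from it, zero-pads them to a common width
    (a simplification of the paper's channel-wise interpolation), and sums
    them weighted by a Gumbel-softmax sample over learnable logits. The
    loss therefore back-propagates both to the conv weights and to the
    logits parameterizing the channel-count distribution.
    """
    def __init__(self, in_ch, max_out_ch, candidates=(8, 16, 32)):
        super().__init__()
        assert max(candidates) <= max_out_ch
        self.conv = nn.Conv2d(in_ch, max_out_ch, kernel_size=3, padding=1)
        self.candidates = candidates                   # K candidate channel counts
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))  # distribution logits

    def forward(self, x, tau=1.0):
        full = self.conv(x)                            # max-width feature map
        probs = F.gumbel_softmax(self.alpha, tau=tau)  # differentiable sample over K choices
        out = torch.zeros_like(full)
        for k, c in enumerate(self.candidates):
            frag = full[:, :c]                         # fragment with c channels
            frag = F.pad(frag, (0, 0, 0, 0, 0, full.size(1) - c))  # zero-pad to common width
            out = out + probs[k] * frag                # probability-weighted aggregation
        return out

layer = SearchableConv(3, 32)
out = layer(torch.randn(2, 3, 16, 16))
out.sum().backward()  # gradients reach both layer.conv.weight and layer.alpha

    After search, the most probable candidate width under alpha would be selected as the pruned layer's size; the sketch covers only the differentiable search step described in the abstract.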

    Citations

    Publications citing this paper.
    Showing 4 of 20 citations:

    • Knapsack Pruning with Inner Distillation
      Cites methods, results & background; highly influenced (9 excerpts).

    • Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio
      Cites background; highly influenced (4 excerpts).

    • One-Shot Neural Architecture Search via Self-Evaluated Template Network
      Cites background (1 excerpt).

    • Teacher Supervises Students How to Learn From Partially Labeled Images for Facial Landmark Detection
      Cites background (1 excerpt).

    References

    Publications referenced by this paper.
    Showing 6 of 51 references:

    • Rethinking the Value of Network Pruning
      Highly influential (5 excerpts).

    • Importance Estimation for Neural Network Pruning

    • Pruning Filters for Efficient ConvNets
      Highly influential (9 excerpts).

    • Efficient Architecture Search by Network Transformation
      Highly influential (3 excerpts).

    • Cascaded Projection: End-To-End Network Compression and Acceleration
      Cited in 1 excerpt.