Corpus ID: 17712724

Diversity Networks: Neural Network Compression Using Determinantal Point Processes

@article{Mariet2015DiversityNN,
  title={Diversity Networks: Neural Network Compression Using Determinantal Point Processes},
  author={Zelda Mariet and Suvrit Sra},
  journal={arXiv: Learning},
  year={2015}
}
  • Zelda Mariet, Suvrit Sra
  • Published 2015
  • Mathematics, Computer Science
  • arXiv: Learning
  • We introduce Divnet, a flexible technique for learning networks with diverse neurons. Divnet models neuronal diversity by placing a Determinantal Point Process (DPP) over neurons in a given layer. It uses this DPP to select a subset of diverse neurons and subsequently fuses the redundant neurons into the selected ones. Compared with previous approaches, Divnet offers a more principled, flexible technique for capturing neuronal diversity and thus implicitly enforcing regularization. This enables…
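
    The abstract describes a two-step compression scheme: sample a diverse subset of a layer's neurons under a DPP defined on their activations, then fuse the redundant neurons into the selected ones. The snippet below is a minimal illustrative sketch of that idea, not the authors' implementation: it builds an RBF similarity kernel over neuron activations, substitutes a greedy log-determinant heuristic for exact k-DPP sampling, and fuses pruned neurons by least-squares reweighting of the outgoing weights. All names and parameters here (rbf_kernel, greedy_diverse_subset, fuse_pruned_neurons, gamma, k) are assumptions introduced for illustration.

    # Minimal sketch of DPP-style diverse-neuron selection and fusion, in the
    # spirit of the Divnet description above. Illustrative only: exact k-DPP
    # sampling is replaced by a greedy log-det heuristic, and all names are
    # assumptions, not the authors' code.
    import numpy as np

    def rbf_kernel(acts, gamma=1.0):
        # acts: (n_samples, n_neurons) activations of one layer on held-out data.
        # Returns an (n_neurons, n_neurons) PSD similarity kernel L.
        sq = np.sum(acts ** 2, axis=0)
        d2 = sq[:, None] + sq[None, :] - 2.0 * acts.T @ acts
        return np.exp(-gamma * d2)

    def greedy_diverse_subset(L, k):
        # Greedily pick k neurons that approximately maximize det(L_S),
        # i.e. a diverse subset under the kernel L.
        selected = []
        for _ in range(k):
            best, best_gain = None, -np.inf
            for j in range(L.shape[0]):
                if j in selected:
                    continue
                idx = selected + [j]
                sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
                gain = logdet if sign > 0 else -1e18
                if gain > best_gain:
                    best, best_gain = j, gain
            selected.append(best)
        return sorted(selected)

    def fuse_pruned_neurons(acts, W_out, keep):
        # Fold the pruned neurons' outgoing weights into the kept ones:
        # regress pruned activations onto kept activations (least squares),
        # then add the induced contribution to the kept neurons' weights.
        # W_out: (n_neurons, n_out) outgoing weight matrix of the layer.
        drop = [i for i in range(W_out.shape[0]) if i not in keep]
        A_keep, A_drop = acts[:, keep], acts[:, drop]
        coeffs, *_ = np.linalg.lstsq(A_keep, A_drop, rcond=None)   # (k, n_drop)
        return W_out[keep] + coeffs @ W_out[drop]

    # Toy usage: compress a layer of 32 neurons down to 8.
    rng = np.random.default_rng(0)
    acts = rng.standard_normal((200, 32))    # layer activations on held-out data
    W_out = rng.standard_normal((32, 10))    # outgoing weights of that layer
    keep = greedy_diverse_subset(rbf_kernel(acts, gamma=0.1), k=8)
    W_out_small = fuse_pruned_neurons(acts, W_out, keep)
    print(len(keep), W_out_small.shape)      # 8 (8, 10)

    The least-squares fusion step is what distinguishes this sketch from plain pruning: the dropped neurons' contribution to the next layer is approximately preserved rather than discarded, matching the "fuses the redundant neurons into the selected ones" step described above.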

    Citations

    Publications citing this paper.
    SHOWING A SELECTION OF 29 CITATIONS, ESTIMATED 94% COVERAGE

    DPPNet: Approximating Determinantal Point Processes with Deep Networks

    CITES BACKGROUND

    Channel Pruning for Accelerating Very Deep Neural Networks

    • Yihui He, Xiangyu Zhang, Jian Sun
    • Computer Science
    • 2017 IEEE International Conference on Computer Vision (ICCV)
    • 2017

    Pruning the Convolution Neural Network (SqueezeNet) based on L2 Normalization of Activation Maps

    CITES METHODS

    Pruning filters with L1-norm and standard deviation for CNN compression

    CITES BACKGROUND

    CITATION STATISTICS

    • 2 Highly Influenced Citations

    • Averaged 8 citations per year from 2018 through 2020

    References

    Publications referenced by this paper.
    SHOWING A SELECTION OF 28 REFERENCES

    Data-free Parameter Pruning for Deep Neural Networks

    HIGHLY INFLUENTIAL

    Reshaping deep neural network for fast decoding by node-pruning

    HIGHLY INFLUENTIAL

    Determinantal Point Processes for Machine Learning

    HIGHLY INFLUENTIAL