Corpus ID: 53388625

The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks

@article{Frankle2019TheLT,
  title={The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks},
  author={Jonathan Frankle and Michael Carbin},
  journal={arXiv: Learning},
  year={2019}
}
  • Neural network pruning techniques can reduce the parameter counts of trained networks by over 90%, decreasing storage requirements and improving the computational performance of inference without compromising accuracy. [...] Key method: we present an algorithm to identify winning tickets and a series of experiments that support the lottery ticket hypothesis and the importance of these fortuitous initializations. We consistently find winning tickets that are less than 10–20% of the size of several fully-connected and…
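The identification algorithm summarized in the abstract (iterative magnitude pruning with rewinding to the original initialization) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; the names `train_fn`, `prune_frac`, and `rounds` are placeholders, and the weights are modeled as a flat NumPy array rather than a real network:

```python
import numpy as np

def find_winning_ticket(init_weights, train_fn, prune_frac=0.2, rounds=3):
    """Sketch of iterative magnitude pruning with rewinding.

    Repeats: (1) train the masked subnetwork from the original
    initialization, (2) prune the lowest-magnitude fraction of the
    weights that are still active, (3) rewind survivors to their
    initial values. Returns the "winning ticket": the surviving
    initial weights plus the binary mask.
    """
    mask = np.ones_like(init_weights)
    for _ in range(rounds):
        # Train the current subnetwork (pruned weights stay zero).
        trained = train_fn(init_weights * mask) * mask
        # Threshold at the prune_frac-quantile of active magnitudes.
        active_magnitudes = np.abs(trained[mask == 1])
        threshold = np.quantile(active_magnitudes, prune_frac)
        # Keep only weights whose trained magnitude exceeds it.
        mask = mask * (np.abs(trained) > threshold)
    # Rewind: surviving weights take their ORIGINAL initial values.
    return init_weights * mask, mask

# Toy usage with a stand-in "training" function that just perturbs weights.
rng = np.random.default_rng(0)
w0 = rng.normal(size=1000)
toy_train = lambda w: w + 0.01 * rng.normal(size=w.shape)
ticket, mask = find_winning_ticket(w0, toy_train, prune_frac=0.2, rounds=3)
```

With `prune_frac=0.2` and three rounds, roughly 0.8³ ≈ 51% of weights survive, and every surviving weight equals its value at initialization, which is the point of the hypothesis: the subnetwork plus its original initialization together form the winning ticket.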

    Citations

    Publications citing this paper.
    SHOWING 1-10 OF 344 CITATIONS

    • The Lottery Ticket Hypothesis at Scale (cites background & results)

    • Rigging the Lottery: Making All Tickets Winners (cites background; highly influenced)

    • Sparse Transfer Learning via Winning Lottery Tickets (cites background, methods & results; highly influenced)

    • Drawing early-bird tickets: Towards more efficient training of deep networks (cites background, methods & results; highly influenced)

    • Using Winning Lottery Tickets in Transfer Learning for Convolutional Neural Networks (cites methods & background; highly influenced)

    • Calibrate and Prune: Improving Reliability of Lottery Tickets Through Prediction Calibration (cites methods, background & results; highly influenced)

    • Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask (cites background & methods; highly influenced)

    • Evaluating Lottery Tickets Under Distributional Shifts (cites background & methods; highly influenced)


    CITATION STATISTICS

    • 60 Highly Influenced Citations

    • Averaged 114 Citations per year from 2018 through 2020

    • 15% Increase in citations per year in 2020 over 2019

    References

    Publications referenced by this paper.
    SHOWING 1-10 OF 72 REFERENCES

    • Understanding the difficulty of training deep feedforward neural networks (highly influential)

    • Learning Sparse Neural Networks through L0 Regularization (highly influential)