On the expressive power of deep neural networks

@inproceedings{Raghu2016OnTE,
  title={On the expressive power of deep neural networks},
  author={Maithra Raghu and Ben Poole and Jon M. Kleinberg and Surya Ganguli and Jascha Sohl-Dickstein},
  booktitle={ICML},
  year={2016}
}
We study the expressive power of deep neural networks before and after training. Considering neural nets after random initialization, we show that three natural measures of expressivity all display an exponential dependence on the depth of the network. We prove, theoretically and experimentally, that all of these measures are in fact related to a fourth quantity, trajectory length. This quantity grows exponentially in the depth of the network, and is responsible for the depth sensitivity…
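The trajectory-length phenomenon the abstract describes can be illustrated with a small numerical sketch: push a circular input trajectory through a randomly initialized ReLU network and measure its arc length after each layer. This is an illustrative experiment under assumed settings (function names, width, depth, and the weight scale `sigma_w` are choices made here, not taken from the paper):

```python
import numpy as np

def trajectory_length(points):
    """Arc length of the piecewise-linear curve through the rows of `points`."""
    return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

def layerwise_lengths(points, depth=6, width=100, sigma_w=2.0, seed=0):
    """Propagate a trajectory through a random ReLU net, recording its
    length after each hidden layer. Weights are drawn i.i.d. from
    N(0, sigma_w^2 / fan_in); hyperparameters are illustrative choices."""
    rng = np.random.default_rng(seed)
    lengths = [trajectory_length(points)]
    h = points
    for _ in range(depth):
        fan_in = h.shape[1]
        W = rng.normal(0.0, sigma_w / np.sqrt(fan_in), size=(fan_in, width))
        h = np.maximum(h @ W, 0.0)  # ReLU nonlinearity
        lengths.append(trajectory_length(h))
    return lengths

# Input trajectory: a unit circle embedded in a 2-D subspace of R^100.
t = np.linspace(0.0, 2.0 * np.pi, 512)
circle = np.zeros((t.size, 100))
circle[:, 0], circle[:, 1] = np.cos(t), np.sin(t)

lengths = layerwise_lengths(circle)
```

With a weight scale above the critical value, the recorded lengths grow roughly geometrically with depth, matching the exponential depth dependence the abstract claims.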

Citations

Publications citing this paper.
A selection of the 166 citing publications:

Deep ReLU Networks Have Surprisingly Few Activation Patterns

Learning ReLU Networks on Linearly Separable Data: Algorithm, Optimality, and Generalization

  • IEEE Transactions on Signal Processing, 2019

Towards Robust, Locally Linear Deep Networks

Bounding and Counting Linear Regions of Deep Neural Networks

The Upper Bound on Knots in Neural Networks

  • ArXiv, 2016
Complexity of Linear Regions in Deep Networks

EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks

CITATION STATISTICS

  • 19 highly influenced citations

  • An average of 52 citations per year from 2017 through 2019