Corpus ID: 193956

Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks

@inproceedings{Safran2017DepthWidthTI,
  title={Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks},
  author={Itay Safran and O. Shamir},
  booktitle={ICML},
  year={2017}
}
  • Itay Safran, O. Shamir
  • Published in ICML 2017
  • Computer Science, Mathematics
  • We provide several new depth-based separation results for feed-forward neural networks, proving that various types of simple and natural functions can be better approximated using deeper networks than shallower ones, even if the shallower networks are much larger. This includes indicators of balls and ellipses; non-linear functions which are radial with respect to the $L_1$ norm; and smooth non-linear functions. We also show that these gaps can be observed experimentally: Increasing the depth…
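
    The experimental claim in the abstract (that a deeper network approximates a natural target such as the indicator of the unit ball better than a shallow network of comparable size) can be probed with a short script. The sketch below is illustrative only, not the authors' code: the widths, sample sizes, and optimizer settings are assumptions, and it uses PyTorch rather than whatever framework the paper's experiments used.

        # Minimal sketch (not the authors' code): fit the indicator of the unit ball in
        # R^2 with a shallow ReLU network and a deeper one of roughly comparable size,
        # then compare test error. All widths, sample sizes, and optimizer settings are
        # illustrative assumptions.
        import torch
        import torch.nn as nn

        torch.manual_seed(0)

        def ball_indicator(x):
            # Target: 1 inside the unit Euclidean ball, 0 outside.
            return (x.norm(dim=1) <= 1.0).float().unsqueeze(1)

        def make_net(hidden_widths):
            # Fully connected ReLU network on R^2 with the given hidden-layer widths.
            layers, d = [], 2
            for w in hidden_widths:
                layers += [nn.Linear(d, w), nn.ReLU()]
                d = w
            layers.append(nn.Linear(d, 1))
            return nn.Sequential(*layers)

        def fit_and_eval(net, n_train=10_000, n_test=5_000, steps=1_000):
            x_train = torch.empty(n_train, 2).uniform_(-2.0, 2.0)
            x_test = torch.empty(n_test, 2).uniform_(-2.0, 2.0)
            y_train, y_test = ball_indicator(x_train), ball_indicator(x_test)
            opt = torch.optim.Adam(net.parameters(), lr=1e-3)
            for _ in range(steps):
                opt.zero_grad()
                loss = nn.functional.mse_loss(net(x_train), y_train)
                loss.backward()
                opt.step()
            with torch.no_grad():
                return nn.functional.mse_loss(net(x_test), y_test).item()

        # Depth 2 (one hidden layer) vs. depth 3 (two hidden layers),
        # with roughly comparable parameter counts (about 800 vs. 750).
        shallow_err = fit_and_eval(make_net([200]))
        deep_err = fit_and_eval(make_net([25, 25]))
        print(f"shallow (1 hidden layer)  test MSE: {shallow_err:.4f}")
        print(f"deep    (2 hidden layers) test MSE: {deep_err:.4f}")
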
    Citations

    A lattice-based approach to the expressivity of deep ReLU neural networks (1 citation, open access)
    Neural Networks with Small Weights and Depth-Separation Barriers (1 citation, open access)
    Function approximation by deep networks (6 citations, open access)
    Nearly-tight VC-dimension bounds for piecewise linear neural networks (87 citations, open access)
    Depth Separations in Neural Networks: What is Actually Being Separated? (5 citations, open access)
    Optimal approximation of piecewise smooth functions using deep ReLU neural networks (125 citations, highly influenced, open access)
    Is Deeper Better only when Shallow is Good? (9 citations, open access)
    Nearly-tight VC-dimension and Pseudodimension Bounds for Piecewise Linear Neural Networks (86 citations, open access)

    References

    Publications referenced by this paper (showing 1-10 of 15 references):
    The Power of Depth for Feedforward Neural Networks (380 citations, open access)
    On the complexity of shallow and deep neural network classifiers (37 citations, open access)
    Error bounds for approximations with deep ReLU networks (281 citations, open access)
    Provable approximation properties for deep neural networks (113 citations, open access)
    Why Deep Neural Networks for Function Approximation? (163 citations, highly influential, open access)
    Why Deep Neural Networks? (25 citations, open access)
    Benefits of Depth in Neural Networks (301 citations, open access)
    Deep Residual Learning for Image Recognition (49,732 citations, open access)
    Shallow vs. Deep Sum-Product Networks (229 citations, open access)
    On the Expressive Power of Deep Learning: A Tensor Analysis (215 citations)