# Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks

```bibtex
@inproceedings{Safran2017DepthWidthTI,
  title     = {Depth-Width Tradeoffs in Approximating Natural Functions with Neural Networks},
  author    = {Itay Safran and Ohad Shamir},
  booktitle = {ICML},
  year      = {2017}
}
```

We provide several new depth-based separation results for feed-forward neural networks, proving that various types of simple and natural functions can be better approximated using deeper networks than shallower ones, even if the shallower networks are much larger. This includes indicators of balls and ellipses; non-linear functions which are radial with respect to the $L_1$ norm; and smooth non-linear functions. We also show that these gaps can be observed experimentally: Increasing the depth…
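The abstract's claim that smooth non-linear functions are approximated more efficiently by deeper networks can be illustrated with the classical sawtooth construction for approximating $x^2$ on $[0,1]$. This is a standard device (Yarotsky-style) related to, but not identical to, the proofs in this paper; the function names below are my own choices for a minimal NumPy sketch:

```python
import numpy as np

def tooth(x):
    # Triangle map g: [0,1] -> [0,1]; each application costs one extra
    # hidden ReLU layer when realized as a network.
    return np.where(x < 0.5, 2.0 * x, 2.0 * (1.0 - x))

def approx_square(x, m):
    # Depth-m piecewise-linear approximation of x**2 on [0,1]:
    #   f_m(x) = x - sum_{s=1}^m g_s(x) / 4**s,
    # where g_s is the s-fold composition of the tooth map.
    g = np.asarray(x, dtype=float)
    out = np.asarray(x, dtype=float).copy()
    for s in range(1, m + 1):
        g = tooth(g)
        out = out - g / 4.0 ** s
    return out

xs = np.linspace(0.0, 1.0, 10001)
errs = [float(np.max(np.abs(approx_square(xs, m) - xs ** 2)))
        for m in range(1, 6)]
# Each extra layer of composition shrinks the uniform error by a factor of 4,
# so accuracy improves exponentially in depth but only polynomially in width.
```

Matching this accuracy with a fixed-depth network requires width growing polynomially in $1/\epsilon$, which is the flavor of depth-width separation the paper formalizes.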

#### Citations

78 citing papers; a selection:

- *A lattice-based approach to the expressivity of deep ReLU neural networks* (2019)
- *Neural Networks with Small Weights and Depth-Separation Barriers* (2020)
- *Nearly-tight VC-dimension bounds for piecewise linear neural networks* (2017)
- *Depth Separations in Neural Networks: What is Actually Being Separated?* (2019)
- *Optimal approximation of piecewise smooth functions using deep ReLU neural networks* (2018)
- *Nearly-tight VC-dimension and Pseudodimension Bounds for Piecewise Linear Neural Networks* (2019)

#### References

Showing 2 of the 15 publications referenced by this paper:

- *Provable approximation properties for deep neural networks* (2015)
- *Why Deep Neural Networks for Function Approximation?* (2017)