# Mixing Complexity and its Applications to Neural Networks

@article{Moshkovitz2017MixingCA, title={Mixing Complexity and its Applications to Neural Networks}, author={M. Moshkovitz and Naftali Tishby}, journal={ArXiv}, year={2017}, volume={abs/1703.00729} }

We suggest analyzing neural networks through the prism of space constraints. We observe that most training algorithms applied in practice use bounded memory, which enables us to apply a new notion, introduced in the study of space-time tradeoffs, that we call mixing complexity. This notion was devised to measure the (in)ability to learn using a bounded-memory algorithm. In this paper we describe how we use mixing complexity to obtain new results on what can and cannot be learned using…

