# Fast Learning Requires Good Memory

@article{Raz2019FastLR,
  title   = {Fast Learning Requires Good Memory},
  author  = {Ran Raz},
  journal = {Journal of the ACM (JACM)},
  year    = {2019},
  volume  = {66},
  pages   = {1--18}
}

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt et al. (2016) and shows that for some learning problems, a large storage space is crucial. More formally, in the problem of parity learning, an unknown string x ∈ {0,1}^n is chosen uniformly at random. A learner tries to learn x from a stream of samples (a_1, b_1), (a_2, b_2), …, where each a_t is uniformly distributed over {0…
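For intuition, the memory-rich end of this trade-off can be simulated directly. The sketch below (a hypothetical demo, not the paper's construction; all names are illustrative) has a learner store roughly n linearly independent samples, about Θ(n²) bits of memory, and recover x by Gaussian elimination over GF(2):

```python
import random

def inner_mod2(a, x):
    # Inner product over GF(2) of bitmask-encoded vectors a and x.
    return bin(a & x).count("1") % 2

def learn_parity(n, seed=0):
    rng = random.Random(seed)
    x = rng.getrandbits(n)  # the hidden string x in {0,1}^n

    # Memory-rich learner: keep up to n reduced rows (Theta(n^2) bits).
    basis = {}  # pivot bit -> (reduced row, right-hand side)
    samples = 0
    while len(basis) < n:
        a = rng.getrandbits(n)   # uniform sample a_t
        b = inner_mod2(a, x)     # label b_t = <a_t, x> mod 2
        samples += 1
        row, rhs = a, b
        while row:
            p = row.bit_length() - 1     # highest set bit of the row
            if p in basis:               # eliminate against stored pivot
                r, rb = basis[p]
                row ^= r
                rhs ^= rb
            else:                        # new pivot: store the row
                basis[p] = (row, rhs)
                break

    # Back-substitute in increasing pivot order to recover x.
    guess = 0
    for p in sorted(basis):
        r, rb = basis[p]
        bit = rb ^ inner_mod2(r & ((1 << p) - 1), guess)
        if bit:
            guess |= 1 << p
    return x, guess, samples
```

With quadratic memory the learner needs only O(n) samples; the theorem says that with substantially less memory, no strategy can avoid an exponential blow-up in samples.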

## 3 Citations

A Time-Space Lower Bound for a Large Class of Learning Problems

- Mathematics, Computer Science · 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS)
- 2017

We prove a general time-space lower bound that applies for a large class of learning problems and shows that for every problem in that class, any learning algorithm requires either a memory of…

Tight Time-Space Lower Bounds for Finding Multiple Collision Pairs and Their Applications

- Computer Science · IACR Cryptol. ePrint Arch.
- 2020

We consider a collision search problem (CSP), where given a parameter C, the goal is to find C collision pairs in a random function f : [N] → [N] (where [N] = {0, 1, …, N−1})…
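As a point of reference, a memory-heavy baseline for this problem (a toy sketch with hypothetical names; f is simulated as a uniformly random table) simply stores every evaluation and reads collision pairs off a hash table:

```python
import random

def find_collision_pairs(N, C, seed=0):
    """Toy baseline: store all evaluations of f and collect C collision pairs.

    Uses Theta(N) memory; the cited work studies what happens when far
    less memory (or communication) is available.
    """
    rng = random.Random(seed)
    f = [rng.randrange(N) for _ in range(N)]  # random function f: [N] -> [N]

    seen = {}    # f-value -> one previously seen preimage
    pairs = []
    for x in range(N):
        y = f[x]
        if y in seen:
            pairs.append((seen[y], x))  # f(seen[y]) == f(x), distinct inputs
            if len(pairs) == C:
                break
        else:
            seen[y] = x
    return f, pairs
```

The trade-off in the paper quantifies how much more expensive collision finding becomes when a full table like `seen` cannot be kept.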

When is memorization of irrelevant training data necessary for high-accuracy learning?

- Computer Science · STOC
- 2021

This paper describes natural prediction problems in which every sufficiently accurate training algorithm must encode essentially all the information about a large subset of its training examples, which remains true even when the examples are high-dimensional and have entropy much higher than the sample size.

## References

Showing 1–10 of 28 references

Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

- Computer Science, Mathematics · 2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)
- 2016

It is proved that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples; as an application, an encryption scheme requiring a private key of length n, with time complexity n per encryption/decryption of each bit, is proven unconditionally secure.

A Time-Space Lower Bound for a Large Class of Learning Problems

- Mathematics, Computer Science · 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS)
- 2017

We prove a general time-space lower bound that applies for a large class of learning problems and shows that for every problem in that class, any learning algorithm requires either a memory of…

Time-space hardness of learning sparse parities

- Computer Science, Mathematics · Electron. Colloquium Comput. Complex.
- 2016

It is shown that the class of all sparse parities of Hamming weight ℓ is time-space hard, as long as ℓ ≥ ω(log n / log log n).

Extractor-based time-space lower bounds for learning

- Computer Science, Mathematics · Electron. Colloquium Comput. Complex.
- 2017

This work shows that for a large class of learning problems, any learning algorithm requires either a memory of size at least Ω(k · ℓ) or at least 2^Ω(r) samples; in particular, it achieves a tight Ω((log |X|) · (log |A|)) lower bound on the size of the memory.

Mixing Implies Lower Bounds for Space Bounded Learning

- Mathematics · COLT
- 2017

It is shown that if a hypothesis class H, viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H| examples under a certain bound on the memory; this implies that most hypothesis classes are unlearnable with bounded memory.

Detecting Correlations with Little Memory and Communication

- Computer Science · COLT
- 2018

A tight trade-off between the memory/communication complexity and the sample complexity is proved, implying (for example) that to detect pairwise correlations with optimal sample complexity, the number of required memory/communication bits is at least quadratic in the dimension.

Time-Space Tradeoffs for Learning Finite Functions from Random Evaluations, with Applications to Polynomials

- Computer Science, Mathematics · Electron. Colloquium Comput. Complex.
- 2018

The time-space complexity of learning from random evaluations is reduced to the question of how much the corresponding evaluation matrix amplifies the 2-norms of “almost uniform” probability distributions, and bounds for learning polynomials over finite fields are derived.

Entropy Samplers and Strong Generic Lower Bounds For Space Bounded Learning

- Computer Science, Mathematics · ITCS
- 2018

It is proved that any hypothesis class whose hypotheses graph is mixing cannot be learned using fewer than Ω(log² |H|) memory bits unless the learner uses a large number of labeled examples.

Memory, Communication, and Statistical Queries

- Computer Science · COLT
- 2015

A formal framework for studying the relationship between the fundamental resources of memory or communication and the sample complexity of the learning task is introduced, and strong lower bounds on learning parity functions with bounded communication are shown.

Time-space trade-off lower bounds for randomized computation of decision problems

- Computer Science · JACM
- 2003

We prove the first time-space lower bound trade-offs for randomized computation of decision problems. The bounds hold even in the case that the computation is allowed to have arbitrary probability of…