# Memory-sample tradeoffs for linear regression with small error

```
@article{Sharan2019MemorysampleTF,
  title   = {Memory-sample tradeoffs for linear regression with small error},
  author  = {Vatsal Sharan and Aaron Sidford and G. Valiant},
  journal = {Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing},
  year    = {2019}
}
```

We consider the problem of performing linear regression over a stream of d-dimensional examples, and show that any algorithm that uses a subquadratic amount of memory exhibits a slower rate of convergence than can be achieved without memory constraints. Specifically, consider a sequence of labeled examples (a_1, b_1), (a_2, b_2), …, with a_i drawn independently from a d-dimensional isotropic Gaussian, and where b_i = ⟨a_i, x⟩ + η_i, for a fixed x ∈ ℝ^d with ||x||_2 = 1 and with independent noise η_i drawn…
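The data model in the abstract can be sketched in a few lines of NumPy. This is an illustration of the streaming setup only, not the paper's algorithm or lower-bound construction: the dimension `d`, noise scale, step size, and the choice of plain SGD as a stand-in for "an O(d)-memory algorithm" are all assumptions made here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 20
x_true = rng.standard_normal(d)
x_true /= np.linalg.norm(x_true)   # ||x||_2 = 1, as in the abstract

x_hat = np.zeros(d)                # O(d) memory: a single running estimate
step = 0.02                        # assumed constant step size
noise_std = 0.01                   # assumed noise scale (the abstract's eta_i)

for _ in range(5000):
    a = rng.standard_normal(d)                          # a_i ~ isotropic Gaussian
    b = a @ x_true + noise_std * rng.standard_normal()  # b_i = <a_i, x> + eta_i
    x_hat += step * (b - a @ x_hat) * a                 # SGD step on squared loss

err = np.linalg.norm(x_hat - x_true)
```

With enough samples, an estimator like this does converge, but only at the slower rate the paper shows is forced on any subquadratic-memory algorithm; an unconstrained algorithm could store and solve the full system instead.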


#### 7 Citations

Binary Hypothesis Testing with Deterministic Finite-Memory Decision Rules

- Computer Science, Mathematics
- 2020

Towards a combinatorial characterization of bounded memory learning

- Mathematics, Computer Science
- 2020

Time-Space Tradeoffs for Distinguishing Distributions and Applications to Security of Goldreich's PRG

- Mathematics, Computer Science
- 2020


#### References

##### Publications referenced by this paper.


Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning

- Mathematics, Computer Science
- 2016


A Randomized Solver for Linear Systems with Exponential Convergence

- Computer Science, Mathematics
- 2006
