Memory-sample tradeoffs for linear regression with small error

@inproceedings{Sharan2019MemorysampleTF,
  title={Memory-sample tradeoffs for linear regression with small error},
  author={Vatsal Sharan and Aaron Sidford and Gregory Valiant},
  booktitle={Proceedings of the 51st Annual ACM SIGACT Symposium on Theory of Computing (STOC)},
  year={2019}
}
  • Abstract: We consider the problem of performing linear regression over a stream of d-dimensional examples, and show that any algorithm that uses a subquadratic amount of memory exhibits a slower rate of convergence than can be achieved without memory constraints. Specifically, consider a sequence of labeled examples (a_1, b_1), (a_2, b_2), …, with a_i drawn independently from a d-dimensional isotropic Gaussian, and where b_i = ⟨a_i, x⟩ + η_i, for a fixed x ∈ ℝ^d with ‖x‖_2 = 1 and with independent noise η_i drawn…
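The setup in the abstract is easy to make concrete. The sketch below (not the paper's lower-bound construction, just an illustration of the data model) generates a stream of examples a_i from a d-dimensional isotropic Gaussian with labels b_i = ⟨a_i, x⟩ + η_i, and runs plain online SGD, which keeps only O(d) memory; the values of d, noise_std, and the step-size schedule are hypothetical demo parameters.

```python
import math
import random

rng = random.Random(0)
d = 10

# Fixed ground-truth vector x with ||x||_2 = 1, as in the abstract.
x_true = [rng.gauss(0.0, 1.0) for _ in range(d)]
norm = math.sqrt(sum(v * v for v in x_true))
x_true = [v / norm for v in x_true]
noise_std = 0.01

x_hat = [0.0] * d                      # the O(d)-memory state of the learner
for i in range(1, 20001):
    # One streamed example: a_i ~ N(0, I_d), b_i = <a_i, x> + eta_i.
    a = [rng.gauss(0.0, 1.0) for _ in range(d)]
    b = sum(aj * xj for aj, xj in zip(a, x_true)) + rng.gauss(0.0, noise_std)
    residual = sum(aj * xj for aj, xj in zip(a, x_hat)) - b
    lr = 1.0 / (i + d)                 # decaying step size (demo choice)
    x_hat = [xj - lr * residual * aj for xj, aj in zip(x_hat, a)]

error = math.sqrt(sum((u - v) ** 2 for u, v in zip(x_hat, x_true)))
print(f"estimation error ||x_hat - x||_2 = {error:.4f}")
```

The paper's point is about what such O(d)-memory streaming learners cannot do: with subquadratic memory, the convergence rate is provably slower than the memory-unconstrained optimum.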
