Low rank approximation and regression in input sparsity time

@inproceedings{clarkson2013low,
  title={Low rank approximation and regression in input sparsity time},
  author={K. Clarkson and D. Woodruff},
  booktitle={STOC '13},
  year={2013}
}

  • K. Clarkson, D. Woodruff
  • Published in STOC 2013
  • We design a new distribution over poly(r ε<sup>-1</sup>) × n matrices S so that for any fixed n × d matrix A of rank r, with probability at least 9/10, ‖SAx‖<sub>2</sub> = (1 ± ε)‖Ax‖<sub>2</sub> simultaneously for all x ∈ R<sup>d</sup>. Such a matrix S is called a <i>subspace embedding</i>. Furthermore, SA can be computed in O(nnz(A)) + Õ(r<sup>2</sup>ε<sup>-2</sup>) time, where nnz(A) is the number of non-zero entries of A. This improves over all previous subspace embeddings, which required at…
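The O(nnz(A)) running time comes from applying a very sparse sketching matrix S: each row of A is hashed to a single row of SA with a random sign, so computing SA takes one pass over the non-zero entries. The following is a minimal illustrative sketch of this kind of CountSketch-style embedding (the function name, the embedding dimension k, and the dense-matrix representation are choices made here for clarity, not the paper's exact construction):

```python
import numpy as np

def countsketch_embed(A, k, rng):
    """Apply a CountSketch-style sparse embedding S (k x n) to A in O(nnz(A)) time.

    Row i of A is sent to row h(i) of SA with a random sign s(i), so S has
    exactly one non-zero per column. Illustrative sketch only.
    """
    n, d = A.shape
    h = rng.integers(0, k, size=n)       # hash each input row to a target row
    s = rng.choice([-1.0, 1.0], size=n)  # independent random sign per input row
    SA = np.zeros((k, d))
    for i in range(n):                   # one pass over the rows (non-zeros) of A
        SA[h[i]] += s[i] * A[i]
    return SA, h, s
```

With the embedding dimension k chosen as poly(r ε<sup>-1</sup>), sketches of this form preserve ‖Ax‖<sub>2</sub> up to a (1 ± ε) factor for all x simultaneously with constant probability, which is the subspace-embedding property the abstract states.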
    256 Citations
    OSNAP: Faster Numerical Linear Algebra Algorithms via Sparser Subspace Embeddings
    • J. Nelson, Huy L. Nguyen
    • 2013 IEEE 54th Annual Symposium on Foundations of Computer Science
    Low-Rank PSD Approximation in Input-Sparsity Time
    Subspace Embeddings and \(\ell_p\)-Regression Using Exponential Random Variables
    Tighter Low-rank Approximation via Sampling the Leveraged Element
    Tight Bounds for \(\ell_p\) Oblivious Subspace Embeddings
    Empirical Performance of Approximate Algorithms for Low Rank Approximation
    Optimal CUR matrix decompositions
    Weighted low rank approximations with provable guarantees