Relaxed leverage sampling for low-rank matrix completion

@article{Kundu2017RelaxedLS,
  title={Relaxed leverage sampling for low-rank matrix completion},
  author={Abhisek Kundu},
  journal={Inf. Process. Lett.},
  year={2017},
  volume={124},
  pages={6-9}
}
  • Abhisek Kundu
  • Published 2017
  • Mathematics, Computer Science
  • Inf. Process. Lett.
We consider the problem of exact recovery of any $m\times n$ matrix of rank $\varrho$ from a small number of observed entries via the standard nuclear norm minimization framework. Such low-rank matrices have $(m+n)\varrho - \varrho^2$ degrees of freedom. We show that any arbitrary low-rank matrix can be recovered exactly from $\Theta\left(((m+n)\varrho - \varrho^2)\log^2(m+n)\right)$ randomly sampled entries, thus matching the lower bound on the required number of entries (in terms of…
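As a concrete illustration of the nuclear norm minimization framework the abstract refers to, here is a minimal NumPy sketch using the singular value thresholding (SVT) iteration of Cai, Candès, and Shen as the solver; this is not the paper's algorithm, and the threshold `tau`, step size `delta`, and iteration count are illustrative placeholders (in practice `tau` is scaled with the matrix dimensions):

```python
import numpy as np

def complete_svt(M_obs, mask, tau=5.0, delta=1.2, n_iters=500):
    """SVT sketch for min ||X||_* subject to agreement on the observed entries."""
    Y = np.zeros_like(M_obs, dtype=float)
    X = np.zeros_like(M_obs, dtype=float)
    for _ in range(n_iters):
        # Shrink the singular values of the dual iterate by tau
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
        # Dual ascent restricted to the observed entries
        Y += delta * mask * (M_obs - X)
    return X
```

Here `mask` is a boolean array marking the observed entries and `M_obs` holds those entries with zeros elsewhere; with enough entries sampled, `X` converges to the hidden low-rank matrix.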
Network Latency Estimation With Leverage Sampling for Personal Devices: An Adaptive Tensor Completion Approach
TLDR
The results show that the proposed adaptive sampling scheme is able to not only improve the estimation accuracy of network latency but also reduce the sample budget compared to the state-of-the-art approaches.
An Adaptive Leverage Sampling Scheme for Fingerprint-based Indoor Localization
TLDR
A two-pass adaptive sampling scheme where, under a given budget of samples, a fixed proportion of samples is first allocated uniformly at random, and the remaining samples are then assigned according to the leverage scores of the underlying tensor, reducing the sample complexity of the offline site survey.
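A matrix analogue of such a two-pass scheme might look as follows (the paper operates on tensors; the rank estimate, the uniform fraction, and all names here are illustrative simplifications):

```python
import numpy as np

def two_pass_mask(M, budget, frac_uniform, rank, rng):
    """Two-pass sampling sketch: uniform seed pass, then leverage-weighted pass."""
    m, n = M.shape
    mask = np.zeros(m * n, dtype=bool)
    # Pass 1: spend a fixed fraction of the budget uniformly at random.
    seed = rng.choice(m * n, size=int(frac_uniform * budget), replace=False)
    mask[seed] = True
    # Crude rank-`rank` subspace estimate from the zero-filled observations.
    U, _, Vt = np.linalg.svd(np.where(mask.reshape(m, n), M, 0.0), full_matrices=False)
    mu = np.sum(U[:, :rank] ** 2, axis=1)     # estimated row leverage scores
    nu = np.sum(Vt[:rank, :] ** 2, axis=0)    # estimated column leverage scores
    # Pass 2: remaining budget with p_ij proportional to mu_i + nu_j.
    p = (mu[:, None] + nu[None, :]).ravel()
    p[mask] = 0.0                             # never resample an observed entry
    extra = rng.choice(m * n, size=budget - seed.size, replace=False, p=p / p.sum())
    mask[extra] = True
    return mask.reshape(m, n)
```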

References

Showing 1–10 of 16 references
Completing any low-rank matrix, provably
TLDR
It is shown that any low-rank matrix can be exactly recovered from as few as $O(nr \log^2 n)$ randomly chosen elements, provided this random choice is made according to a {\em specific biased distribution}: the probability of any element being sampled should be proportional to the sum of the leverage scores of the corresponding row and column.
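That biased distribution is easy to state in code. A small sketch, assuming the rank-$r$ SVD factors are available (in the completion setting they would have to be estimated):

```python
import numpy as np

def leverage_entry_probs(A, r):
    """Entrywise distribution with p_ij proportional to row + column leverage scores."""
    U, _, Vt = np.linalg.svd(A, full_matrices=False)
    mu = np.sum(U[:, :r] ** 2, axis=1)   # row leverage scores; they sum to r
    nu = np.sum(Vt[:r, :] ** 2, axis=0)  # column leverage scores; they sum to r
    P = mu[:, None] + nu[None, :]        # m x n array of unnormalized scores
    return P / P.sum()                   # normalize to a probability distribution
```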
Coherent Matrix Completion
TLDR
It is shown that nuclear norm minimization can recover an arbitrary $n\times n$ matrix of rank $r$ from $O(nr \log^2 n)$ revealed entries, provided that revealed entries are drawn proportionally to the local row and column coherences of the underlying matrix.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
  • E. Candès, T. Tao
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2010
TLDR
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information-theoretic limit (up to logarithmic factors).
Relative-Error CUR Matrix Decompositions
TLDR
These two algorithms are the first polynomial time algorithms for such low-rank matrix approximations that come with relative-error guarantees; previously, in some cases, it was not even known whether such matrix decompositions exist.
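A minimal sketch of a leverage-score CUR construction in this spirit (not the authors' exact algorithm; the column and row counts `c` and `r`, and sampling without replacement, are simplifying assumptions):

```python
import numpy as np

def cur_sketch(A, k, c, r, rng):
    """Sample c columns and r rows by rank-k leverage scores; U is the optimal middle factor."""
    U_full, _, Vt = np.linalg.svd(A, full_matrices=False)
    p_col = np.sum(Vt[:k, :] ** 2, axis=0) / k   # column leverage scores, normalized
    p_row = np.sum(U_full[:, :k] ** 2, axis=1) / k  # row leverage scores, normalized
    cols = rng.choice(A.shape[1], size=c, replace=False, p=p_col)
    rows = rng.choice(A.shape[0], size=r, replace=False, p=p_row)
    C, R = A[:, cols], A[rows, :]
    U_mid = np.linalg.pinv(C) @ A @ np.linalg.pinv(R)  # minimizes ||A - C U R||_F
    return C, U_mid, R
```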
Low-Rank Matrix and Tensor Completion via Adaptive Sampling
TLDR
In the absence of noise, it is shown that one can exactly recover an $n\times n$ matrix of rank $r$ from merely $\Omega(nr^{3/2}\log(r))$ matrix entries, and one can recover an order-$T$ tensor using $\Omega(nr^{T-1/2}T^2\log(r))$ entries.
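The adaptive mechanism can be caricatured in the matrix case: probe `d` entries of each column, keep the column-space estimate grown so far, and pay for a full column only when the probes fall outside it. A rough sketch, with the tolerance and all names as illustrative assumptions:

```python
import numpy as np

def adaptive_complete(M, d, rng, tol=1e-8):
    """Adaptive column sampling sketch: probe d entries per column, fully
    observe a column only when the probes leave the learned column space."""
    n, m = M.shape
    U = np.zeros((n, 0))                   # orthonormal basis learned so far
    M_hat = np.zeros_like(M, dtype=float)
    for j in range(m):
        idx = rng.choice(n, size=d, replace=False)
        x = M[idx, j]                      # the d probed entries of column j
        coef, *_ = np.linalg.lstsq(U[idx, :], x, rcond=None)
        resid = x - U[idx, :] @ coef       # probe residual off the subspace
        if np.linalg.norm(resid) > tol * max(np.linalg.norm(x), 1.0):
            col = M[:, j].astype(float)    # subspace test failed: observe fully
            new_dir = col - U @ (U.T @ col)
            U = np.hstack([U, (new_dir / np.linalg.norm(new_dir))[:, None]])
            M_hat[:, j] = col
        else:
            M_hat[:, j] = U @ coef         # reconstruct from the d probes alone
    return M_hat
```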
Recovering Low-Rank Matrices From Few Coefficients in Any Basis
  • D. Gross
  • Computer Science, Mathematics
  • IEEE Transactions on Information Theory
  • 2011
TLDR
It is shown that an unknown $n\times n$ matrix of rank $r$ can be efficiently reconstructed from only $O(nr\nu\log^2 n)$ randomly sampled expansion coefficients with respect to any given matrix basis, where the parameter $\nu$ quantifies the "degree of incoherence" between the unknown matrix and the basis.
Sparse Approximate Solutions to Linear Systems
  • B. Natarajan
  • Mathematics, Computer Science
  • SIAM J. Comput.
  • 1995
The following problem is considered: given a matrix $A$ in ${\bf R}^{m \times n}$, ($m$ rows and $n$ columns), a vector $b$ in ${\bf R}^m$, and ${\bf \epsilon} > 0$, compute a vector $x$ satisfying…
A Simpler Approach to Matrix Completion
  • B. Recht
  • Mathematics, Computer Science
  • J. Mach. Learn. Res.
  • 2011
TLDR
This paper provides the best bounds to date on the number of randomly sampled entries required to reconstruct an unknown low-rank matrix by minimizing the nuclear norm of the hidden matrix subject to agreement with the provided entries.
Exact matrix completion via convex optimization
TLDR
It is demonstrated that in very general settings, one can perfectly recover all of the missing entries from most sufficiently large subsets by solving a convex programming problem that finds the matrix with the minimum nuclear norm agreeing with the observed entries.
Robust principal component analysis?
TLDR
It is proved that under suitable assumptions it is possible to recover both the low-rank and the sparse components exactly by solving a very convenient convex program called Principal Component Pursuit, which, among all feasible decompositions, simply minimizes a weighted combination of the nuclear norm and the $\ell_1$ norm; this suggests the possibility of a principled approach to robust principal component analysis.
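A compact sketch of Principal Component Pursuit via an inexact-ALM style loop; the weight $\lambda = 1/\sqrt{\max(m,n)}$ is the paper's standard choice, while the penalty heuristic for `mu` and the iteration count are common defaults rather than anything prescribed by the paper:

```python
import numpy as np

def pcp(M, lam=None, mu=None, n_iters=300):
    """Principal Component Pursuit sketch: split M into low-rank L plus sparse S."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam    # paper's lambda
    mu = 0.25 * m * n / np.abs(M).sum() if mu is None else mu # heuristic penalty
    S = np.zeros_like(M, dtype=float)
    Y = np.zeros_like(M, dtype=float)
    for _ in range(n_iters):
        # L-update: singular value soft-thresholding at level 1/mu
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # S-update: entrywise soft-thresholding at level lambda/mu
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        Y += mu * (M - L - S)   # dual ascent on the constraint M = L + S
    return L, S
```

`L` picks up the low-rank component and `S` the sparse corruptions.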