Corpus ID: 51907057

Mixture Matrix Completion

  • Daniel L. Pimentel-Alarcón
Completing a data matrix X has become a ubiquitous problem in modern data science, with motivations in recommender systems, computer vision, and network inference, to name a few. One typical assumption is that X is low-rank. A more general model assumes that each column of X corresponds to one of several low-rank matrices. This paper generalizes these models to what we call mixture matrix completion (MMC): the case where each entry of X corresponds to one of several low-rank matrices. MMC is…
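The MMC generative model described above can be sketched in a few lines; the dimensions, rank, number of components, and sampling rate below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, r, K = 50, 60, 3, 2  # rows, columns, rank, number of mixture components

# K hidden low-rank matrices X^(1), ..., X^(K)
components = [rng.standard_normal((d, r)) @ rng.standard_normal((r, n))
              for _ in range(K)]

# Each ENTRY (i, j) is drawn from one component -- this is what makes MMC
# harder than column-wise mixtures, where entire columns share a component.
labels = rng.integers(K, size=(d, n))
X = np.choose(labels, components)

# Observe each entry independently with probability p; unobserved entries
# are marked missing.
p = 0.5
mask = rng.random((d, n)) < p
X_obs = np.where(mask, X, np.nan)
```

The recovery task is then to infer both the entry labels and the K low-rank matrices from `X_obs` alone.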
Learning Mixtures of Low-Rank Models
We study the problem of learning mixtures of low-rank models, i.e. reconstructing multiple low-rank matrices from unlabelled linear measurements of each. This problem enriches two widely studied…
Motivated by metagenomics, recommender systems, dictionary learning, and related problems, this paper introduces subspace splitting (SS): the task of clustering the entries of what we call a…
Recovery of Subspace Structure from High-Rank Data with Missing Entries
This work proposes a method to reconstruct and cluster incomplete high-dimensional data lying in a union of low-dimensional subspaces, exploring the sparse representation model and proposing an algorithm robust to initialization.
Finiteness of fibers in matrix completion via Plücker coordinates
Let $\Omega \subseteq \{1,\dots,m\} \times \{1,\dots,n\}$. We consider fibers of coordinate projections $\pi_\Omega : \mathscr{M}_k(r, m \times n) \rightarrow k^{\#\Omega}$ from the algebraic variety…
Learning Mixtures of Low-Rank Models
This work develops a three-stage meta-algorithm that is guaranteed to recover the unknown matrices with near-optimal sample and computational complexities under Gaussian designs and is provably stable against random noise.
Target Localization With Jammer Removal Using Frequency Diverse Array
A novel "low-rank + low-rank + sparse" decomposition model is employed to extract the low-rank desired signal and suppress the jamming signals from both barrage and burst jammers in a mixed jamming signal model.


A converse to low-rank matrix completion
This work gives conditions on X (genericity) and a deterministic condition on Ω guaranteeing that if there is a rank-r matrix that agrees with the observed entries, then X is indeed rank r; this condition is satisfied with high probability under uniform random sampling schemes with only O(max{r, log d}) samples per column.
High-Rank Matrix Completion and Subspace Clustering with Missing Data
Under mild assumptions, each column of X can be perfectly recovered with high probability from an incomplete version, so long as at least C r N log^2(n) entries of X are observed uniformly at random, with C > 1 a constant depending on the usual incoherence conditions.
High-Rank Matrix Completion and Clustering under Self-Expressive Models
This work proposes efficient algorithms for simultaneous clustering and completion of incomplete high-dimensional data that lie in a union of low-dimensional subspaces, and shows that when the data matrix is low-rank, the algorithm performs on par with or better than low-rank matrix completion methods, while for high-rank data matrices the method significantly outperforms existing algorithms.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
  • E. Candès, T. Tao
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2010
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors).
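The convex program in question is nuclear norm minimization. As an illustration of this family of methods (not Candès and Tao's own analysis), the singular value thresholding iteration of Cai, Candès, and Shen can be sketched as follows; the matrix size, threshold, and step size are toy choices following the authors' heuristics:

```python
import numpy as np

def svt_complete(X_obs, mask, tau, step, iters=200):
    """Singular value thresholding (SVT) for nuclear-norm matrix completion.

    Approximately solves min tau*||X||_* + 0.5*||X||_F^2 subject to matching
    the observed entries -- a standard proxy for nuclear norm minimization."""
    Y = np.zeros_like(X_obs)
    for _ in range(iters):
        # Shrink the singular values of the dual iterate by tau.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
        # Dual ascent step on the observed entries only.
        Y += step * mask * (X_obs - X)
    return X

# Toy example: a 40x40 rank-2 matrix with roughly half the entries observed.
rng = np.random.default_rng(1)
m, n, p = 40, 40, 0.5
M = rng.standard_normal((m, 2)) @ rng.standard_normal((2, n))
mask = rng.random((m, n)) < p
M_hat = svt_complete(mask * M, mask, tau=5 * np.sqrt(m * n), step=1.2 / p)
```

The threshold `tau = 5*sqrt(m*n)` and step `1.2/p` follow the heuristics suggested by Cai et al.; the iteration touches only the observed entries, so each step costs one thin SVD.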
Low-rank matrix completion using alternating minimization
This paper presents one of the first theoretical analyses of the performance of alternating minimization for matrix completion and the related problem of matrix sensing, and shows that alternating minimization guarantees faster convergence to the true matrix while allowing a significantly simpler analysis.
Coherent Matrix Completion
It is shown that nuclear norm minimization can recover an arbitrary n×n matrix of rank r from O(nr log^2(n)) revealed entries, provided that revealed entries are drawn proportionally to the local row and column coherences of the underlying matrix.
A characterization of deterministic sampling patterns for low-rank matrix completion
A characterization of finitely completable observation sets is used to derive sufficient deterministic sampling conditions for unique completability, and it is shown that under uniform random sampling schemes these conditions are satisfied with high probability if O(max{r, log d}) entries per column are observed.
Incoherence-Optimal Matrix Completion
  • Yudong Chen
  • Mathematics, Computer Science
  • IEEE Transactions on Information Theory
  • 2015
The results show that the standard and joint incoherence conditions are associated, respectively, with the information (statistical) and computational aspects of the matrix decomposition problem.
Algebraic Variety Models for High-Rank Matrix Completion
This work considers a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e. each data point is a solution to a system of polynomial equations, and proposes an efficient matrix completion algorithm that minimizes a convex or non-convex surrogate of the rank of the matrix of monomial features.
Fast Convex Optimization Algorithms for Exact Recovery of a Corrupted Low-Rank Matrix
This paper studies algorithms for solving the problem of recovering a low-rank matrix with a fraction of its entries arbitrarily corrupted. This problem can be viewed as a robust version of classical…