Consistent Collective Matrix Completion under Joint Low Rank Structure
@article{Gunasekar2015ConsistentCM,
  title   = {Consistent Collective Matrix Completion under Joint Low Rank Structure},
  author  = {Suriya Gunasekar and Makoto Yamada and Dawei Yin and Yi Chang},
  journal = {ArXiv},
  year    = {2015},
  volume  = {abs/1412.2113}
}
We address the collective matrix completion problem of jointly recovering a collection of matrices with shared structure from partial (and potentially noisy) observations. To ensure well-posedness of the problem, we impose a joint low rank structure, wherein each component matrix is low rank and the latent space of the low rank factors corresponding to each entity is shared across the entire collection. We first develop a rigorous algebra for representing and manipulating collective-matrix…
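The shared latent space described in the abstract can be illustrated with a small NumPy sketch. This is a toy alternating-least-squares heuristic under illustrative sizes, sampling rates, and variable names (none of which come from the paper), not the paper's estimator or its consistency analysis: two matrices that share their row entity are completed from partial observations through a single common factor `U`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: X1 and X2 share their row entity, so a joint low
# rank structure means X1 ~ U @ V1.T and X2 ~ U @ V2.T with one shared
# latent factor U (sizes and rank are arbitrary toy choices).
n, m1, m2, r = 60, 40, 30, 3
U_true = rng.standard_normal((n, r))
X1 = U_true @ rng.standard_normal((r, m1))
X2 = U_true @ rng.standard_normal((r, m2))

# Partial observations: keep roughly 40% of the entries of each matrix.
M1 = rng.random((n, m1)) < 0.4
M2 = rng.random((n, m2)) < 0.4

# Alternating least squares on the observed entries. The update for each
# row of U pools observations from *both* matrices, which is where the
# collective (shared latent space) structure helps.
V1 = rng.standard_normal((m1, r))
V2 = rng.standard_normal((m2, r))
U = np.zeros((n, r))
for _ in range(25):
    for i in range(n):  # shared factor: fit row i against both matrices
        A = np.vstack([V1[M1[i]], V2[M2[i]]])
        b = np.concatenate([X1[i, M1[i]], X2[i, M2[i]]])
        U[i] = np.linalg.lstsq(A, b, rcond=None)[0]
    for j in range(m1):  # per-matrix factor for X1
        V1[j] = np.linalg.lstsq(U[M1[:, j]], X1[M1[:, j], j], rcond=None)[0]
    for j in range(m2):  # per-matrix factor for X2
        V2[j] = np.linalg.lstsq(U[M2[:, j]], X2[M2[:, j], j], rcond=None)[0]

# Relative error on the entries of X1 that were never observed.
hidden = ~M1
err = np.linalg.norm(hidden * (U @ V1.T - X1)) / np.linalg.norm(hidden * X1)
print(f"held-out relative error on X1: {err:.2e}")
```

On this noiseless toy instance the held-out error drops to near machine precision, because each row and column solve is heavily overdetermined once observations from both matrices are pooled.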
25 Citations
A PAC bound for joint matrix completion based on Partially Collective Matrix Factorization
- Computer Science
- 2016 23rd International Conference on Pattern Recognition (ICPR)
A first PAC generalization error bound for joint matrix completion based on the Partially Collective Matrix Factorization model is presented, which not only justifies the theoretical soundness of P-CMF, but also reveals several insights into it.
Collective Matrix Completion
- Computer Science, Mathematics
- 2021
This work considers the problem of collective matrix completion with multiple and heterogeneous matrices, whose entries may be count, binary, continuous, etc., and investigates the setting where, for each source, the matrix entries are sampled from an exponential family distribution.
Collective Matrix Completion
- Computer Science, Mathematics
- J. Mach. Learn. Res., 2019
This work investigates the setting where, for each source, the matrix entries are sampled from an exponential family distribution, as well as the distribution-free case, and proves that the proposed estimators achieve fast rates of convergence under both settings.
Robust Matrix Completion with Mixed Data Types
- Computer Science
- ArXiv, 2020
This work proposes a computationally feasible statistical approach with strong recovery guarantees, along with an algorithmic framework suited for parallelization, to recover a low-rank matrix with partially observed entries of mixed data types in one step.
Co-Regularized Collective Matrix Factorization for Joint Matrix Completion
- Computer Science
- 2016
This paper introduces a novel joint matrix completion method based on a relaxed assumption that allows the matrix structures to differ but assumes their induced subspaces lie close to each other, and proposes a method that penalizes the distance between these subspaces while learning a different factorization model for each matrix.
Partial Collective Matrix Factorization and its PAC Bound
- Computer Science
- ISAIM, 2016
This paper lifts a prior solution to the theoretical level and formalizes an assumption-free factorization model called partial collective matrix factorization (pCMF), based on the fact that any two matrices (sharing the same rows) admit some joint factorization in which their factors are partially shared.
Towards a Theoretical Understanding of Negative Transfer in Collective Matrix Factorization
- Computer Science
- UAI, 2016
Under the statistical minimax framework, lower bounds for the CMF estimator are derived, and two insights are gained suggesting that negative transfer may be more effectively addressed via model construction rather than model selection.
Convex Coupled Matrix and Tensor Completion
- Computer Science
- Neural Computation, 2018
A set of convex low-rank inducing norms for coupled matrices and tensors is proposed, which can be used to find a globally optimal solution, whereas existing methods for coupled learning are nonconvex.
Convex Factorization Machine for Regression
- Computer Science
- ArXiv, 2015
The convex factorization machine (CFM) is proposed, which is a convex variant of the widely used Factorization Machines (FMs), and it is shown that CFM outperforms a state-of-the-art tensor factorization method in a toxicogenomics prediction task.
Convex Factorization Machine for Toxicogenomics Prediction
- Computer Science
- KDD, 2017
It is shown in a toxicogenomics prediction task that CFM predicts the toxic outcomes of a collection of drugs better than a state-of-the-art tensor factorization method.
References
Showing 1–10 of 45 references
Exponential Family Matrix Completion under Structural Constraints
- Computer Science
- ICML, 2014
A unified framework for generalized matrix completion is provided by considering a setting wherein the matrix entries are sampled from any member of the rich family of exponential family distributions, and by imposing general structural constraints on the underlying matrix, as captured by a general regularizer R(·).
Coherent Matrix Completion
- Computer Science
- ICML, 2014
It is shown that nuclear norm minimization can recover an arbitrary n×n matrix of rank r from O(nr log²(n)) revealed entries, provided that revealed entries are drawn proportionally to the local row and column coherences of the underlying matrix.
The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Computer Science
- IEEE Transactions on Information Theory, 2010
This paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information theoretic limit (up to logarithmic factors).
Low-rank matrix completion using alternating minimization
- Computer Science
- STOC '13, 2013
This paper presents one of the first theoretical analyses of the performance of alternating minimization for matrix completion, and the related problem of matrix sensing, and shows that alternating minimization guarantees faster convergence to the true matrix, while allowing a significantly simpler analysis.
Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Computer Science, Mathematics
- J. Mach. Learn. Res., 2012
The matrix completion problem under a form of row/column weighted entrywise sampling is considered, including uniform entrywise sampling as a special case, and it is proved that, with high probability, the sampling operator satisfies a form of restricted strong convexity with respect to a weighted Frobenius norm.
Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Computer Science, Mathematics
- SIAM Rev., 2010
It is shown that if a certain restricted isometry property holds for the linear transformation defining the constraints, the minimum-rank solution can be recovered by solving a convex optimization problem, namely, the minimization of the nuclear norm over the given affine space.
Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Computer Science
- IEEE Transactions on Information Theory, 2011
It is shown that an unknown low-rank matrix can be efficiently reconstructed from only a small number of randomly sampled expansion coefficients with respect to any given matrix basis, where the required number of coefficients depends on a parameter quantifying the “degree of incoherence” between the unknown matrix and the basis.
A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers
- Computer Science, Mathematics
- NIPS, 2009
A unified framework for establishing consistency and convergence rates for regularized M-estimators under high-dimensional scaling is provided; one main theorem is stated, and it is shown how it can be used to re-derive several existing results and to obtain several new ones.
Exact matrix completion via convex optimization
- Computer Science
- CACM, 2012
It is demonstrated that in very general settings, one can perfectly recover all of the missing entries from most sufficiently large subsets by solving a convex programming problem that finds the matrix with the minimum nuclear norm agreeing with the observed entries.
Matrix Completion With Noise
- Computer Science
- Proceedings of the IEEE, 2010
This paper surveys the recent literature on matrix completion and presents novel results showing that matrix completion is provably accurate even when the few observed entries are corrupted with a small amount of noise, and that, in practice, nuclear-norm minimization accurately fills in the many missing entries of large low-rank matrices from just a few noisy samples.