Tensor Methods for Nonlinear Matrix Completion

@article{Ongie2021TensorMF,
  title={Tensor Methods for Nonlinear Matrix Completion},
  author={Greg Ongie and Laura Balzano and Daniel L. Pimentel-Alarc{\'o}n and Rebecca M. Willett and Robert D. Nowak},
  journal={SIAM J. Math. Data Sci.},
  year={2021},
  volume={3},
  pages={253-279}
}

In the low-rank matrix completion (LRMC) problem, the low-rank assumption means that the columns (or rows) of the matrix to be completed are points on a low-dimensional linear algebraic variety. This paper extends this thinking to cases where the columns are points on a low-dimensional nonlinear algebraic variety, a problem we call Low Algebraic Dimension Matrix Completion (LADMC). Matrices whose columns belong to a union of subspaces are an important special case. We propose a LADMC algorithm…
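To illustrate why the tensorized representation helps, the sketch below (a minimal numpy example, not the authors' implementation; all dimensions and names are assumptions chosen for illustration) draws columns from a union of subspaces, so the matrix itself is full rank while its degree-2 monomial lifting is low rank and thus amenable to standard LRMC machinery.

```python
# Minimal sketch of the degree-2 tensorized lifting behind LADMC
# (illustrative only; dimensions and sampling are assumptions).
import numpy as np

rng = np.random.default_rng(0)
d, r, K, per = 10, 3, 4, 15  # ambient dim, subspace dim, # subspaces, cols per subspace

# Columns drawn from a union of K r-dimensional subspaces: X is full rank,
# so the low-rank assumption of plain LRMC fails.
bases = [np.linalg.qr(rng.standard_normal((d, r)))[0] for _ in range(K)]
X = np.hstack([U @ rng.standard_normal((r, per)) for U in bases])

# Tensorized representation: map each column x to the unique entries of the
# symmetric outer product x x^T, i.e. all degree-2 monomials (dim d(d+1)/2 = 55).
iu = np.triu_indices(d)
X_lift = np.column_stack([np.outer(x, x)[iu] for x in X.T])

print("rank of X:       ", np.linalg.matrix_rank(X))       # 10 (full rank)
print("rank of lifted X:", np.linalg.matrix_rank(X_lift))  # 24 = K*r(r+1)/2 << 55
```

Because the lifted matrix is low rank relative to its ambient dimension, missing entries can in principle be imputed by running any LRMC solver on the lifted matrix and mapping back.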
Citations

Local low-rank approach to nonlinear matrix completion
TLDR
This paper proposes a new method that obtains the completed matrix by minimizing the rank of local submatrices, without transforming the target matrix, so as to achieve high estimation accuracy even when the number of columns is small.
Nonlinear matrix recovery using optimization on the Grassmann manifold
TLDR
This work investigates the problem of recovering a partially observed high-rank matrix whose columns obey a nonlinear structure, such as a union of subspaces, an algebraic variety, or a grouping into clusters, and proposes two sets of algorithms, one arising from Riemannian optimization and the other an alternating minimization scheme.
Polynomial Matrix Completion for Missing Data Imputation and Transductive Learning
TLDR
It is shown that the complete matrix of minimum intrinsic dimension can be identified by minimizing the rank of the matrix in a high-dimensional feature space; a new formulation is developed using the kernel trick together with a new relaxation of the rank objective.
Classifying and Comparing Approaches to Subspace Clustering with Missing Data
TLDR
This work organizes the existing methods for subspace clustering with missing data into five distinct families, discusses their relative strengths and weaknesses, and exposes some gaps in the current literature.
Unsigned Matrix Completion
  • Yunzhen Yao, Liangzu Peng, M. Tsakiris
  • Computer Science
  • 2021 IEEE International Symposium on Information Theory (ISIT)
  • 2021
TLDR
Under an algebraic geometry framework, this work provides fundamental results regarding unique recovery up to a class of rank-preserving sign patterns and finite completability in the problem of unsigned matrix retrieval.
Non-intrusive reduced-order models for parametric partial differential equations via data-driven operator inference
TLDR
This work formulates a new approach to reduced modeling of parameterized, time-dependent partial differential equations (PDEs) using Operator Inference, a scientific machine learning framework combining data-driven learning and physics-based modeling; the resulting reduced models can be solved rapidly to map parameter values to approximate PDE solutions.
Efficient Map Prediction via Low-Rank Matrix Completion
  • Zheng Chen, Shi Bai, Lantao Liu
  • Computer Science
  • 2021 IEEE International Conference on Robotics and Automation (ICRA)
  • 2021
TLDR
It is demonstrated that with the proposed real-time map prediction framework, the coverage convergence rate (per action step) for a set of representative coverage planning methods commonly used for environmental modeling and monitoring tasks can be significantly improved.
Learning from Comparisons and Choices
TLDR
A convex relaxation is proposed for learning the MultiNomial Logit (MNL) model, and it is shown to be minimax optimal up to a logarithmic factor by comparing its performance to a fundamental lower bound.

References

Showing 1–10 of 89 references
Low algebraic dimension matrix completion
TLDR
A new LADMC algorithm is proposed that leverages existing LRMC methods on a tensorized representation of the data and can succeed in many cases where traditional LRMC is guaranteed to fail.
Algebraic Variety Models for High-Rank Matrix Completion
TLDR
This work considers a generalization of low-rank matrix completion to the case where the data belong to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations, and proposes an efficient matrix completion algorithm that minimizes a convex or non-convex surrogate of the rank of the matrix of monomial features.
Tensor completion and low-n-rank tensor recovery via convex optimization
In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In an important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery…
High-Rank Matrix Completion and Subspace Clustering with Missing Data
TLDR
Under mild assumptions, each column of X can be perfectly recovered with high probability from an incomplete version, so long as at least CrN log^2(n) entries of X are observed uniformly at random, with C > 1 a constant depending on the usual incoherence conditions.
Online High Rank Matrix Completion
  • Jicong Fan, Madeleine Udell
  • Computer Science, Mathematics
  • 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
  • 2019
TLDR
A new model for high-rank matrix completion (HRMC) is proposed, together with batch and online methods to fit the model and an out-of-sample extension to complete new data; the approach enjoys much lower space and time complexity than previous methods for HRMC.
Polynomial Matrix Completion for Missing Data Imputation and Transductive Learning
TLDR
It is shown that the complete matrix of minimum intrinsic dimension can be identified by minimizing the rank of the matrix in a high-dimensional feature space; a new formulation is developed using the kernel trick together with a new relaxation of the rank objective.
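As a concrete (hypothetical) illustration of the kernel-trick idea in this reference, the sketch below samples points from a quadratic variety and checks that the polynomial-kernel Gram matrix is rank-deficient, so the rank of the implicit monomial feature matrix can be observed without ever forming the high-dimensional features.

```python
# Sketch of the kernel-trick view of rank in feature space (illustrative;
# the variety, kernel degree, and dimensions are assumptions, not the paper's setup).
import numpy as np

rng = np.random.default_rng(1)
d, n = 8, 50

# Points on an algebraic variety: the last coordinate satisfies
# x_d = x_1^2 + ... + x_{d-1}^2, a single quadratic equation.
Y = rng.standard_normal((d - 1, n))
X = np.vstack([Y, (Y ** 2).sum(axis=0)])

# Degree-2 polynomial kernel Gram matrix: G = F^T F for the (implicit)
# monomial feature matrix F with C(d+2, 2) = 45 rows.
G = (1.0 + X.T @ X) ** 2

# One polynomial equation imposes one linear constraint in feature space,
# so rank(G) <= 45 - 1 = 44 < n = 50.
print("n =", n, " rank of Gram matrix:", np.linalg.matrix_rank(G))
```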
A converse to low-rank matrix completion
TLDR
Conditions on X (genericity) and a deterministic condition on Ω are given that guarantee that if there is a rank-r matrix agreeing with the observed entries, then X is indeed rank-r; this condition is satisfied with high probability under uniform random sampling schemes with only O(max{r, log d}) samples per column.
Guaranteed Rank Minimization via Singular Value Projection
TLDR
Results show that the SVP-Newton method is notably robust to noise and performs impressively on a more realistic power-law sampling scheme for the matrix completion problem.
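For orientation, the plain SVP iteration referenced here is projected gradient descent with a hard rank constraint enforced by truncated SVD. The sketch below is a simplified rendering (step-size heuristic, iteration count, and initialization are assumptions; it shows the basic SVP loop, not the paper's SVP-Newton variant).

```python
# Minimal sketch of a singular value projection (SVP) iteration for matrix
# completion (illustrative; defaults here are assumptions, not the paper's).
import numpy as np

def svp_complete(M_obs, mask, r, iters=300, step=None):
    """Complete a matrix from entries where mask == 1, keeping rank(X) <= r."""
    if step is None:
        step = 1.0 / mask.mean()          # heuristic ~1/p for sampling rate p
    X = np.zeros_like(M_obs)
    for _ in range(iters):
        G = mask * (X - M_obs)            # gradient of 0.5*||mask*(X - M)||_F^2
        U, s, Vt = np.linalg.svd(X - step * G, full_matrices=False)
        X = (U[:, :r] * s[:r]) @ Vt[:r]   # hard projection onto rank-r matrices
    return X

# Example: recover a random rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(2)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = (rng.random(M.shape) < 0.6).astype(float)
X_hat = svp_complete(mask * M, mask, r=2)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```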
High-Rank Matrix Completion and Clustering under Self-Expressive Models
TLDR
This work proposes efficient algorithms for simultaneous clustering and completion of incomplete high-dimensional data that lie in a union of low-dimensional subspaces, and shows that when the data matrix is low-rank, the algorithm performs on par with or better than low-rank matrix completion methods, while for high-rank data matrices the method significantly outperforms existing algorithms.
On the Best Rank-1 Approximation of Higher-Order Supersymmetric Tensors
TLDR
It is shown that a symmetric version of the above method converges under assumptions of convexity (or concavity) for the functional induced by the tensor in question, assumptions that are very often satisfied in practical applications.
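A compact way to see the method in question is symmetric higher-order power iteration: repeatedly contract the tensor with the current iterate and renormalize. The sketch below is an illustrative implementation for a third-order symmetric tensor (function name and defaults are chosen here, not taken from the paper).

```python
# Sketch of symmetric higher-order power iteration for a best rank-1
# approximation of a symmetric 3rd-order tensor (illustrative implementation).
import numpy as np

def symmetric_power_iteration(T, iters=200, seed=0):
    """Return (lam, x) with T approx lam * (x outer x outer x) and ||x|| = 1."""
    x = np.random.default_rng(seed).standard_normal(T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        y = np.einsum('ijk,j,k->i', T, x, x)      # contract T twice with x
        x = y / np.linalg.norm(y)
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)    # Rayleigh-quotient-style value
    return lam, x
```

As the reference notes, iterations of this type are guaranteed to converge only under convexity (or concavity) assumptions on the functional induced by the tensor.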