Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

@article{Zhao2015BayesianCF,
  title={Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination},
  author={Qibin Zhao and Liqing Zhang and Andrzej Cichocki},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2015},
  volume={37},
  pages={1751-1763}
}
CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powerful technique for tensor completion because it explicitly captures the multilinear latent factors. Existing CP algorithms require the tensor rank to be specified manually; however, determining the tensor rank remains a challenging problem, especially for the CP rank. In addition, existing approaches account for neither the uncertainty of the latent factors nor that of the missing entries. To address these issues…
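The abstract only gestures at the mechanism, but the effect of automatic rank determination can be sketched in a few dozen lines. The toy NumPy snippet below (all names hypothetical; this is not the paper's variational Bayesian algorithm) alternates EM-style imputation of the missing entries with an ALS sweep, then prunes factor columns whose energy collapses; the threshold is a crude stand-in for the sparsity-inducing priors that drive redundant components to zero in the actual model.

import numpy as np

def khatri_rao(U, V):
    # Column-wise Kronecker product: (I x R) and (J x R) -> (I*J x R).
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def ls_update(unfolded, U, V, lam):
    # Ridge-regularized least-squares factor update: unfolded @ KR @ inv(G).
    KR = khatri_rao(U, V)
    G = (U.T @ U) * (V.T @ V) + lam * np.eye(U.shape[1])
    return np.linalg.solve(G, KR.T @ unfolded.T).T

def cp_complete(X, mask, rank=10, iters=200, prune_tol=1e-3, lam=1e-6, seed=0):
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((d, rank)) for d in (I, J, K))
    Xobs = np.where(mask, X, 0.0)
    for _ in range(iters):
        # Impute missing entries with the current reconstruction.
        Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
        Xc = np.where(mask, Xobs, Xhat)
        # One ALS sweep over the three factor matrices.
        A = ls_update(Xc.reshape(I, -1), B, C, lam)
        B = ls_update(Xc.transpose(1, 0, 2).reshape(J, -1), A, C, lam)
        C = ls_update(Xc.transpose(2, 0, 1).reshape(K, -1), A, B, lam)
        # Heuristic rank determination: drop components with collapsed energy.
        energy = (np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=0)
                  * np.linalg.norm(C, axis=0))
        keep = energy > prune_tol * energy.max()
        A, B, C = A[:, keep], B[:, keep], C[:, keep]
    return A, B, C

Starting from an overestimated rank, the pruning shrinks the model toward the underlying CP rank; the Bayesian treatment in the paper achieves this in a principled way (per-component prior precisions grow, shrinking redundant components) and, unlike this sketch, also quantifies uncertainty over the imputed entries.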
Tensor Rank Estimation and Completion via CP-based Nuclear Norm
TLDR
Tensor Rank Estimation based on $L_1$-regularized orthogonal CP decomposition (TREL1) is proposed, which incorporates a CP-based tensor nuclear norm regularizer into the reconstruction-error minimization of tensor completion (TC) to automatically determine the rank of an incomplete tensor.
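To make the shrinkage mechanism concrete: with unit-norm factor columns, the vector of CP component weights acts as the CP-based tensor nuclear norm, and its $L_1$ penalty is minimized by elementwise soft-thresholding, which drives redundant components to exactly zero. A minimal, hypothetical NumPy sketch of that single step (not the full TREL1 algorithm):

import numpy as np

def soft_threshold(w, tau):
    # Proximal operator of tau * ||w||_1: shrinks weights toward zero.
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

weights = np.array([5.2, 3.1, 0.4, 0.05, 0.02])   # CP component magnitudes
shrunk = soft_threshold(weights, tau=0.1)
print("estimated CP rank:", np.count_nonzero(shrunk))  # -> 3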
Towards Probabilistic Tensor Canonical Polyadic Decomposition 2.0: Automatic Tensor Rank Learning Using Generalized Hyperbolic Prior
TLDR
A more advanced generalized hyperbolic (GH) prior is introduced to the probabilistic CPD model, which not only includes the Gaussian-gamma model as a special case but also provides more flexibility to adapt to different levels of sparsity.
Bayesian Nonparametric Tensor Completion
In this paper, we propose a Bayesian nonparametric method to estimate missing data in tensors. The proposed method uses a Tucker-1 factorization to learn a smaller core tensor and a factor matrix via…
Scalable Bayesian Low-Rank Decomposition of Incomplete Multiway Tensors
TLDR
A scalable Bayesian framework for low-rank decomposition of multiway tensor data with missing observations, which outperforms several state-of-the-art tensor decomposition methods on various synthetic and benchmark real-world datasets.
Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion
TLDR
A class of probabilistic generative Tucker models for tensor decomposition and completion with structural sparsity over the multilinear latent space is introduced, along with two group-sparsity-inducing priors based on hierarchical representations of the Laplace and Student-t distributions, which facilitate full posterior inference.
General-Purpose Bayesian Tensor Learning With Automatic Rank Determination and Uncertainty Quantification
  • Kaiqi Zhang, Cole Hawkins, Zheng Zhang
  • Medicine
    Frontiers in Artificial Intelligence
  • 2021
A major challenge in many machine learning tasks is that the model expressive power depends on model size. Low-rank tensor methods are an efficient tool for handling the curse of dimensionality in…
Low-Rank Tensor Completion: A Pseudo-Bayesian Learning Approach
  • Wei Chen, Nan Song
  • Computer Science
    2017 IEEE International Conference on Computer Vision (ICCV)
  • 2017
TLDR
This paper proposes a pseudo-Bayesian approach, where a Bayesian-inspired cost function is adjusted using appropriate approximations that lead to desirable attributes including concavity and symmetry, and proves the ability to recover the true tensor with a low multilinear rank.
Bayesian Low Rank Tensor Ring Model for Image Completion
TLDR
Numerical experiments show that the proposed approach outperforms state-of-the-art methods, especially in terms of recovery accuracy, and that the TR ranks can be obtained by Bayesian inference.
Probabilistic Tensor Canonical Polyadic Decomposition With Orthogonal Factors
TLDR
A novel tensor CPD algorithm based on the probabilistic inference framework is devised, from which an inference algorithm is derived that alternately estimates the factor matrices, recovers the tensor rank, and mitigates outliers.
A Fused CP Factorization Method for Incomplete Tensors
TLDR
A modified CP tensor factorization framework is proposed that fuses the norm constraint, sparseness, manifold, and smoothness information simultaneously; it reveals the characteristics of commonly used regularizations for tensor completion in a certain sense and gives experimental guidance on how to use them.

References

SHOWING 1-10 OF 61 REFERENCES
Simultaneous Tensor Decomposition and Completion Using Factor Priors
TLDR
This paper proposes a method called simultaneous tensor decomposition and completion (STDC) that combines a rank-minimization technique with the Tucker decomposition model and uses factor priors, which are usually known a priori for real-world tensor objects, to characterize the underlying joint manifold drawn from the model factors.
Tensor factorization using auxiliary information
TLDR
This paper proposes to use relationships among data as auxiliary information in addition to the low-rank assumption to improve the quality of tensor decomposition, and introduces two regularization approaches using graph Laplacians induced from the relationships.
Scalable Tensor Factorizations for Incomplete Data
TLDR
An algorithm called CP-WOPT (CP Weighted OPTimization) is presented that uses a first-order optimization approach to solve the weighted least-squares problem; it is shown to successfully factorize tensors with noise and up to 99% missing data.
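The objective behind CP-WOPT is compact: with $W$ the binary observation mask, minimize $f(A,B,C) = \tfrac{1}{2}\|W \ast (X - [\![A,B,C]\!])\|_F^2$, so only observed entries contribute to the gradient. A hedged NumPy sketch of the gradient computation for a 3-way tensor (plain gradient descent here for brevity; the published method supplies the same gradients to a general first-order solver):

import numpy as np

def cp_wopt_gradients(X, W, A, B, C):
    # Gradients of f = 0.5 * ||W * (X - [[A, B, C]])||_F^2.
    R = W * (X - np.einsum('ir,jr,kr->ijk', A, B, C))   # masked residual
    gA = -np.einsum('ijk,jr,kr->ir', R, B, C)
    gB = -np.einsum('ijk,ir,kr->jr', R, A, C)
    gC = -np.einsum('ijk,ir,jr->kr', R, A, B)
    return gA, gB, gC

def gradient_step(X, W, A, B, C, lr=1e-2):
    gA, gB, gC = cp_wopt_gradients(X, W, A, B, C)
    return A - lr * gA, B - lr * gB, C - lr * gC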
Tensor completion via a multi-linear low-n-rank factorization model
TLDR
A multi-linear low-n-rank factorization model is proposed, and a nonlinear Gauss-Seidel method that only requires solving a linear least-squares problem per iteration is applied to solve it.
Low-Rank Matrix and Tensor Completion via Adaptive Sampling
TLDR
In the absence of noise, it is shown that one can exactly recover an $n \times n$ matrix of rank $r$ from merely $\Omega(nr^{3/2}\log(r))$ matrix entries, and an order-$T$ tensor using $\Omega(nr^{T-1/2}T^2\log(r))$ entries.
Learning with tensors: a framework based on convex optimization and spectral regularization
TLDR
A framework based on convex optimization and spectral regularization is developed to perform learning when feature observations are multidimensional arrays (tensors); it allows one to tackle the multi-task case in a natural way.
Tensor completion for estimating missing values in visual data
TLDR
An algorithm to estimate missing values in tensors of visual data is proposed, laying out the theoretical foundations and building a working algorithm that is more accurate and robust than heuristic approaches.
Tensor completion and low-n-rank tensor recovery via convex optimization
In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In an important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery…
Infinite Tucker Decomposition: Nonparametric Bayesian Models for Multiway Data Analysis
TLDR
This work proposes tensor-variate latent nonparametric Bayesian models, coupled with efficient inference methods, based on latent Gaussian or $t$ processes with nonlinear covariance functions, and develops a variational inference technique on tensors that efficiently learns the InfTucker from data.
Bayesian probabilistic matrix factorization using Markov chain Monte Carlo
TLDR
This paper presents a fully Bayesian treatment of the Probabilistic Matrix Factorization (PMF) model, in which model capacity is controlled automatically by integrating over all model parameters and hyperparameters, and shows that Bayesian PMF models can be efficiently trained using Markov chain Monte Carlo methods, as demonstrated on the Netflix dataset.
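The computational core is that, conditioned on the item factors and hyperparameters, each user's factor vector has a closed-form Gaussian posterior, so Gibbs sampling alternates exact conditional draws. A toy NumPy pass in that spirit (fixed isotropic priors in place of the paper's Normal-Wishart hyperpriors; all names illustrative):

import numpy as np

def gibbs_user_pass(R, mask, U, V, alpha=2.0, tau=1.0, rng=None):
    # Resample each row of U from its exact Gaussian conditional given V.
    # alpha: rating precision; tau: prior precision on the factors.
    rng = rng or np.random.default_rng()
    D = U.shape[1]
    for i in range(U.shape[0]):
        Vi = V[mask[i]]                              # items rated by user i
        prec = tau * np.eye(D) + alpha * Vi.T @ Vi   # posterior precision
        cov = np.linalg.inv(prec)
        mean = alpha * cov @ Vi.T @ R[i, mask[i]]    # posterior mean
        U[i] = rng.multivariate_normal(mean, cov)
    return U

A symmetric pass resamples the item factors given U; in the full model the prior means and precisions are themselves resampled from their Normal-Wishart conditionals, which is what lets the model capacity adjust automatically.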