# Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

```bibtex
@article{Zhao2015BayesianCF,
  title   = {Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination},
  author  = {Qibin Zhao and Liqing Zhang and Andrzej Cichocki},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year    = {2015},
  volume  = {37},
  pages   = {1751-1763}
}
```

CANDECOMP/PARAFAC (CP) tensor factorization of incomplete data is a powerful technique for tensor completion because it explicitly captures the multilinear latent factors. Existing CP algorithms require the tensor rank to be specified manually; however, determining the tensor rank remains a challenging problem, especially for the CP rank. In addition, existing approaches do not account for uncertainty in either the latent factors or the missing entries. To address these issues…
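The core idea — CP factorization of a partially observed tensor with automatic rank determination via sparsity-inducing priors — can be illustrated with a point-estimate caricature of such a model: masked CP-ALS with an EM-style imputation step plus a per-component ARD precision update that shrinks superfluous components. This is only a NumPy sketch, not the paper's full variational Bayesian algorithm; all dimensions, hyperparameters, and function names are illustrative.

```python
import numpy as np

def kr(U, V):
    """Column-wise Khatri-Rao product."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def unfold(T, n):
    """Mode-n unfolding, C-ordered over the remaining axes (matches kr above)."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

def cp_map_ard(X, mask, rank, iters=200, eps=1e-6):
    """Masked CP-ALS with EM-style imputation and a per-component ARD
    precision update that shrinks superfluous components toward zero."""
    dims = X.shape
    rng = np.random.default_rng(1)
    F = [rng.standard_normal((d, rank)) for d in dims]
    lam = np.ones(rank)                         # per-component precisions
    for _ in range(iters):
        # E-step: fill missing entries with the current reconstruction
        Xc = np.where(mask, X, np.einsum('ir,jr,kr->ijk', *F))
        # M-step: ridge-regularized ALS sweep over the three modes
        for n in range(3):
            others = [F[m] for m in range(3) if m != n]
            Z = kr(others[0], others[1])
            G = Z.T @ Z + np.diag(lam)
            F[n] = np.linalg.solve(G, Z.T @ unfold(Xc, n).T).T
        # ARD update: a component's precision grows as its energy vanishes
        energy = sum((Fm ** 2).sum(axis=0) for Fm in F)
        lam = sum(dims) / (energy + eps)
    return F, lam

# Synthetic rank-2 tensor with ~30% of entries missing; fit with rank 5.
rng = np.random.default_rng(0)
I, J, K, true_rank = 8, 9, 10, 2
A, B, C = (rng.standard_normal((d, true_rank)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A, B, C)
mask = rng.random(X.shape) > 0.3                # True where observed

F, lam = cp_map_ard(X, mask, rank=5)
Xhat = np.einsum('ir,jr,kr->ijk', *F)
rel_err = np.linalg.norm((Xhat - X)[~mask]) / np.linalg.norm(X[~mask])
```

Even though the model is fitted with an overestimated rank, the ARD precisions `lam` blow up for unneeded components, so their columns collapse toward zero — the mechanism behind automatic rank determination.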


## 320 Citations

Tensor Rank Estimation and Completion via CP-based Nuclear Norm

- Computer Science, CIKM
- 2017

Tensor Rank Estimation based on $L_1$-regularized orthogonal CP decomposition (TREL1) is proposed, which incorporates a CP-based tensor nuclear-norm regularizer into the reconstruction-error minimization for tensor completion so as to automatically determine the rank of an incomplete tensor.

Towards Probabilistic Tensor Canonical Polyadic Decomposition 2.0: Automatic Tensor Rank Learning Using Generalized Hyperbolic Prior

- Computer Science, Engineering, ArXiv
- 2020

A more advanced generalized hyperbolic (GH) prior is introduced into the probabilistic CPD model; it not only includes the Gaussian-gamma model as a special case but also provides more flexibility to adapt to different levels of sparsity.

Bayesian Nonparametric Tensor Completion

- 2016

In this paper, we propose a Bayesian nonparametric method to estimate missing data in tensors. The proposed method uses a Tucker-1 factorization to learn a smaller core tensor and a factor matrix via…

Scalable Bayesian Low-Rank Decomposition of Incomplete Multiway Tensors

- Mathematics, Computer Science, ICML
- 2014

A scalable Bayesian framework for low-rank decomposition of multiway tensor data with missing observations, which outperforms several state-of-the-art tensor decomposition methods on various synthetic and benchmark real-world datasets.

Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

- Computer Science, Mathematics, ArXiv
- 2015

A class of probabilistic generative Tucker models for tensor decomposition and completion with structural sparsity over the multilinear latent space is introduced, together with two group-sparsity-inducing priors based on hierarchical representations of the Laplace and Student-t distributions, which facilitate fully Bayesian posterior inference.

General-Purpose Bayesian Tensor Learning With Automatic Rank Determination and Uncertainty Quantification

- Medicine, Frontiers in Artificial Intelligence
- 2021

A major challenge in many machine learning tasks is that the model expressive power depends on model size. Low-rank tensor methods are an efficient tool for handling the curse of dimensionality in…

Low-Rank Tensor Completion: A Pseudo-Bayesian Learning Approach

- Computer Science, 2017 IEEE International Conference on Computer Vision (ICCV)
- 2017

This paper proposes a pseudo-Bayesian approach in which a Bayesian-inspired cost function is adjusted using appropriate approximations that lead to desirable attributes, including concavity and symmetry, and proves the ability to recover the true tensor when it has low multilinear rank.

Bayesian Low Rank Tensor Ring Model for Image Completion

- Computer Science, Mathematics, ArXiv
- 2020

Numerical experiments show that the proposed approach outperforms state-of-the-art methods, especially in recovery accuracy, and that the TR ranks can be obtained by Bayesian inference.

Probabilistic Tensor Canonical Polyadic Decomposition With Orthogonal Factors

- Mathematics, Computer Science, IEEE Transactions on Signal Processing
- 2017

A novel tensor CPD algorithm is devised within a probabilistic inference framework, and an inference algorithm is proposed that alternately estimates the factor matrices, recovers the tensor rank, and mitigates outliers.

A Fused CP Factorization Method for Incomplete Tensors

- Mathematics, Computer Science, IEEE Transactions on Neural Networks and Learning Systems
- 2019

A modified CP tensor factorization framework that simultaneously fuses a norm constraint with sparseness, manifold, and smoothness information, which reveals the characteristics of commonly used regularizations for tensor completion and gives experimental guidance on how to use them.

## References

Showing 1–10 of 61 references

Simultaneous Tensor Decomposition and Completion Using Factor Priors

- Mathematics, Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2014

This paper proposes a method called simultaneous tensor decomposition and completion (STDC) that combines a rank minimization technique with Tucker model decomposition, and uses factor priors, which are usually known a priori in real-world tensor objects, to characterize the underlying joint-manifold drawn from the model factors.

Tensor factorization using auxiliary information

- Mathematics, Computer Science, Data Mining and Knowledge Discovery
- 2012

This paper proposes to use relationships among data as auxiliary information in addition to the low-rank assumption to improve the quality of tensor decomposition, and introduces two regularization approaches using graph Laplacians induced from the relationships.

Scalable Tensor Factorizations for Incomplete Data

- Computer Science, Mathematics, ArXiv
- 2010

An algorithm called CP-WOPT (CP Weighted OPTimization) that uses a first-order optimization approach to solve the weighted least squares problem and is shown to successfully factorize tensors with noise and up to 99% missing data.
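CP-WOPT's weighted least-squares formulation — optimize the CP factors over the observed entries only, using a first-order method — can be sketched as follows. This is an illustrative plain gradient-descent version with made-up sizes; the actual algorithm uses more sophisticated first-order solvers (e.g., nonlinear conjugate gradients).

```python
import numpy as np

def kr(U, V):
    """Column-wise Khatri-Rao product."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def unfold(T, n):
    """Mode-n unfolding, C-ordered over the remaining axes."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

# Rank-2 ground truth; ~40% of entries are missing (weight 0 in W).
rng = np.random.default_rng(0)
I, J, K, R = 6, 7, 8, 2
A0, B0, C0 = (rng.standard_normal((d, R)) for d in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
W = (rng.random(X.shape) > 0.4).astype(float)   # 1 = observed, 0 = missing

def loss_and_grads(F):
    """Weighted squared error over observed entries and its gradients."""
    A, B, C = F
    resid = W * (np.einsum('ir,jr,kr->ijk', A, B, C) - X)
    gA = 2 * unfold(resid, 0) @ kr(B, C)
    gB = 2 * unfold(resid, 1) @ kr(A, C)
    gC = 2 * unfold(resid, 2) @ kr(A, B)
    return float(np.sum(resid ** 2)), [gA, gB, gC]

# Plain gradient descent on the weighted objective.
F = [0.1 * rng.standard_normal((d, R)) for d in (I, J, K)]
lr, losses = 1e-3, []
for _ in range(500):
    f, g = loss_and_grads(F)
    losses.append(f)
    F = [Fm - lr * gm for Fm, gm in zip(F, g)]
```

The key design point is that missing entries simply receive zero weight in the objective, so no imputation step is needed — the factors are fitted to observed data only.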

Tensor completion via a multi-linear low-n-rank factorization model

- Mathematics, Computer Science, Neurocomputing
- 2014

A multi-linear low-n-rank factorization model is proposed, solved by a nonlinear Gauss–Seidel method that requires only a linear least-squares solve per iteration.

Low-Rank Matrix and Tensor Completion via Adaptive Sampling

- Computer Science, Mathematics, NIPS
- 2013

In the absence of noise, it is shown that one can exactly recover an $n \times n$ matrix of rank $r$ from merely $\Omega(n r^{3/2} \log r)$ matrix entries, and an order-$T$ tensor using $\Omega(n r^{T-1/2} T^2 \log r)$ entries.

Learning with tensors: a framework based on convex optimization and spectral regularization

- Mathematics, Computer Science, Machine Learning
- 2013

A framework based on convex optimization and spectral regularization to perform learning when feature observations are multidimensional arrays (tensors) and allows one to tackle the multi-task case in a natural way.

Tensor completion for estimating missing values in visual data

- Medicine, Mathematics, 2009 IEEE 12th International Conference on Computer Vision
- 2009

An algorithm to estimate missing values in tensors of visual data is proposed, with its theoretical foundations laid out and a working algorithm built; it is more accurate and robust than heuristic approaches.

Tensor completion and low-n-rank tensor recovery via convex optimization

- Mathematics
- 2011

In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In an important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery…

Infinite Tucker Decomposition: Nonparametric Bayesian Models for Multiway Data Analysis

- Computer Science, Mathematics, ICML
- 2012

This work proposes tensor-variate latent nonparametric Bayesian models, coupled with efficient inference methods, based on latent Gaussian or $t$ processes with nonlinear covariance functions, and develops a variational inference technique on tensors that efficiently learns the InfTucker from data.

Bayesian probabilistic matrix factorization using Markov chain Monte Carlo

- Computer Science, ICML '08
- 2008

This paper presents a fully Bayesian treatment of the Probabilistic Matrix Factorization (PMF) model, in which model capacity is controlled automatically by integrating over all model parameters and hyperparameters. It shows that Bayesian PMF models can be trained efficiently with Markov chain Monte Carlo methods, applying them to the Netflix dataset.
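The MCMC machinery behind Bayesian PMF can be illustrated with a simplified Gibbs sampler: each row of a factor matrix has a Gaussian conditional given the other factor and the observed entries, so the sampler alternates between the two factors and averages post-burn-in predictions. This sketch fixes the noise and prior precisions (`alpha`, `lam`) instead of placing Gaussian-Wishart hyperpriors on them as the paper does; sizes and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, R = 30, 25, 3
X = rng.standard_normal((N, R)) @ rng.standard_normal((M, R)).T \
    + 0.1 * rng.standard_normal((N, M))         # noisy low-rank "ratings"
obs = rng.random(X.shape) > 0.4                 # True where observed

alpha, lam = 10.0, 1.0    # noise precision and prior precision, held fixed here

def sample_factor(Y, mask, other, rng):
    """Gibbs step: sample each row of one factor from its Gaussian conditional."""
    D = other.shape[1]
    out = np.empty((Y.shape[0], D))
    for i in range(Y.shape[0]):
        Vo = other[mask[i]]                     # rows paired with row i's observations
        prec = alpha * Vo.T @ Vo + lam * np.eye(D)
        cov = np.linalg.inv(prec)
        mean = cov @ (alpha * Vo.T @ Y[i, mask[i]])
        out[i] = rng.multivariate_normal(mean, cov)
    return out

U = rng.standard_normal((N, R))
V = rng.standard_normal((M, R))
pred_init = U @ V.T                             # predictions before any sampling

acc, n_keep = np.zeros_like(X), 0
for t in range(60):
    U = sample_factor(X, obs, V, rng)
    V = sample_factor(X.T, obs.T, U, rng)
    if t >= 20:                                 # discard burn-in, average the rest
        acc += U @ V.T
        n_keep += 1
pred = acc / n_keep

def rmse(P):
    return float(np.sqrt(np.mean((P - X)[~obs] ** 2)))

rmse_init, rmse_post = rmse(pred_init), rmse(pred)
```

Averaging predictions over posterior samples, rather than using a single point estimate, is what gives the Bayesian treatment its built-in protection against overfitting.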