Variational Bayesian inference for CP tensor completion with side information

S. Budzinskiy, Nikolai Zamarashkin
We propose a message passing algorithm, based on variational Bayesian inference, for low-rank tensor completion with automatic rank determination in the canonical polyadic format when additional side information (SI) is given. The SI comes in the form of low-dimensional subspaces that contain the fiber spans of the tensor (columns, rows, tubes, etc.). We validate the regularization properties induced by SI with extensive numerical experiments on synthetic and real-world data and present the…
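The subspace side information described above can be read as a constraint on the CP factors: if the mode-n fiber span is known to lie in a subspace with orthonormal basis P_n, the n-th factor matrix can be parameterized as U_n = P_n V_n. A minimal NumPy illustration of that constraint (the dimensions and subspaces here are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 20, 20, 20, 3

# Hypothetical side information: orthonormal bases of 5-dimensional
# subspaces assumed to contain the fiber spans of each mode.
P1 = np.linalg.qr(rng.standard_normal((I, 5)))[0]
P2 = np.linalg.qr(rng.standard_normal((J, 5)))[0]
P3 = np.linalg.qr(rng.standard_normal((K, 5)))[0]

# Parameterize the CP factors inside those subspaces: U_n = P_n @ V_n.
V1, V2, V3 = (rng.standard_normal((5, R)) for _ in range(3))
U1, U2, U3 = P1 @ V1, P2 @ V2, P3 @ V3

# Rank-R CP tensor: T = sum_r U1[:, r] (outer) U2[:, r] (outer) U3[:, r].
T = np.einsum('ir,jr,kr->ijk', U1, U2, U3)

# Every mode-1 fiber (column) of T then lies in span(P1): projecting the
# mode-1 unfolding onto that span changes nothing.
T1 = T.reshape(I, -1)
residual = T1 - P1 @ (P1.T @ T1)
print(np.allclose(residual, 0))  # True
```

The same check holds for the other modes with P2 and P3; completion algorithms exploit this by searching only over the low-dimensional coefficients V_n.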

Bayesian Robust Tensor Factorization for Incomplete Multiway Data

A generative model for robust tensor factorization in the presence of both missing data and outliers is proposed; it can discover the ground-truth CP rank and automatically adapt its sparsity-inducing priors to various types of outliers.
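The sparsity-inducing outlier priors above have a simple MAP-level intuition: under a Laplace prior, the outlier term is estimated by soft-thresholding, which zeroes noise-level residuals while letting gross outliers survive. A toy sketch of that operator (the threshold value is arbitrary, not from the paper):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1, i.e. the MAP estimate of a signal
    # observed in Gaussian noise under a Laplace (sparsity-inducing) prior.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

residuals = np.array([0.05, -0.2, 3.0, -4.5, 0.1])  # two gross outliers
outliers = soft_threshold(residuals, 0.5)
print(outliers)  # small residuals zeroed; 3.0 and -4.5 survive, shrunk
```

The Bayesian treatment in the paper infers the threshold scale automatically instead of fixing it by hand.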

Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

A class of probabilistic generative Tucker models for tensor decomposition and completion with structural sparsity over the multilinear latent space is introduced, together with two group-sparsity-inducing priors obtained from hierarchical representations of the Laplace and Student-t distributions, which facilitate fully Bayesian posterior inference.

Scalable Bayesian Low-Rank Decomposition of Incomplete Multiway Tensors

A scalable Bayesian framework for low-rank decomposition of multiway tensor data with missing observations is presented, which outperforms several state-of-the-art tensor decomposition methods on various synthetic and benchmark real-world datasets.

Alternating minimization algorithms for graph regularized tensor completion

This work considers the low-rank tensor completion (LRTC) problem of recovering a tensor from incomplete observations and proposes an efficient alternating minimization algorithm that, based on the Kurdyka-Łojasiewicz property, globally converges to a critical point of the objective function.
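The alternating minimization idea can be sketched with a plain CP-ALS that imputes the missing entries between sweeps, a simpler stand-in for the paper's graph-regularized algorithm (sizes, rank, and sampling rate below are illustrative):

```python
import numpy as np

def unfold(X, n):
    # Mode-n unfolding: mode n first, remaining modes in original order.
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

def khatri_rao(mats):
    # Column-wise Kronecker product of a list of matrices.
    out = mats[0]
    for M in mats[1:]:
        out = (out[:, None, :] * M[None, :, :]).reshape(-1, out.shape[-1])
    return out

rng = np.random.default_rng(1)
dims, R = (15, 15, 15), 2
A, B, C = (rng.standard_normal((d, R)) for d in dims)
T = np.einsum('ir,jr,kr->ijk', A, B, C)   # noiseless rank-2 ground truth
mask = rng.random(dims) < 0.7             # 70% of entries observed

X = np.where(mask, T, 0.0)                # initialize missing entries at 0
U = [rng.standard_normal((d, R)) for d in dims]
for _ in range(200):
    for n in range(3):                    # one alternating sweep
        others = [U[m] for m in range(3) if m != n]
        G = (others[0].T @ others[0]) * (others[1].T @ others[1])
        U[n] = unfold(X, n) @ khatri_rao(others) @ np.linalg.pinv(G)
    fit = np.einsum('ir,jr,kr->ijk', *U)
    X = np.where(mask, T, fit)            # re-impute the missing entries

err = np.linalg.norm(fit - T) / np.linalg.norm(T)
print(err)
```

The graph regularization of the paper would add a Laplacian term to each least-squares subproblem; the convergence guarantee via the Kurdyka-Łojasiewicz property is specific to the authors' algorithm, not this sketch.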

Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination

The method is a tuning-parameter-free approach that effectively infers the underlying multilinear factors under a low-rank constraint while providing predictive distributions over missing entries; it outperforms state-of-the-art approaches for both tensor factorization and tensor completion in terms of predictive performance.
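The automatic rank determination above rests on per-component precisions with Gamma hyperpriors: components whose factor columns collapse receive an enormous posterior precision and are pruned. A sketch of that update in isolation (the "fitted" factors and hyperparameters below are fabricated for illustration; the paper's full variational updates are more involved):

```python
import numpy as np

rng = np.random.default_rng(2)
dims, R = (15, 15, 15), 4

# Pretend fitted CP factors: components 0-1 carry signal, 2-3 have collapsed.
U = [np.hstack([rng.standard_normal((d, 2)),
                1e-6 * rng.standard_normal((d, 2))]) for d in dims]

# ARD-style precision update: the r-th columns of all factor matrices share
# one precision lambda_r with a Gamma(a, b) prior; its posterior mean is
a = b = 1e-6
energy = sum(np.sum(M ** 2, axis=0) for M in U)     # per-component energy
lam = (a + 0.5 * sum(dims)) / (b + 0.5 * energy)    # E[lambda_r]

# A huge precision pins the component to zero, i.e. prunes it from the model.
kept = lam < 1e3 * lam.min()
print(kept)  # [ True  True False False]
```

Run with an overestimated rank, this mechanism shrinks the surplus components away, which is what "automatic rank determination" refers to.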

Tensor factorization using auxiliary information

This paper proposes to use relationships among data as auxiliary information in addition to the low-rank assumption to improve the quality of tensor decomposition, and introduces two regularization approaches using graph Laplacians induced from the relationships.
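The graph-Laplacian regularizers above act on the factor matrices: for a similarity graph with adjacency A and Laplacian L = D - A, the penalty tr(U^T L U) equals one half of the sum over edges of the squared row differences, so indices linked in the auxiliary graph are pushed toward similar latent rows. A quick numerical check of that identity on a toy graph (the graph itself is made up):

```python
import numpy as np

# Toy similarity graph on 4 indices of one tensor mode (hypothetical
# auxiliary relationships, e.g. "these items are related").
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A                        # combinatorial Laplacian

U = np.random.default_rng(3).standard_normal((4, 2))  # a factor matrix

penalty = np.trace(U.T @ L @ U)
edge_sum = 0.5 * sum(A[i, j] * np.sum((U[i] - U[j]) ** 2)
                     for i in range(4) for j in range(4))
print(np.isclose(penalty, edge_sum))  # True
```

Adding this quadratic term to each factor update keeps the subproblems least-squares solvable, which is why Laplacian regularization composes cleanly with alternating schemes.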

Learning Tensor Train Representation with Automatic Rank Determination from Incomplete Noisy Data

A fully Bayesian treatment of TT decomposition is employed to enable automatic rank determination, and theoretical evidence is established for adopting a Gaussian-product-Gamma prior to induce sparsity on the slices of the TT cores, so that the model complexity is automatically determined even from incomplete and noisy observations.
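The Gaussian-product-Gamma machinery is beyond a short sketch, but the rank-revealing character of the TT format itself can be shown with the classical TT-SVD: sequential SVDs of the unfoldings recover the TT ranks from the decay of the singular values. A minimal version (sizes and truncation tolerance are arbitrary):

```python
import numpy as np

def tt_svd(X, eps=1e-10):
    # Sequential SVDs; ranks are read off from the singular values.
    d, shape, cores, r = X.ndim, X.shape, [], 1
    M = X.reshape(shape[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        rank = int(np.sum(s > eps * s[0]))
        cores.append(U[:, :rank].reshape(r, shape[k], rank))
        M = (s[:rank, None] * Vt[:rank]).reshape(rank * shape[k + 1], -1)
        r = rank
    cores.append(M.reshape(r, shape[-1], 1))
    return cores

rng = np.random.default_rng(4)
shape, ranks = (6, 7, 8, 9), (1, 3, 3, 3, 1)
true_cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1]))
              for k in range(4)]
T = true_cores[0]
for G in true_cores[1:]:
    T = np.tensordot(T, G, axes=([-1], [0]))
T = T.reshape(shape)                      # drop the boundary rank-1 axes

est = tt_svd(T)
print([G.shape[0] for G in est[1:]])      # recovered TT ranks: [3, 3, 3]
```

The Bayesian approach in the paper replaces this hard singular-value cutoff with sparsity priors on core slices, which is what makes the rank choice robust to missing and noisy entries.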

Smooth PARAFAC Decomposition for Tensor Completion

This paper considers “smoothness” constraints as well as low-rank approximations and proposes an efficient algorithm for performing tensor completion that is particularly powerful regarding visual data.
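Smoothness constraints of this kind are typically enforced through a second-order difference operator on the factor columns: the quadratic penalty ||D u||^2 vanishes for linear trends and grows for oscillating fibers. A small illustration (this penalty form is a common choice, not necessarily the paper's exact one):

```python
import numpy as np

n = 6
# Second-order difference operator: (D u)_i = u_i - 2 u_{i+1} + u_{i+2}.
D = np.zeros((n - 2, n))
for i in range(n - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

smooth = np.arange(n, dtype=float)                 # linear ramp
rough = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0])   # oscillation

print(np.sum((D @ smooth) ** 2))  # 0.0  — linear trends are not penalized
print(np.sum((D @ rough) ** 2))   # 16.0 — oscillations are penalized
```

For visual data, whose fibers vary smoothly, adding such a term to each factor subproblem is what gives the method its advantage.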

Stable ALS approximation in the TT-format for rank-adaptive tensor completion

This article introduces a singular-value-based regularization of standard alternating least squares (ALS), motivated by averaging in microsteps, proves its stability, and derives a natural semi-implicit rank-adaptation strategy.

Tensor completion in hierarchical tensor representations

This book chapter considers versions of iterative hard thresholding schemes adapted to hierarchical tensor formats and provides first partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map.
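Iterative hard thresholding alternates a gradient step on the observed entries with a projection back onto the low-rank set; in the hierarchical-tensor case the projection is a rank truncation of the format. A matrix-case sketch with truncated SVD as the thresholding operator (sizes, rank, and sampling rate are arbitrary, and this is the generic scheme rather than the chapter's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)
m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # ground truth
mask = rng.random((m, n)) < 0.6                                # observed set

def hard_threshold(X, r):
    # Projection onto the set of rank-r matrices via truncated SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

X = np.zeros((m, n))
for _ in range(300):
    X = hard_threshold(X + mask * (M - X), r)  # gradient step, then project

err = np.linalg.norm(X - M) / np.linalg.norm(M)
print(err)
```

The TRIP-based analysis mentioned above asks when such iterations provably converge once the truncated SVD is replaced by a rank truncation in the hierarchical format.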