Low-Rank Tensor Completion Using Matrix Factorization Based on Tensor Train Rank and Total Variation
@article{Ding2019LowRankTC,
  title={Low-Rank Tensor Completion Using Matrix Factorization Based on Tensor Train Rank and Total Variation},
  author={Meng Ding and Tingzhu Huang and Teng-Yu Ji and Xile Zhao and Jinghua Yang},
  journal={Journal of Scientific Computing},
  year={2019},
  volume={81},
  pages={941--964},
  url={https://api.semanticscholar.org/CorpusID:202938666}
}
An optimization model is built that combines low-rank matrix factorization based on the tensor train (TT) rank with total variation, retaining the strength of the TT rank while alleviating block artifacts.
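The TT rank at issue here is defined through canonical matricizations: the k-th TT unfolding groups the first k modes of an N-way tensor into rows and the remaining modes into columns, and TT-rank-based factorization methods work with low-rank factorizations of these matrices. A minimal sketch (illustrative only, not the authors' code; `tt_unfolding` is a hypothetical helper name):

```python
import numpy as np

np.random.seed(0)

def tt_unfolding(x, k):
    """k-th TT unfolding: modes 1..k become rows, modes k+1..N become columns."""
    rows = int(np.prod(x.shape[:k]))
    return x.reshape(rows, -1)

# A rank-1 4-way tensor (outer product of vectors): every TT unfolding has rank 1,
# so a factorization X_[k] ~ U V with one column suffices for each k.
a, b, c, d = (np.random.rand(n) for n in (3, 4, 5, 6))
x = np.einsum('i,j,k,l->ijkl', a, b, c, d)

for k in (1, 2, 3):
    xk = tt_unfolding(x, k)
    print(k, xk.shape, np.linalg.matrix_rank(xk))
```

These "well-balanced" unfoldings (rows and columns of comparable size for middle k) are what distinguish the TT rank from the mode-wise n-rank used by earlier methods.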
Topics
Hyperspectral Images Recovery, SiLRTC-TT, Matricization Scheme, Ket Augmentation, Blocking Artifacts, Total Variation, Tensor Trains, TMac-TT, Lower-order Tensors, Parallel Matrix Factorization
80 Citations
Low-rank tensor completion based on tensor train rank with partially overlapped sub-blocks
- 2022
Computer Science
Tensor train rank minimization with nonlocal self-similarity for tensor completion
- 2021
Computer Science
An alternating direction method of multipliers tailored to the specific structure of the proposed model is developed; it outperforms several existing state-of-the-art methods in both qualitative and quantitative measures.
An Efficient Tensor Completion Method Via New Latent Nuclear Norm
- 2020
Computer Science, Mathematics
A new latent nuclear norm, equipped with a more balanced unfolding scheme, is defined as the low-rank regularizer and, together with the Frank-Wolfe (FW) algorithm, is developed into an efficient completion method that exploits the sparsity structure of the observed tensor.
Tensor N-tubal rank and its convex relaxation for low-rank tensor recovery
- 2020
Computer Science, Mathematics
Low-Rank Tensor Completion With Generalized CP Decomposition and Nonnegative Integer Tensor Completion
- 2023
Computer Science, Mathematics
This work constructs a new methodological framework called GCDTC (Generalized CP Decomposition Tensor Completion), based on numerical properties of the data, to achieve higher accuracy in tensor completion than state-of-the-art methods.
Enhanced Nonconvex Low-Rank Approximation of Tensor Multi-Modes for Tensor Completion
- 2021
Computer Science, Mathematics
This paper proposes a novel low-rank approximation of tensor multi-modes (LRATM), in which a double nonconvex $L_{\gamma}$ norm is designed to represent the underlying joint manifold drawn from the factorization factors of each mode of the underlying tensor.
A high-order tensor completion algorithm based on Fully-Connected Tensor Network weighted optimization
- 2022
Computer Science, Mathematics
This paper proposes a new tensor completion method, fully connected tensor network weighted optimization (FCTN-WOPT), which composes the completed tensor by initializing the factors from the FCTN decomposition.
Low-rank tensor completion via combined Tucker and Tensor Train for color image recovery
- 2021
Computer Science
A new tensor completion model is proposed that combines the Tucker rank and the tensor train rank, and an efficient alternating-direction-method-based algorithm is developed to solve it; experiments demonstrate the effectiveness of the model.
Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors
- 2022
Computer Science, Mathematics
This paper proposes a novel tensor completion model by introducing a factor-based regularization to the framework of the FCTN decomposition and develops an efficient proximal alternating minimization (PAM)-based algorithm and theoretically demonstrates its convergence.
Multi-Dimensional Visual Data Completion via Low-Rank Tensor Representation Under Coupled Transform
- 2021
Computer Science
A novel low-rank tensor representation based on a coupled transform is proposed; it fully exploits the spatial multi-scale nature and the redundancy in the spatial and spectral/temporal dimensions, leading to a better low-tensor-multi-rank approximation.
41 References
Efficient Tensor Completion for Color Image and Video Recovery: Low-Rank Tensor Train
- 2017
Computer Science
A novel approach to tensor completion, which recovers missing entries of tensor-represented data based on the tensor train (TT) rank; the TT rank captures hidden information in tensors thanks to its definition via a well-balanced matricization scheme.
Low-rank tensor completion via smooth matrix factorization
- 2019
Computer Science, Mathematics
Parallel matrix factorization for low-rank tensor completion
- 2013
Computer Science, Environmental Science
Although the model is non-convex, the algorithm performs consistently throughout the tests and gives better results than the compared methods, some of which are based on convex models.
Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery
- 2014
Computer Science, Mathematics
The new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.
Tensor completion and low-n-rank tensor recovery via convex optimization
- 2011
Computer Science, Mathematics
This paper uses the n-rank of a tensor as a sparsity measure and considers the low-n-rank tensor recovery problem, i.e. the problem of finding the tensor of lowest n-rank that fulfills some linear constraints.
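The n-rank referenced here is the tuple of ranks of the mode-n unfoldings, where the mode-n fibers of the tensor become the columns of the matrix X_(n). A minimal sketch (illustrative only; `mode_unfold` is a hypothetical helper name):

```python
import numpy as np

np.random.seed(0)

def mode_unfold(x, n):
    """Mode-n unfolding: mode-n fibers become the columns of X_(n)."""
    return np.moveaxis(x, n, 0).reshape(x.shape[n], -1)

# A rank-1 3-way tensor: each mode-n unfolding has matrix rank 1,
# so its n-rank is the tuple (1, 1, 1).
x = np.einsum('i,j,k->ijk', np.random.rand(4), np.random.rand(5), np.random.rand(6))
n_rank = tuple(int(np.linalg.matrix_rank(mode_unfold(x, n))) for n in range(3))
print(n_rank)
```

Low-n-rank recovery relaxes the (NP-hard) rank minimization over these unfoldings to a sum of their nuclear norms, which is the convex surrogate used in this line of work.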
Tensor Completion for Estimating Missing Values in Visual Data
- 2013
Computer Science
An algorithm to estimate missing values in tensors of visual data by proposing the first definition of the trace norm for tensors and building a working algorithm that generalizes the established definition of the matrix trace norm.
Tensor completion using total variation and low-rank matrix factorization
- 2016
Computer Science, Mathematics
A Mixture of Nuclear Norm and Matrix Factorization for Tensor Completion
- 2017
Computer Science, Mathematics
A mixture model for tensor completion is proposed that combines the nuclear norm with low-rank matrix factorization, and it is proved that every cluster point of the sequence generated by NS-LRTC or S-LRTC is a stationary point.
Exact Low Tubal Rank Tensor Recovery from Gaussian Measurements
- 2018
Computer Science, Mathematics
It is shown that by solving a TNN minimization problem, the underlying tensor of size n₁ × n₂ × n₃ with tubal rank r can be exactly recovered when the number of Gaussian measurements is O(r(n₁ + n₂ − r)n₃).
Matrix factorization for low-rank tensor completion using framelet prior
- 2018
Computer Science, Mathematics