Corpus ID: 227989940

# TenIPS: Inverse Propensity Sampling for Tensor Completion

@inproceedings{Yang2021TenIPSIP,
  title     = {TenIPS: Inverse Propensity Sampling for Tensor Completion},
  author    = {Chengrun Yang and Lijun Ding and Ziyang Wu and Madeleine Udell},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2021}
}
Published in International Conference on… · 1 January 2021 · Computer Science
Tensors are widely used to represent multiway arrays of data. The recovery of missing entries in a tensor has been extensively studied, generally under the assumption that entries are missing completely at random (MCAR). However, in most practical settings, observations are missing not at random (MNAR): the probability that a given entry is observed (also called the propensity) may depend on other entries in the tensor or even on the value of the missing entry. In this paper, we study the…
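The contrast the abstract draws between MCAR and MNAR sampling can be illustrated with a minimal NumPy sketch (an illustration of the general inverse-propensity idea, not the paper's actual algorithm): if each entry's observation probability (propensity) is known, dividing every observed entry by its propensity and zero-filling the rest gives an unbiased estimate of the full tensor, whereas plain zero-filling is biased whenever the propensity depends on the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth low-rank tensor (a 3-way CP rank-1 tensor for simplicity).
u, v, w = rng.random(8), rng.random(9), rng.random(10)
T = np.einsum("i,j,k->ijk", u, v, w)

# MNAR propensities: larger entries are more likely to be observed,
# so the observation pattern depends on the values themselves.
P = 0.2 + 0.6 * (T - T.min()) / (T.max() - T.min())
mask = rng.random(T.shape) < P

# Naive zero-filling systematically underestimates the tensor under MNAR...
naive = np.where(mask, T, 0.0)
# ...while inverse propensity sampling (IPS) reweights each observed entry
# by 1/propensity, so E[mask * T / P] = T entrywise.
ips = np.where(mask, T / P, 0.0)

print(abs(naive.mean() - T.mean()))  # noticeably biased
print(abs(ips.mean() - T.mean()))    # small, since the IPS estimate is unbiased
```

In practice the propensities are unknown and must themselves be estimated from the observation pattern, which is part of what the paper addresses.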
## 4 Citations

- *Journal of the American Statistical Association*, 2022 (Computer Science): Covariate-assisted Sparse Tensor Completion (COSTCO) incorporates covariate information into the recovery of a sparse tensor, jointly extracting latent components from the tensor and the covariate matrix to learn a synthetic representation.
- *IEEE Transactions on Information Theory*, 2022 (Computer Science, Mathematics): Two new estimators, based on singular value thresholding and nuclear norm minimization, recover the matrix under this assumption; they involve no tuning parameters and are shown to be consistent under a low-rank assumption.
- 2020 (Computer Science): A new class of tensor rank regularizers based on the Euclidean norms of the CP component vectors of a tensor is proposed, and it is proved that for low-rank tensor completion with a Schatten-$p$ quasi-norm regularizer on $d$-order tensors, $p = 1/d$ is always better than any $p > 1/d$ in terms of generalization ability.
- *2022 30th European Signal Processing Conference (EUSIPCO)*, 2022 (Computer Science): Through a series of experiments, this paper studies and compares the performance of various low-rank matrix completion (LRMC) algorithms that were originally successful for data-independent sampling patterns, in settings where the sampling mask depends on the underlying data values.
