Influence-guided Data Augmentation for Neural Tensor Completion
@article{Oh2021InfluenceguidedDA,
  title={Influence-guided Data Augmentation for Neural Tensor Completion},
  author={Sejoon Oh and Sungchul Kim and Ryan A. Rossi and Srijan Kumar},
  journal={Proceedings of the 30th ACM International Conference on Information \& Knowledge Management},
  year={2021}
}
How can we predict missing values in multi-dimensional data (or tensors) more accurately? The task of tensor completion is crucial in many applications such as personalized recommendation, image and video restoration, and link prediction in social networks. Many tensor factorization and neural network-based tensor completion algorithms have been developed to predict missing entries in partially observed tensors. However, they can produce inaccurate estimations as real-world tensors are very…
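To make the task described in the abstract concrete, here is a minimal, illustrative sketch of neural tensor completion in PyTorch: a small MLP over per-mode embeddings trained only on observed entries. The class name, layer sizes, and hyperparameters are assumptions for illustration, not the architecture proposed in this paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralTensorCompletion(nn.Module):
    """Illustrative MLP over per-mode embeddings; not the paper's model."""
    def __init__(self, dims, rank=8):
        super().__init__()
        self.embeds = nn.ModuleList(nn.Embedding(d, rank) for d in dims)
        self.mlp = nn.Sequential(
            nn.Linear(rank * len(dims), 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, idx):                       # idx: (batch, n_modes) integer indices
        feats = [emb(idx[:, m]) for m, emb in enumerate(self.embeds)]
        return self.mlp(torch.cat(feats, dim=-1)).squeeze(-1)

# Toy observed entries of a sparse 20 x 15 x 10 tensor.
dims = (20, 15, 10)
idx = torch.stack([torch.randint(0, d, (256,)) for d in dims], dim=1)
vals = torch.rand(256)

model = NeuralTensorCompletion(dims)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(100):
    opt.zero_grad()
    loss = F.mse_loss(model(idx), vals)           # loss over observed cells only
    loss.backward()
    opt.step()
```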
One Citation
Training Data Influence Analysis and Estimation: A Survey
- Computer Science, ArXiv
- 2022
This paper provides the first comprehensive survey of training data influence analysis and estimation and organizes state-of-the-art influence analysis methods into a taxonomy; it describes each of these methods in detail and compares their underlying assumptions, asymptotic complexities, and overall strengths and weaknesses.
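As a heavily simplified illustration of the kind of estimate such surveys cover, the sketch below scores a training example by a TracIn-style gradient dot product against a test example. The tiny linear model and random data are placeholders; this is not a specific method from the survey or from the paper above.

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)                 # stand-in for any trained model
loss_fn = nn.MSELoss()

def grad_vector(x, y):
    """Flattened gradient of the loss at (x, y) w.r.t. the model parameters."""
    model.zero_grad()
    loss_fn(model(x), y).backward()
    return torch.cat([p.grad.flatten() for p in model.parameters()])

x_train, y_train = torch.rand(1, 4), torch.rand(1, 1)
x_test, y_test = torch.rand(1, 4), torch.rand(1, 1)

# A positive dot product means a gradient step on the training example would,
# to first order, also reduce the loss on the test example (a "proponent").
influence = torch.dot(grad_vector(x_train, y_train),
                      grad_vector(x_test, y_test))
```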
References
SHOWING 1-10 OF 47 REFERENCES
CoSTCo: A Neural Tensor Completion Model for Sparse Tensors
- Computer Science, KDD
- 2019
This work proposes a novel convolutional neural network (CNN)-based model, named CoSTCo (Convolutional Sparse Tensor Completion), which leverages the expressive power of CNNs to model the complex interactions inside tensors and their parameter-sharing scheme to preserve the desired low-rank structure.
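A hedged sketch of the CoSTCo idea as summarized above: stack the per-mode embeddings into a small (rank x n_modes) "image" and apply two convolutions to predict one entry. Layer sizes and names are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class CoSTCoLike(nn.Module):
    def __init__(self, dims, rank=8, channels=16):
        super().__init__()
        self.embeds = nn.ModuleList(nn.Embedding(d, rank) for d in dims)
        self.conv1 = nn.Conv2d(1, channels, kernel_size=(1, len(dims)))   # mix modes
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=(rank, 1)) # mix rank dims
        self.out = nn.Linear(channels, 1)

    def forward(self, idx):                        # idx: (batch, n_modes)
        # Stack per-mode embeddings into a (batch, 1, rank, n_modes) "image".
        x = torch.stack([emb(idx[:, m]) for m, emb in enumerate(self.embeds)], dim=-1)
        x = torch.relu(self.conv1(x.unsqueeze(1))) # -> (batch, C, rank, 1)
        x = torch.relu(self.conv2(x))              # -> (batch, C, 1, 1)
        return self.out(x.flatten(1)).squeeze(-1)

pred = CoSTCoLike((20, 15, 10))(torch.tensor([[3, 7, 2], [5, 1, 9]]))
```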
Neural Tensor Completion for Accurate Network Monitoring
- Computer Science, IEEE INFOCOM 2020 - IEEE Conference on Computer Communications
- 2020
This paper proposes a novel Neural Tensor Completion (NTC) scheme that models third-order interactions among data features with the outer product to build a 3D interaction map, and applies 3D convolution to learn high-order interaction features from the local range to the global range.
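A rough sketch of that outer-product-plus-3D-convolution idea, reconstructed only from the one-line summary above; the layer sizes and pooling choice are my own assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class NTCLike(nn.Module):
    def __init__(self, dims, rank=8, channels=8):
        super().__init__()
        self.embeds = nn.ModuleList(nn.Embedding(d, rank) for d in dims)
        self.conv = nn.Sequential(
            nn.Conv3d(1, channels, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1))
        self.out = nn.Linear(channels, 1)

    def forward(self, idx):                              # idx: (batch, 3)
        a, b, c = (emb(idx[:, m]) for m, emb in enumerate(self.embeds))
        # Outer product -> (batch, 1, rank, rank, rank) 3D interaction map.
        cube = torch.einsum('bi,bj,bk->bijk', a, b, c).unsqueeze(1)
        return self.out(self.conv(cube).flatten(1)).squeeze(-1)

pred = NTCLike((20, 15, 10))(torch.tensor([[3, 7, 2]]))
```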
MTC: Multiresolution Tensor Completion from Partial and Coarse Observations
- Computer Science, KDD
- 2021
The proposed Multiresolution Tensor Completion (MTC) model explores tensor mode properties and leverages the hierarchy of resolutions to recursively initialize an optimization setup, and optimizes the coupled system using alternating least squares to ensure low computational and space complexity.
Fully Scalable Methods for Distributed Tensor Factorization
- Computer Science, IEEE Transactions on Knowledge and Data Engineering
- 2017
This paper proposes two distributed tensor factorization methods, CDTF and SALS, which are scalable with respect to all aspects of the data and exhibit a trade-off between convergence speed and memory requirements.
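For context, the alternating-least-squares updates that such methods parallelize look roughly like the single-machine CP-ALS sketch below for a dense 3-way tensor; this is not the distributed algorithm from the paper, and all names here are illustrative.

```python
import numpy as np

def cp_als(X, rank=4, n_iter=50):
    """Plain CP decomposition of a dense 3-way array via alternating least squares."""
    dims = X.shape
    factors = [np.random.rand(d, rank) for d in dims]
    for _ in range(n_iter):
        for mode in range(3):
            B, C = [factors[m] for m in range(3) if m != mode]
            # Khatri-Rao product of the two other factor matrices.
            kr = np.einsum('ir,jr->ijr', B, C).reshape(-1, rank)
            # Mode unfolding of X, with columns ordered to match `kr`.
            unfold = np.moveaxis(X, mode, 0).reshape(dims[mode], -1)
            gram = (B.T @ B) * (C.T @ C)
            factors[mode] = unfold @ kr @ np.linalg.pinv(gram)
    return factors

X = np.random.rand(20, 15, 10)
A, B, C = cp_als(X)
approx = np.einsum('ir,jr,kr->ijk', A, B, C)   # reconstructed tensor
```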
Scalable Tensor Decompositions for Multi-aspect Data Mining
- Computer Science, 2008 Eighth IEEE International Conference on Data Mining
- 2008
Memory-Efficient Tucker (MET) is proposed, which achieves over 1000X space reduction without sacrificing speed; it also allows us to work with much larger tensors that were too big to handle before.
Tucker factorization with missing data with application to low-$n$-rank tensor completion
- Computer Science, Multidimens. Syst. Signal Process.
- 2015
This paper proposes a simple algorithm for Tucker factorization of a tensor with missing data and its application to low-$n$-rank tensor completion and demonstrates in several numerical experiments that the proposed algorithm performs well even when the ranks are significantly overestimated.
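One common baseline strategy for Tucker factorization with missing data is iterative imputation plus truncated HOSVD, sketched below; this is a generic illustration under that assumption, not necessarily the exact algorithm of this paper.

```python
import numpy as np

def tucker_complete(X, mask, ranks=(4, 4, 4), n_iter=30):
    """X: 3-way array with arbitrary values where mask == False (missing)."""
    Xhat = np.where(mask, X, X[mask].mean())          # initial imputation
    for _ in range(n_iter):
        # Truncated HOSVD of the current imputed tensor.
        U = []
        for mode, r in enumerate(ranks):
            unfold = np.moveaxis(Xhat, mode, 0).reshape(Xhat.shape[mode], -1)
            U.append(np.linalg.svd(unfold, full_matrices=False)[0][:, :r])
        core = np.einsum('ijk,ia,jb,kc->abc', Xhat, U[0], U[1], U[2])
        recon = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])
        Xhat = np.where(mask, X, recon)               # keep observed, refill missing
    return Xhat

X = np.random.rand(20, 15, 10)
mask = np.random.rand(*X.shape) > 0.7                 # ~30% of entries observed
completed = tucker_complete(X, mask)
```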
Neural Tensor Model for Learning Multi-Aspect Factors in Recommender Systems
- Computer Science, IJCAI
- 2020
A novel nonlinear tensor machine is proposed, which combines deep neural networks and tensor algebra to capture nonlinear interactions among multi-aspect factors.
Neural Tensor Factorization for Temporal Interaction Learning
- Computer Science, WSDM
- 2019
A Neural network-based Tensor Factorization (NTF) model for predictive tasks on dynamic relational data that incorporates a multi-layer perceptron structure to learn the non-linearities between different latent factors.
Scalable Tucker Factorization for Sparse Tensors - Algorithms and Discoveries
- Computer Science, 2018 IEEE 34th International Conference on Data Engineering (ICDE)
- 2018
P-Tucker is proposed, a scalable Tucker factorization method for sparse tensors that successfully discovers hidden concepts and relations in a large-scale real-world tensor, while existing methods cannot reveal latent features due to their limited scalability or low accuracy.
Generalized Higher-Order Orthogonal Iteration for Tensor Decomposition and Completion
- Computer Science, NIPS
- 2014
This work proposes an efficient and scalable core tensor Schatten 1-norm minimization method for simultaneous tensor decomposition and completion, with a much lower computational complexity.