Corpus ID: 239998486

Streaming Generalized Canonical Polyadic Tensor Decompositions

@article{Phipps2021StreamingGC,
  title={Streaming Generalized Canonical Polyadic Tensor Decompositions},
  author={Eric T. Phipps and Nick P. Johnson and Tamara G. Kolda},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.14514}
}
In this paper, we develop a method, which we call OnlineGCP, for computing the Generalized Canonical Polyadic (GCP) tensor decomposition of streaming data. GCP differs from the traditional canonical polyadic (CP) tensor decomposition in that it allows arbitrary objective functions for the CP model to minimize, rather than only the sum-of-squares error. This approach can provide better fits and more interpretable models when the observed tensor data is strongly non-Gaussian. In the streaming case, tensor data is gradually observed… 
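As a hedged illustration of the GCP idea in the abstract (the CP model is kept, only the elementwise loss is swapped), here is a minimal NumPy sketch using a Poisson loss suited to count data. All function names (`cp_model`, `poisson_loss`, `gcp_objective`) are illustrative, not from the paper.

```python
import numpy as np

def cp_model(A, B, C):
    """Reconstruct a 3-way tensor from rank-R factor matrices A, B, C."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def poisson_loss(x, m):
    """Elementwise Poisson loss f(x, m) = m - x*log(m), suited to counts.
    A small constant guards the log; the model entries here are positive."""
    return m - x * np.log(m + 1e-10)

def gcp_objective(X, A, B, C, loss=poisson_loss):
    """GCP objective: sum of elementwise losses between data X and the model."""
    return loss(X, cp_model(A, B, C)).sum()

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A, B, C = (rng.random((n, R)) for n in (I, J, K))
X = rng.poisson(cp_model(A, B, C))  # synthetic non-Gaussian (count) data
```

Choosing `loss=lambda x, m: (x - m)**2` recovers the standard least-squares CP objective, which is the sense in which GCP generalizes CP.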


References (showing 1-10 of 31)
Accelerating Online CP Decompositions for Higher Order Tensors
This work proposes an efficient online algorithm that can incrementally track the CP decompositions of dynamic tensors with an arbitrary number of dimensions and shows not only significantly better decomposition quality, but also better performance in terms of stability, efficiency and scalability.
Generalized Canonical Polyadic Tensor Decomposition
This work develops a generalized canonical polyadic (GCP) low-rank tensor decomposition that allows loss functions other than squared error, and presents a variety of statistically motivated loss functions for various scenarios.
Probabilistic Streaming Tensor Decomposition
This work proposes POST, a PrObabilistic Streaming Tensor decomposition algorithm, which enables real-time updates and predictions upon receiving new tensor entries, and supports dynamic growth of all the modes.
Streaming Tensor Factorization for Infinite Data Sources
This work presents CP-stream, an algorithm for streaming sparse tensor factorization in the canonical polyadic model whose cost does not grow linearly in time or space, making it practical for long-term streaming.
Multi-Aspect Streaming Tensor Completion
This work proposes MAST, a Multi-Aspect Streaming Tensor completion framework based on the CANDECOMP/PARAFAC (CP) decomposition, which tracks the subspace of general incremental tensors for completion; it also investigates the special situation where time is one mode of the tensor, leveraging that extra structure to make the general framework more effective.
Stochastic Gradients for Large-Scale Tensor Decomposition
This work proposes using stochastic gradients for efficient generalized canonical polyadic (GCP) tensor decomposition of large-scale sparse and dense tensors, using two types of stratified sampling that give precedence to sampling nonzeros.
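The stochastic-gradient idea summarized in the entry above can be sketched in a few lines. This is an assumed NumPy illustration with plain uniform entry sampling (the paper's stratified, nonzero-favoring sampling is omitted for brevity), and names such as `sgd_step` are hypothetical, not from the paper.

```python
import numpy as np

def sgd_step(X, A, B, C, n_samples=64, lr=1e-3, rng=None):
    """One stochastic-gradient step on the Gaussian GCP loss (x - m)^2,
    evaluated only at uniformly sampled tensor entries."""
    if rng is None:
        rng = np.random.default_rng()
    I, J, K = X.shape
    ii = rng.integers(0, I, n_samples)
    jj = rng.integers(0, J, n_samples)
    kk = rng.integers(0, K, n_samples)
    m = np.einsum('sr,sr,sr->s', A[ii], B[jj], C[kk])  # model values at samples
    dldm = 2.0 * (m - X[ii, jj, kk])                   # d loss / d m
    gA = dldm[:, None] * (B[jj] * C[kk])               # chain rule per factor
    gB = dldm[:, None] * (A[ii] * C[kk])
    gC = dldm[:, None] * (A[ii] * B[jj])
    np.add.at(A, ii, -lr * gA)  # unbuffered adds handle repeated sampled rows
    np.add.at(B, jj, -lr * gB)
    np.add.at(C, kk, -lr * gC)
    return A, B, C

# Fit a synthetic rank-2 tensor from a random start.
rng = np.random.default_rng(42)
I, J, K, R = 4, 5, 6, 2
At, Bt, Ct = (rng.random((n, R)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = (rng.random((n, R)) for n in (I, J, K))
err0 = ((X - np.einsum('ir,jr,kr->ijk', A, B, C)) ** 2).sum()
for _ in range(500):
    A, B, C = sgd_step(X, A, B, C, rng=rng)
err1 = ((X - np.einsum('ir,jr,kr->ijk', A, B, C)) ** 2).sum()
```

Each step touches only the factor rows indexed by the sampled entries, which is what makes the approach attractive for large, sparse tensors.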
SamBaTen: Sampling-based Batch Incremental Tensor Decomposition
SamBaTen, a Sampling-based Batch Incremental Tensor Decomposition algorithm, is introduced; it incrementally maintains the decomposition given new updates to the tensor dataset and achieves accuracy comparable to state-of-the-art incremental and non-incremental techniques while being 25-30 times faster.
On Tensors, Sparsity, and Nonnegative Factorizations
This paper proposes that the random variation is best described via a Poisson distribution, which better describes the zeros observed in the data as compared to the typical assumption of a Gaussian distribution, and presents a new algorithm for Poisson tensor factorization called CANDECOMP--PARAFAC alternating Poisson regression (CP-APR), based on a majorization-minimization approach.
Adaptive Algorithms to Track the PARAFAC Decomposition of a Third-Order Tensor
Two adaptive algorithms are proposed to update the PARAFAC decomposition at instant t+1, the new tensor being obtained from the old one by appending a new slice in the 'time' dimension.
Efficient MATLAB Computations with Sparse and Factored Tensors
This paper considers how specially structured tensors allow for efficient storage and computation, and proposes storing sparse tensors using coordinate format and describes the computational efficiency of this scheme for various mathematical operations, including those typical to tensor decomposition algorithms.