Corpus ID: 238259048

Multi-linear Tensor Autoregressive Models

@inproceedings{Li2021MultilinearTA,
  title={Multi-linear Tensor Autoregressive Models},
  author={Zebang Li and Han Xiao},
  year={2021}
}
Contemporary time series analysis has seen more and more tensor-type data from many fields. For example, stocks can be grouped according to Size, Book-to-Market ratio, and Operating Profitability, leading to a 3-way tensor observation each month. We propose an autoregressive model for tensor-valued time series, with autoregressive terms depending on multi-linear coefficient matrices. Compared with the traditional approach of vectorizing the tensor observations and then applying the… 
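
To make the idea concrete, the following is a minimal simulation sketch of a one-term multilinear (mode-product) autoregression of the kind the abstract describes. The single-term form, the 3×3×3 dimensions, and the scaling constants are illustrative assumptions, not the paper's exact specification.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions for a 3-way observation
# (e.g. Size x Book-to-Market x Profitability bins).
d1, d2, d3, T = 3, 3, 3, 200

# Mode-wise coefficient matrices, scaled down so the simulated process stays stable
# (the 0.5 factor is just a safe illustrative choice).
A1 = 0.5 * rng.standard_normal((d1, d1)) / np.sqrt(d1)
A2 = 0.5 * rng.standard_normal((d2, d2)) / np.sqrt(d2)
A3 = 0.5 * rng.standard_normal((d3, d3)) / np.sqrt(d3)

X = np.zeros((T, d1, d2, d3))
for t in range(1, T):
    E = 0.1 * rng.standard_normal((d1, d2, d3))   # i.i.d. tensor noise
    # Mode products: X_t = X_{t-1} x_1 A1 x_2 A2 x_3 A3 + E_t
    X[t] = np.einsum('ia,jb,kc,abc->ijk', A1, A2, A3, X[t - 1]) + E

# Parameter count versus an unrestricted VAR(1) on the vectorized series.
print("multilinear coefficients:", A1.size + A2.size + A3.size)    # 27
print("vectorized VAR(1) coefficients:", (d1 * d2 * d3) ** 2)      # 729

Even in this toy setting the three mode-wise matrices carry 27 coefficients versus 729 for a VAR(1) on the vectorized series, which illustrates the parsimony the multilinear formulation targets.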

An efficient tensor regression for high-dimensional data

TLDR
This paper revises a recently proposed tensor train (TT) decomposition and applies it to tensor regression, yielding a clean statistical interpretation that is particularly suited to data with factorial structures.
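
For reference, the standard tensor-train format writes a coefficient tensor entrywise as a product of small matrices (this is the generic TT format, assumed here; the paper's revised version may differ):

\mathcal{B}(i_1, i_2, \ldots, i_d) = G_1(i_1)\, G_2(i_2) \cdots G_d(i_d),

where each G_k(i_k) is an r_{k-1} \times r_k matrix (a "TT core") with r_0 = r_d = 1, so the number of free parameters is governed by the TT ranks r_k rather than by the full tensor dimension.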

Analysis of Tensor Time Series: tensorTS

TLDR
This paper introduces an R package, tensorTS, which implements the methodologies proposed in a series of recent papers, including functions for estimation, model selection, prediction, and visualization for both tensor autoregressive models and dynamic tensor factor models.

References


Learning High-Dimensional Generalized Linear Autoregressive Models

TLDR
This paper addresses inference of the autoregressive parameters and the associated network structure within a generalized linear model framework that includes Poisson and Bernoulli autoregressive processes; at the heart of the analysis is a sparsity-regularized maximum likelihood estimator.
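
As a hedged illustration of this framework (the Poisson case; the exact link function and penalty in the paper may differ):

X_{t+1,i} \mid X_t \;\sim\; \mathrm{Poisson}\!\left( \exp\!\Big( \nu_i + \sum_{j} A_{ij} X_{t,j} \Big) \right),
\qquad
\widehat{A} \;=\; \arg\min_{A} \; -\frac{1}{T}\sum_{t} \log p_A(X_{t+1} \mid X_t) \;+\; \lambda \|A\|_1,

where the \ell_1 penalty encodes the assumption that each series is influenced by only a few others, i.e. a sparse network.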

Autoregressive models for matrix-valued time series

Regularized estimation in sparse high-dimensional time series models

TLDR
This paper introduces a measure of stability for stationary processes based on their spectral properties, which provides insight into the effect of dependence on the accuracy of regularized estimates, and establishes useful deviation bounds for dependent data that can be used to study several important regularized estimators in a time series setting.
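
One common way to formalize such a spectral stability measure (a sketch of the general idea, with the notation assumed here):

f_X(\theta) \;=\; \frac{1}{2\pi} \sum_{\ell=-\infty}^{\infty} \Gamma_X(\ell)\, e^{-\mathrm{i}\ell\theta},
\qquad
\mathcal{M}(f_X) \;=\; \operatorname*{ess\,sup}_{\theta \in [-\pi,\pi]} \Lambda_{\max}\big(f_X(\theta)\big),

where \Gamma_X(\ell) is the lag-\ell autocovariance; a smaller \mathcal{M}(f_X) indicates weaker temporal and cross-sectional dependence and translates into sharper deviation bounds for the regularized estimates.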

Sparse Vector Autoregressive Modeling

TLDR
A two-stage approach is proposed for fitting sparse VAR (sVAR) models in which many of the AR coefficients are zero, based on an estimate of the partial spectral coherence (PSC) together with the BIC.
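
For context, partial spectral coherence is typically defined through the inverse spectral density (a standard definition, assumed here rather than quoted from the paper):

\mathrm{PSC}_{ij}(\omega) \;=\; -\,\frac{g_{ij}(\omega)}{\sqrt{g_{ii}(\omega)\, g_{jj}(\omega)}},
\qquad g(\omega) \;=\; f_X(\omega)^{-1};

series pairs whose PSC is negligible across frequencies are candidates for zero AR coefficients in the first stage, and BIC then selects among the resulting sparse models.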

High-Dimensional Posterior Consistency in Bayesian Vector Autoregressive Models

TLDR
A VAR model is considered with two prior choices for the autoregressive coefficient matrix: a nonhierarchical matrix-normal prior and a hierarchical prior corresponding to an arbitrary scale mixture of normals; posterior consistency is established for both priors under standard regularity assumptions.

Low Rank and Structured Modeling of High-Dimensional Vector Autoregressions

TLDR
This work introduces a novel approach for estimating low-rank and structured sparse high-dimensional VAR models using a regularized framework involving a combination of nuclear norm and lasso penalties, and establishes nonasymptotic probabilistic upper bounds on the estimation error rates of the low-rank and structured sparse components.
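
A hedged sketch of the kind of regularized program involved (the exact loss and tuning scheme in the paper may differ):

(\widehat{L}, \widehat{S}) \;=\; \arg\min_{L,\,S} \; \frac{1}{T}\sum_{t=2}^{T} \big\| x_t - (L + S)\, x_{t-1} \big\|_2^2 \;+\; \lambda \|L\|_{*} \;+\; \mu \|S\|_1,

where \|\cdot\|_{*} is the nuclear norm promoting a low-rank component L and \|\cdot\|_1 promotes entrywise sparsity in S.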

High Dimensional and Banded Vector Autoregressions

TLDR
A Bayesian information criterion for determining the width of the bands in the coefficient matrices is proposed and proved to be consistent, and consistent estimators for the autocovariance matrices are constructed.
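
Concretely, bandedness here means the transition matrices vanish away from the diagonal (a schematic of the constraint; the bandwidth k_0 is what the proposed BIC selects):

(A_j)_{i\ell} = 0 \quad \text{whenever } |i - \ell| > k_0, \qquad j = 1, \ldots, p.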

Factor Models for High-Dimensional Tensor Time Series

TLDR
This article presents a factor model approach to the analysis of high-dimensional dynamic tensor time series and multi-category dynamic transport networks, along with two estimation procedures, their theoretical properties, and simulation results.
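
A commonly used form of such a tensor factor model, written here as a hedged sketch for a 3-way observation:

\mathcal{X}_t \;=\; \mathcal{F}_t \times_1 A_1 \times_2 A_2 \times_3 A_3 \;+\; \mathcal{E}_t,

where \mathcal{F}_t is a low-dimensional latent core factor tensor, the A_k are mode-wise loading matrices, and \mathcal{E}_t is idiosyncratic noise.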

Estimation of a Multiplicative Correlation Structure in the Large Dimensional Case

We propose a Kronecker product structure for large covariance or correlation matrices. One feature of this model is that it scales logarithmically with dimension, in the sense that the number of free parameters grows logarithmically with the dimension of the matrix.
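
A brief sketch of the structure and the parameter-count logic (the log-scaling argument paraphrased from the abstract):

\Sigma \;=\; \Sigma_1 \otimes \Sigma_2 \otimes \cdots \otimes \Sigma_v, \qquad \Sigma_j \in \mathbb{R}^{d_j \times d_j}, \quad \prod_{j=1}^{v} d_j = n;

the number of free parameters is \sum_j d_j(d_j+1)/2 instead of n(n+1)/2, so with, for example, d_j = 2 for all j one has v = \log_2 n factors and only O(\log n) parameters.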

Vector linear time series models

This paper presents proofs of the strong law of large numbers and the central limit theorem for estimators of the parameters in quite general finite-parameter linear models for vector time series.