# Variable selection in sparse GLARMA models

@article{Gomtsyan2020VariableSI, title={Variable selection in sparse GLARMA models}, author={Marina Gomtsyan and C{\'e}line L{\'e}vy-Leduc and Sarah Ouadah and Laure Sansonnet and Thomas Blein}, journal={Statistics}, year={2020}, volume={56}, pages={755--784} }

In this paper, we propose a novel and efficient two-stage variable selection approach for sparse GLARMA models, which are widely used for modelling discrete-valued time series. Our approach consists of iteratively combining the estimation of the autoregressive moving average (ARMA) coefficients of GLARMA models with regularized methods designed for performing variable selection on the regression coefficients of generalized linear models (GLMs). We first establish the consistency of the ARMA part…
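The regularized-GLM stage of such a two-stage scheme can be illustrated by an ℓ1-penalised Poisson regression fitted with proximal gradient descent (ISTA). This is a minimal sketch, not the authors' algorithm: the function name, step size, penalty level, and simulated data are all illustrative assumptions, and the ARMA-estimation step the paper alternates with is omitted.

```python
import numpy as np

def poisson_lasso(X, y, lam, lr=0.1, n_iter=3000):
    """l1-penalised Poisson regression via proximal gradient (ISTA).

    Minimises (1/n) * sum_t [exp(x_t'b) - y_t * x_t'b] + lam * ||b||_1.
    A toy stand-in for the variable-selection step of a two-stage scheme.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        mu = np.exp(X @ beta)              # conditional mean of the Poisson GLM
        grad = X.T @ (mu - y) / n          # gradient of the negative log-likelihood
        z = beta - lr * grad               # gradient step
        # Soft-thresholding = proximal operator of the l1 penalty
        beta = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)
    return beta

# Illustrative use: a sparse truth with two active covariates out of ten.
rng = np.random.default_rng(0)
X = rng.normal(scale=0.5, size=(200, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[1] = 0.8, -0.8
y = rng.poisson(np.exp(X @ beta_true))
beta_hat = poisson_lasso(X, y, lam=0.08)
```

On such data the estimate typically keeps the two active coordinates and shrinks the eight null ones to (or near) zero, which is the recovery behaviour the paper targets.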

## One Citation

### Variable selection in sparse multivariate GLARMA models: Application to germination control by environment

- Computer Science
- 2022

This work proposes a novel and efficient iterative two-stage variable selection approach for multivariate sparse GLARMA models, which can be used for modelling multivariate discrete-valued time series, and shows that it outperforms the other methods for recovering the null and non-null coefficients.

## References

Showing 1-10 of 32 references.

### Markov regression models for time series: a quasi-likelihood approach.

- Mathematics, Biometrics
- 1988

A quasi-likelihood (QL) approach to regression analysis with time series data is discussed. Analogous to QL for independent observations, the large-sample properties of the regression coefficient estimates depend only on correct specification of the first conditional moment.

### Statistical Learning with Sparsity: The Lasso and Generalizations

- Computer Science
- 2015

Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data and extract useful and reproducible patterns from big datasets.

### Stability selection

- Computer Science
- 2010

It is proved for the randomized lasso that stability selection will be variable selection consistent even if the necessary conditions for consistency of the original lasso method are violated.
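The idea behind stability selection can be sketched in a few lines: refit a lasso on many random subsamples and keep only the variables selected in a large fraction of fits. The sketch below uses a plain ISTA lasso solver and illustrative defaults (subsample fraction, number of subsamples, frequency threshold); it is not the randomized-lasso variant the cited result is proved for.

```python
import numpy as np

def lasso_ista(X, y, lam, lr=0.1, n_iter=300):
    """Plain l1-penalised least squares via proximal gradient (ISTA)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / len(y)
        z = beta - lr * grad
        beta = np.sign(z) * np.maximum(np.abs(z) - lr * lam, 0.0)
    return beta

def stability_selection(X, y, lam, n_sub=50, frac=0.5, thresh=0.6, seed=0):
    """Select variables whose lasso selection frequency over random
    subsamples exceeds `thresh` (all tuning values here are illustrative)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_sub):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        counts += lasso_ista(X[idx], y[idx], lam) != 0   # count nonzeros
    freq = counts / n_sub
    return freq >= thresh, freq

# Illustrative use: two active variables out of eight.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
beta_true = np.zeros(8)
beta_true[:2] = 1.0
y = X @ beta_true + 0.3 * rng.normal(size=100)
selected, freq = stability_selection(X, y, lam=0.15)
```

Aggregating over subsamples makes the selected set far less sensitive to the choice of `lam` than a single lasso fit.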

### Efficient order selection algorithms for integer‐valued ARMA processes

- Mathematics, Computer Science
- 2009

A very efficient reversible jump Markov chain Monte Carlo (RJMCMC) algorithm is constructed for moving between INARMA processes of different orders and an alternative in the form of the EM algorithm is given for determining the order of an integer‐valued autoregressive (INAR) process.

### MCMC for Integer‐Valued ARMA processes

- Mathematics, Computer Science
- 2007

An efficient Markov chain Monte Carlo algorithm for a wide class of integer‐valued autoregressive moving‐average (INARMA) processes is outlined and its inferential and predictive capabilities are assessed.

### Poisson Autoregression

- Mathematics
- 2008

In this article we consider geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregression. In the linear case, the conditional mean is linked linearly to its…
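The linear case described here can be simulated directly: the conditional mean follows the INGARCH(1,1)-type recursion λ_t = d + a·λ_{t−1} + b·Y_{t−1}, which is geometrically ergodic when a + b < 1 and has stationary mean d/(1 − a − b). The sketch below is a simulation with assumed parameter values, not code from the cited article.

```python
import numpy as np

def simulate_poisson_ar(n, d=0.5, a=0.3, b=0.4, seed=0):
    """Simulate a linear Poisson autoregression (INGARCH(1,1)):
        Y_t | past ~ Poisson(lam_t),  lam_t = d + a*lam_{t-1} + b*Y_{t-1}.
    Stationarity in the linear case requires a + b < 1."""
    rng = np.random.default_rng(seed)
    y = np.empty(n, dtype=int)
    lam = d / (1.0 - a - b)          # start at the stationary mean
    y_prev = rng.poisson(lam)
    for t in range(n):
        lam = d + a * lam + b * y_prev   # conditional-mean recursion
        y[t] = rng.poisson(lam)
        y_prev = y[t]
    return y

# With d=0.5, a=0.3, b=0.4 the stationary mean is 0.5 / (1 - 0.7) = 5/3.
y = simulate_poisson_ar(20000)
```

A long simulated path should have a sample mean close to the stationary mean d/(1 − a − b).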

### The glarma Package for Observation-Driven Time Series Regression of Counts

- Mathematics
- 2015

We review the theory and application of generalized linear autoregressive moving average observation-driven models for time series of counts with explanatory variables and describe the estimation of…
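The observation-driven recursion underlying such models can be sketched for the simplest Poisson case with one MA term and Pearson residuals: the state W_t combines the regression part with scaled past residuals, and the count is drawn from a Poisson with mean exp(W_t). This is a minimal simulation under assumed parameters, not the glarma package's estimation code.

```python
import numpy as np

def simulate_glarma_ma1(n, beta0=1.0, theta=0.3, seed=0):
    """Simulate a Poisson GLARMA model with a single MA term:
        W_t  = beta0 + theta * e_{t-1},
        mu_t = exp(W_t),   Y_t ~ Poisson(mu_t),
        e_t  = (Y_t - mu_t) / sqrt(mu_t)   (Pearson residual).
    beta0 stands in for the regression part x_t' beta."""
    rng = np.random.default_rng(seed)
    y = np.empty(n, dtype=int)
    e_prev = 0.0
    for t in range(n):
        mu = np.exp(beta0 + theta * e_prev)   # observation-driven mean
        y[t] = rng.poisson(mu)
        e_prev = (y[t] - mu) / np.sqrt(mu)    # feed residual back into W_{t+1}
    return y

y = simulate_glarma_ma1(5000)
```

Because the residuals are standardized before being fed back, the recursion stays stable and the simulated counts fluctuate around exp(beta0).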

### SOME SIMPLE MODELS FOR DISCRETE VARIATE TIME SERIES

- Mathematics
- 1985

ABSTRACT: Simple models are presented for use in the modeling and generation of sequences of dependent discrete random variables. The models are essentially Markov Chains, but are structurally…

### Regularization Paths for Generalized Linear Models via Coordinate Descent.

- Computer Science, Journal of Statistical Software
- 2010

In comparative timings, the new algorithms are considerably faster than competing methods and can handle large problems and can also deal efficiently with sparse features.
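The cyclic coordinate descent update at the heart of this approach reduces, for the lasso, to a one-dimensional soft-thresholding step per coordinate, with the residual kept in sync. The sketch below shows that update for squared-error loss only; it is a toy version, not the glmnet implementation (which also handles GLM losses, warm starts, and active sets).

```python
import numpy as np

def soft(z, g):
    """Scalar soft-thresholding operator."""
    return np.sign(z) * max(abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                         # running residual y - X @ beta
    col_ss = (X ** 2).sum(axis=0) / n    # per-column curvature X_j'X_j / n
    for _ in range(n_sweeps):
        for j in range(p):
            # Correlation of column j with the partial residual (b_j removed)
            rho = X[:, j] @ r / n + col_ss[j] * beta[j]
            new = soft(rho, lam) / col_ss[j]
            r += X[:, j] * (beta[j] - new)   # keep residual consistent
            beta[j] = new
    return beta
```

On an orthonormal design (X'X/n = I) each coordinate update is exact, so the solver reproduces the closed-form lasso solution — coefficient-wise soft-thresholding of X'y/n — in a single sweep.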