Uniform moment bounds of Fisher's information with applications to time series

@article{Chan2012UniformMB,
  title={Uniform moment bounds of Fisher's information with applications to time series},
  author={Ngai Hang Chan and Ching-Kang Ing},
  journal={arXiv: Statistics Theory},
  year={2012}
}
  • N. Chan, C. Ing
  • Published 21 November 2012
  • Mathematics, Computer Science
  • arXiv: Statistics Theory
In this paper, a uniform (over some parameter space) moment bound for the inverse of Fisher's information matrix is established. This result is then applied to develop moment bounds for the normalized least squares estimate in (nonlinear) stochastic regression models. The usefulness of these results is illustrated using time series models. In particular, an asymptotic expression for the mean squared prediction error of the least squares predictor in autoregressive moving average models is… 
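The mean squared prediction error (MSPE) result described in the abstract can be illustrated numerically. Below is a minimal Monte Carlo sketch, not the paper's derivation: it assumes a Gaussian AR(1) model, estimates the one-step MSPE of the least squares (plug-in) predictor by simulation, and compares it with the commonly cited second-order approximation σ²(1 + p/n). The helpers `simulate_ar1` and `ls_one_step_mspe` and all parameter values are illustrative choices.

```python
import numpy as np

def simulate_ar1(n, phi, sigma=1.0, rng=None):
    """Simulate n observations from an AR(1): y_t = phi*y_{t-1} + e_t (zero start)."""
    rng = np.random.default_rng(rng)
    y = np.zeros(n)
    e = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + e[t]
    return y

def ls_one_step_mspe(n=50, phi=0.5, sigma=1.0, reps=20000, seed=0):
    """Monte Carlo estimate of the one-step mean squared prediction error
    of the least squares (plug-in) predictor in an AR(1) working model."""
    rng = np.random.default_rng(seed)
    errs = np.empty(reps)
    for r in range(reps):
        y = simulate_ar1(n + 1, phi, sigma, rng)
        x, z = y[:n - 1], y[1:n]           # regress y_t on y_{t-1} within the sample
        phi_hat = (x @ z) / (x @ x)        # least squares estimate
        y_pred = phi_hat * y[n - 1]        # one-step-ahead forecast
        errs[r] = (y[n] - y_pred) ** 2
    return errs.mean()

# Classical second-order approximation for an AR(p) fitted by least squares:
# MSPE ~ sigma^2 * (1 + p/n); here p = 1, n = 50.  Resolving the O(1/n) term
# against Monte Carlo noise requires many replications.
print(ls_one_step_mspe(), 1.0 * (1 + 1 / 50))
```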
Negative Moment Bounds for Stochastic Regression Models with Deterministic Trends and Their Applications to Prediction Problems
The authors' asymptotic expression not only helps better understand how the MSPE is affected by the deterministic and random components, but also inspires an intriguing proof of the formula for the sum of elements in the inverse of the Cauchy/Hilbert matrix from a prediction perspective.
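The Cauchy/Hilbert identity mentioned in this summary, namely that the entries of the inverse of the n×n Hilbert matrix sum to n², is easy to check numerically. The short sketch below is an independent floating-point verification for small n, not the paper's prediction-based proof; the helper `hilbert` is an illustrative definition.

```python
import numpy as np

def hilbert(n):
    """n x n Hilbert matrix H[i, j] = 1 / (i + j + 1) with 0-based indices."""
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)

# Classical identity: the entries of the inverse Hilbert matrix sum to n^2.
# (Small n keeps the ill-conditioned inversion numerically harmless.)
for n in range(2, 8):
    total = np.linalg.inv(hilbert(n)).sum()
    print(n, round(total, 4), n ** 2)
```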
A note on mean squared prediction error under the unit root model with deterministic trend
Assume that observations are generated from the first‐order autoregressive (AR) model with linear time trend and the unknown model coefficients are estimated by least squares. This article develops
Moment bounds and mean squared prediction errors of long-memory time series
A moment bound for the normalized conditional-sum-of-squares (CSS) estimate of a general autoregressive fractionally integrated moving average (ARFIMA) model with an arbitrary unknown memory
Moment convergence of regularized least-squares estimator for linear regression model
In this paper, we study the uniform tail-probability estimates of a regularized least-squares estimator for the linear regression model. We make use of the polynomial type large deviation inequality
Nearly Unstable Processes: A Prediction Perspective
Prediction has long been a vibrant topic in modern probability and statistics. In addition to finding optimal forecast and model selection, it is argued in this paper that the prediction principle
Mixed domain asymptotics for a stochastic process model with time trend and measurement error
We consider a stochastic process model with time trend and measurement error. We establish consistency and derive the limiting distributions of the maximum likelihood (ML) estimators of the
Moment convergence in regularized estimation under multiple and mixed-rates asymptotics
In M-estimation under standard asymptotics, the weak convergence combined with the polynomial type large deviation estimate of the associated statistical random field Yoshida (2011) provides us with
Gaussian quasi-information criteria for ergodic Lévy driven SDE
We consider relative model comparison for the parametric coefficients of a semiparametric ergodic Lévy driven model observed at high-frequency. Our asymptotics is based on the fully explicit
...
...

References

SHOWING 1-10 OF 23 REFERENCES
Time series regression with long-range dependence
A central limit theorem is established for time series regression estimates which include generalized least squares, in the presence of long-range dependence in both errors and stochastic regressors.
Properties of Predictors for Autoregressive Time Series
Abstract The prediction of the (n + s)th observation of the pth order autoregressive process is investigated. The mean square of the predictor error through terms of order n^{-1}, conditional on Y_n, Y
Properties of Predictors in Misspecified Autoregressive Time Series Models
Abstract This article investigates major effects of misspecification in stationary linear time series models when we fit a pth-order autoregressive model. The true model can be an autoregressive
Asymptotic Properties of Nonlinear Least Squares Estimates in Stochastic Regression Models
Stochastic regression models of the form y_i = f_i(θ) + ε_i, where the random disturbances ε_i form a martingale difference sequence with respect to an increasing sequence of σ-fields {G_i} and f_i
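To make this model class concrete, here is a minimal simulation sketch of nonlinear least squares in a stochastic regression setting. The regression function f_i(θ) = exp(θ·x_i), the AR(1) regressor, and the grid-search minimizer are all illustrative assumptions of my own, with i.i.d. Gaussian errors as a special case of a martingale difference sequence.

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta_true = 500, 0.7

# Stochastic regressor: an AR(1) process driving the regression function.
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()

# Nonlinear stochastic regression y_i = f_i(theta) + e_i with f_i(theta) = exp(theta * x_i)
# and i.i.d. Gaussian errors (a special case of a martingale difference sequence).
y = np.exp(theta_true * x) + rng.normal(size=n)

def rss(theta):
    """Residual sum of squares at the candidate parameter value."""
    return np.sum((y - np.exp(theta * x)) ** 2)

# Nonlinear least squares via a simple grid search (adequate for a 1-D sketch).
grid = np.linspace(0.0, 1.5, 3001)
theta_hat = grid[np.argmin([rss(t) for t in grid])]
print(theta_hat)  # should land near theta_true for moderate n
```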
Predictions of multivariate autoregressive-moving average models
SUMMARY A simple formula for multiperiod predictions of multivariate autoregressive-moving average models is derived. The formula is explicitly given as a function of suitably defined parameter
Order selection for same-realization predictions in autoregressive processes
This paper presents the first theoretical verification that AIC and its variants are still asymptotically efficient (in the sense defined in Section 4) for same-realization predictions, and shows that AIC also yields a satisfactory same-realization prediction in finite samples.
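As a concrete illustration of AIC-based order selection for autoregressions, the sketch below fits AR(k) models by least squares and minimizes AIC in its residual-variance form, n·log(σ̂²) + 2k. The helpers `fit_ar_ls` and `aic_order`, the AR(2) example, and the sample size are illustrative choices, not the paper's procedure.

```python
import numpy as np

def fit_ar_ls(y, k):
    """Least squares fit of an AR(k) model; returns coefficients and residual variance."""
    n = len(y)
    X = np.column_stack([y[k - j - 1:n - j - 1] for j in range(k)])  # lags 1..k
    z = y[k:]
    coef, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ coef
    return coef, resid @ resid / len(z)

def aic_order(y, max_order=10):
    """Select the AR order minimizing AIC = n*log(sigma_hat^2) + 2k."""
    n = len(y)
    aics = []
    for k in range(1, max_order + 1):
        _, s2 = fit_ar_ls(y, k)
        aics.append(n * np.log(s2) + 2 * k)
    return int(np.argmin(aics)) + 1

# Example: data from a stationary AR(2); AIC should usually pick an order near 2.
rng = np.random.default_rng(0)
y = np.zeros(400)
for t in range(2, 400):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()
print(aic_order(y))
```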
AIC, Overfitting Principles, and the Boundedness of Moments of Inverse Matrices for Vector Autoregressions and Related Models
In his somewhat informal derivation, Akaike (in "Proceedings of the 2nd International Symposium on Information Theory" (B. N. Petrov and F. Csáki, Eds.), pp. 610-624, Akadémiai Kiadó, Budapest, 1973)
MULTISTEP PREDICTION IN AUTOREGRESSIVE PROCESSES
  • C. Ing
  • Engineering
    Econometric Theory
  • 2003
In this paper, two competing types of multistep predictors, i.e., plug-in and direct predictors, are considered in autoregressive (AR) processes. When a working model AR(k) is used for the h-step
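The distinction between the two predictor types can be sketched for the simplest working model, AR(1): the plug-in predictor iterates the fitted one-step coefficient h times, while the direct predictor regresses y_{t+h} on y_t and applies that coefficient once. The code below is my own minimal construction under these assumptions, not the paper's analysis.

```python
import numpy as np

def two_multistep_predictors(y, h):
    """h-step forecasts beyond the sample from an AR(1) working model:
    plug-in iterates the fitted one-step coefficient; direct regresses
    y_{t+h} on y_t and applies that coefficient once."""
    x1, z1 = y[:-1], y[1:]
    phi_hat = (x1 @ z1) / (x1 @ x1)      # one-step least squares fit
    plug_in = (phi_hat ** h) * y[-1]

    xh, zh = y[:-h], y[h:]
    beta_hat = (xh @ zh) / (xh @ xh)     # direct h-step least squares fit
    direct = beta_hat * y[-1]
    return plug_in, direct

# Example on simulated AR(1) data (phi = 0.8), forecasting h = 3 steps ahead.
rng = np.random.default_rng(2)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.8 * y[t - 1] + rng.normal()
print(two_multistep_predictors(y, h=3))
```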
Strong consistency in nonlinear stochastic regression models
Another set of sufficient conditions for consistency is presented, which avoid the use of partial derivatives and are closer in spirit to the conditions presented by Wu for non-stochastic regression models with independent errors.
CONSISTENCY IN NONLINEAR ECONOMETRIC MODELS: A GENERIC UNIFORM LAW OF LARGE NUMBERS
A basic tool of modern econometrics is a uniform law of large numbers (LLN). It is a primary ingredient used in proving consistency and asymptotic normality of parametric and nonparametric estimators
...
...