A new look at the statistical model identification

@article{Akaike1974ANL,
  title={A new look at the statistical model identification},
  author={H. Akaike},
  journal={IEEE Transactions on Automatic Control},
  year={1974},
  volume={19},
  pages={716-723}
}
  • H. Akaike
  • Published 1974
  • Mathematics
  • IEEE Transactions on Automatic Control
The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly and it is pointed out that the hypothesis testing procedure is not adequately defined as the procedure for statistical model identification. The classical maximum likelihood estimation procedure is reviewed and a new estimate minimum information theoretical criterion (AIC) estimate (MAICE) which is designed for the purpose of statistical identification is introduced. When there are…
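For reference, the criterion the abstract introduces is defined in the paper as

  AIC = (-2) log(maximum likelihood) + 2 (number of independently adjusted parameters within the model),

and the MAICE is the model, together with its maximum likelihood parameter estimates, that attains the minimum AIC among the competing models.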

Citations

Automatic transaction of signal via statistical modeling
A simple explanation of statistical modeling based on the AIC is given and four examples of applying the minimum AIC procedure to an automatic transaction of signals observed in the earth sciences are demonstrated.
Likelihood and its use in Parameter Estimation and Model Comparison
Parameter estimation and model fitting underlie many statistical procedures. Whether the objective is to examine central tendency or the slope of a regression line, an estimation method must be used.Expand
MODERN DEVELOPMENT OF STATISTICAL METHODS
This chapter discusses the use of the minimum Akaike information criterion estimation (MAICE) procedure and its conceptual generalization, the entropy maximization principle, in…
New Criteria for Selection in Simultaneous Equations Model
When the errors of statistical models are not independent, such as in the presence of autocorrelation (AR) and/or moving average (MA) problems, the values of the standard model selection…
Information theory measures with application to model identification
Diverse applications of the AIC investigated in the paper include time series modeling, parametric inverse problems, and spectral analysis, and the criterion is found to be both versatile and robust.
Bayesian information criterion for longitudinal and clustered data.
This paper develops a method for calculating the 'effective sample size' for mixed models based on Fisher's information, which replaces the sample size in BIC and can vary from the number of subjects to the number of observations.
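As background for the entry above (the formula itself is the standard one, not taken from that paper): the Bayesian information criterion is

  BIC = (-2) log(maximum likelihood) + k log(n),

where k is the number of estimated parameters and n the sample size, so substituting an effective sample size for n, somewhere between the number of subjects and the number of observations as the entry describes, changes only the log(n) penalty factor.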
Empirical likelihood based variable selection
Information criteria form an important class of model/variable selection methods in statistical analysis. Parametric likelihood is a crucial part of these methods. In some applications such as the…
On the Likelihood of a Time Series Model
By requiring the log likelihood of a model to be an unbiased estimate of the expected log likelihood of the model, a reasonable definition of the likelihood is obtained, and this allows us to develop a systematic approach to parametric time series modelling.
Canonical Correlation Analysis of Time Series and the Use of an Information Criterion
This chapter starts with a brief introductory review of some of the recent developments in time-series analysis. One of the most established procedures of time-series analysis is…

References

Showing 1-10 of 54 references.
The statistical approach to the analysis of time-series
  • M. Bartlett
  • Mathematics, Computer Science
  • Trans. IRE Prof. Group Inf. Theory
  • 1953
The relation is examined between the information (entropy) concept used in communication theory, associated with specification, and Fisher's information concept used in statistics, associated with inference.
Some tests of separate families of hypotheses in time series analysis.
This paper deals with the application, to various problems in time series analysis, of the maximum-likelihood ratio procedure proposed by Cox for situations in which there are rival hypotheses…
The behavior of maximum likelihood estimates under nonstandard conditions
This paper proves consistency and asymptotic normality of maximum likelihood (ML) estimators under weaker conditions than usual. In particular, (i) it is not assumed that the true distribution…
On-line identification of linear dynamic systems with applications to Kalman filtering
Kalman gave a set of recursive equations for estimating the state of a linear dynamic system. However, the Kalman filter requires knowledge of all the system and noise parameters. Here it is…
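For context, a minimal sketch of the kind of recursive state estimate this reference discusses, assuming the simplest scalar random-walk model x_t = x_{t-1} + w_t, y_t = x_t + v_t with known noise variances q and r (all names and numbers below are illustrative, not taken from the reference):

def kalman_filter(ys, q=1e-2, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed in noise."""
    x, p = x0, p0                      # current state estimate and its variance
    estimates = []
    for y in ys:
        x_pred, p_pred = x, p + q      # predict: state unchanged, variance grows by q
        k = p_pred / (p_pred + r)      # Kalman gain
        x = x_pred + k * (y - x_pred)  # update with the new observation
        p = (1.0 - k) * p_pred
        estimates.append(x)
    return estimates

print(kalman_filter([1.1, 0.9, 1.3, 0.8]))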
Fitting autoregressive models for prediction
This is a preliminary report on a newly developed simple and practical procedure for the statistical identification of predictors using autoregressive models. The use of the autoregressive representation…
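The reference above selects predictors with the final prediction error (FPE) criterion; as a loosely related illustration of the minimum-AIC idea from the main paper, here is a sketch that fits autoregressive models of several candidate orders by least squares and keeps the order with the smallest Gaussian-likelihood AIC. Everything below is illustrative: the conditional least-squares fit, the simulated AR(2) series, and the use of n*log(sigma2_hat) as a stand-in for -2 log likelihood.

import numpy as np

def fit_ar_least_squares(x, p):
    """Fit an AR(p) model by conditional least squares; return the residual variance."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])  # lagged regressors
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return float(np.mean(resid ** 2))

def select_order_by_aic(x, max_order=10):
    """Pick the AR order minimising AIC ~ n*log(sigma2_hat) + 2*p (constants dropped)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    aics = {}
    for p in range(1, max_order + 1):
        sigma2 = fit_ar_least_squares(x, p)
        aics[p] = n * np.log(sigma2) + 2 * p
    return min(aics, key=aics.get), aics

# Tiny demo on a simulated AR(2) series.
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

best_p, _ = select_order_by_aic(x)
print("order chosen by minimum AIC:", best_p)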
Comparison of different methods for identification of industrial processes
Plants have been modelled using different identification methods. Some results from the identification of the dynamics of a nuclear reactor, a distillation column, a superheater and a paper machine…
On the identification of variances and adaptive Kalman filtering
A Kalman filter requires exact knowledge of the process noise covariance matrix Q and the measurement noise covariance matrix R. Here we consider the case in which the true values of Q and R are…
A Monte Carlo Comparison of the Regression Method and the Spectral Methods of Prediction
We consider the question of estimating the linear, least-squares predictor of the future values of a real-valued, discrete, purely nondeterministic, stationary time series from its known…
The prediction error of stationary Gaussian time series of unknown covariance
The asymptotic form of the mean square prediction error is found for a stationary Gaussian time series when the prediction is a linear weighting of the immediate past, the weights being "learned" from the data.
Autoregressive model fitting for control
The use of a multidimensional extension of the minimum final prediction error (FPE) criterion, which was originally developed for deciding the order of a one-dimensional autoregressive process…