# The Akaike information criterion: Background, derivation, properties, application, interpretation, and refinements

@article{Cavanaugh2019TheAI, title={The Akaike information criterion: Background, derivation, properties, application, interpretation, and refinements}, author={Joseph E. Cavanaugh and Andrew A. Neath}, journal={Wiley Interdisciplinary Reviews: Computational Statistics}, year={2019}, volume={11} }

The Akaike information criterion (AIC) is one of the most ubiquitous tools in statistical modeling. The first model selection criterion to gain widespread acceptance, AIC was introduced in 1973 by Hirotugu Akaike as an extension to the maximum likelihood principle. Maximum likelihood is conventionally applied to estimate the parameters of a model once the structure and dimension of the model have been formulated. Akaike's seminal idea was to combine into a single procedure the process of…
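The criterion the paper surveys is computed as AIC = 2k − 2 log L̂, where k is the number of estimated parameters and L̂ the maximized likelihood; the model with the smaller AIC is preferred. A minimal sketch in Python, using a toy Gaussian comparison (not an example from the paper):

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike information criterion: AIC = 2k - 2*log(L-hat)."""
    return 2 * k - 2 * log_likelihood

def gaussian_loglik(x, mu, sigma):
    """Log-likelihood of data x under N(mu, sigma^2)."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu)**2 / (2 * sigma**2))

# Toy data: true mean 1.0, so the fixed-mean model below is misspecified.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=200)

# Model 1: mean fixed at 0; only sigma is estimated (k = 1).
sigma1 = np.sqrt(np.mean(x**2))
aic1 = aic(gaussian_loglik(x, 0.0, sigma1), k=1)

# Model 2: mean and standard deviation both estimated by ML (k = 2).
mu2, sigma2 = np.mean(x), np.std(x)
aic2 = aic(gaussian_loglik(x, mu2, sigma2), k=2)

# The extra parameter costs 2 in penalty but buys a much better fit here,
# so the smaller-AIC model is the two-parameter one.
print(aic1, aic2)
```

The penalty term 2k is what distinguishes AIC from a raw likelihood comparison: it charges each model for its dimension, which is exactly the "single procedure" idea the abstract describes.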

## 124 Citations

### LINEAR, GENERALIZED, HIERARCHICAL, BAYESIAN AND RANDOM REGRESSION MIXED MODELS IN GENETICS/GENOMICS IN PLANT BREEDING

- Mathematics
- 2020

This paper presents the state of the art of the statistical modelling as applied to plant breeding. Classes of inference, statistical models, estimation methods and model selection are emphasized in…

### On using predictive-ability tests in the selection of time-series prediction models: A Monte Carlo evaluation

- Computer Science
- 2018

### Navigating the Statistical Minefield of Model Selection and Clustering in Neuroscience

- Computer Science
- eNeuro
- 2022

This work discusses model selection and clustering techniques from a statistician's point of view, revealing the assumptions behind, and the logic that governs, the various approaches.

### Models for autoregressive processes of bounded counts: How different are they?

- Mathematics
- Comput. Stat.
- 2020

The most popular approaches used for model identification, Akaike's information criterion and the Bayesian information criterion, are considered, and the properties of the fitted models obtained using maximum likelihood estimation are investigated.

### Impact of Algorithm Selection on Modeling Ozone Pollution: A Perspective on Box and Tiao (1975)

- Computer Science
- Forests
- 2020

This study predicts ozone concentration in Los Angeles with an ARIMA model and an autoregressive process, and shows that time series analysis should consider not only the model specification but also the model estimation, to ensure valid results.

### Treatment response prediction: Is model selection unreliable?

- Psychology
- bioRxiv
- 2022

Quantitative modelling has become an essential part of the drug development pipeline. In particular, pharmacokinetic and pharmacodynamic models are used to predict treatment responses in order to…

### Exponentiated Weibull Models Applied to Medical Data in Presence of Right-censoring, Cure Fraction and Covariates

- Mathematics
- Statistics, Optimization & Information Computing
- 2021

Cure fraction models have been widely used to analyze survival data in which a proportion of the individuals is not susceptible to the event of interest. This article considers frequentist and…

### Multivariate autoregressive model estimation for high-dimensional intracranial electrophysiological data

- Computer Science
- NeuroImage
- 2022

### Simple Models in Complex Worlds: Occam's Razor and Statistical Learning Theory

- Computer Science
- Minds Mach.
- 2022

It is shown that situations exist for which a preference for simpler models (as modeled through the addition of a regularization term in the learning problem) provably slows down, instead of favoring, the supervised learning process.

### Development of Effective Artificial Neural Network Model using Sequential Sensitivity Analysis and Randomized Training

- Computer Science
- 2021

The paper presents an effective model that implements sequential sensitivity analysis and randomized training in an artificial neural network (ANN) for high-dimensional thermal power plant data, and suggests the significance of randomized training combined with comparison-based qualitative reasoning.

## References

Showing 1–10 of 47 references

### Information Criteria and Statistical Modeling

- Computer Science
- 2007

A generalized information criterion (GIC) and a bootstrap information criterion are presented, which provide unified tools for modeling and model evaluation for a diverse range of models, including various types of nonlinear models and model estimation procedures such as robust estimation, the maximum penalized likelihood method and a Bayesian approach.

### An improved Akaike information criterion for state-space model selection

- Computer Science
- Comput. Stat. Data Anal.
- 2006

### Model selection and Akaike's Information Criterion (AIC): The general theory and its analytical extensions

- Mathematics
- 1987

During the last fifteen years, Akaike's entropy-based Information Criterion (AIC) has had a fundamental impact in statistical model evaluation problems. This paper studies the general theory of the…

### Bootstrapping Log Likelihood and EIC, an Extension of AIC

- Computer Science
- 1997

A new information criterion, EIC, is proposed, which is constructed by employing the bootstrap method to simulate the data fluctuation, and is regarded as an extension of AIC.
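The bootstrap idea behind EIC can be sketched simply: instead of AIC's fixed penalty 2k, estimate the optimism of the maximized log-likelihood by refitting on bootstrap resamples and comparing each resample's in-sample fit to its fit on the original data. The following is a simplified illustration with a Gaussian model, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=50)
n = len(x)

def fit_and_loglik(sample, eval_data):
    """ML-fit a Gaussian to `sample`, evaluate its log-likelihood on `eval_data`."""
    mu, sigma = np.mean(sample), np.std(sample)
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (eval_data - mu)**2 / (2 * sigma**2))

# Bootstrap estimate of the optimism (bias) of the maximized log-likelihood:
# fit on each resample, then compare its fit on itself vs. on the original data.
B = 200
bias = np.mean([
    fit_and_loglik(xb, xb) - fit_and_loglik(xb, x)
    for xb in (rng.choice(x, size=n, replace=True) for _ in range(B))
])

# EIC-style criterion: -2*log(L-hat) plus twice the bootstrap bias estimate.
# AIC replaces `bias` by the parameter count (here k = 2).
eic = -2 * fit_and_loglik(x, x) + 2 * bias
```

For a well-behaved model the bootstrap bias estimate lands near the parameter count, which is why EIC is viewed as an extension of AIC that remains usable when the 2k penalty's asymptotic justification is doubtful.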

### A new look at the statistical model identification

- Mathematics
- 1974

The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly and it is pointed out that the hypothesis testing procedure is not adequately defined as…

### Model selection for extended quasi-likelihood models in small samples.

- Mathematics
- Biometrics
- 1995

A small sample criterion (AICc) for the selection of extended quasi-likelihood models provides a more nearly unbiased estimator for the expected Kullback-Leibler information and often selects better models than AIC in small samples.
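The small-sample correction referenced here has a closed form: AICc = AIC + 2k(k+1)/(n − k − 1), where n is the sample size. A minimal sketch showing how the extra penalty vanishes as n grows (toy numbers, not from the paper):

```python
def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC: AICc = AIC + 2k(k+1)/(n - k - 1)."""
    aic = 2 * k - 2 * log_likelihood
    return aic + 2 * k * (k + 1) / (n - k - 1)

# With log-likelihood held at 0, the values below are pure penalty terms.
# Small n: the correction is substantial relative to AIC's penalty of 2k = 6.
small_n = aicc(0.0, k=3, n=20)    # 6 + 24/16 = 7.5
# Large n: the correction is negligible and AICc approaches AIC.
large_n = aicc(0.0, k=3, n=2000)  # 6 + 24/1996 ~= 6.012
```

Because the correction term blows up as k approaches n − 1, AICc penalizes heavily parameterized models far more sharply than AIC in small samples, which is the mechanism behind the "often selects better models" claim above.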

### Improved estimators of Kullback-Leibler information for autoregressive model selection in small samples

- Mathematics
- 1990

A new estimator, AICI, of the Kullback-Leibler information is proposed for Gaussian autoregressive time series model selection. The expected information is decomposed into two terms, the…

### BOOTSTRAP ESTIMATE OF KULLBACK-LEIBLER INFORMATION FOR MODEL SELECTION

- Mathematics
- 1997

Estimation of Kullback-Leibler information is a crucial part of deriving a statistical model selection procedure which, like AIC, is based on the likelihood principle. To discriminate between nested…

### Generalised information criteria in model selection

- Business
- 1996

The problem of evaluating the goodness of statistical models is investigated from an information-theoretic point of view. Information criteria are proposed for evaluating models constructed…

### Akaike's Information Criterion in Generalized Estimating Equations

- Mathematics
- Biometrics
- 2001

This work proposes a modification to AIC, where the likelihood is replaced by the quasi-likelihood and a proper adjustment is made for the penalty term.