New Confidence Intervals and Bias Comparisons Show That Maximum Likelihood Can Beat Multiple Imputation in Small Samples

@article{Hippel2016NewCI,
  title={New Confidence Intervals and Bias Comparisons Show That Maximum Likelihood Can Beat Multiple Imputation in Small Samples},
  author={Paul T. von Hippel},
  journal={Structural Equation Modeling},
  year={2016},
  volume={23},
  pages={422-437}
}
  • Paul T. von Hippel
  • Published 2016
  • Mathematics
  • Structural Equation Modeling
When analyzing incomplete data, is it better to use multiple imputation (MI) or full information maximum likelihood (ML)? In large samples ML is clearly better, but in small samples ML’s usefulness has been limited because ML commonly uses normal test statistics and confidence intervals that require large samples. We propose small-sample t-based ML confidence intervals that have good coverage and are shorter than t-based confidence intervals under MI. We also show that ML point estimates are…
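The abstract's core idea, replacing normal-theory intervals with t-based intervals around an ML estimate, can be illustrated with a minimal sketch. The estimate, standard error, sample size, and the simple n − 1 degrees-of-freedom rule below are hypothetical placeholders, not the paper's proposed small-sample df formula.

```python
# Minimal sketch: normal-theory vs. t-based 95% confidence intervals for an
# ML estimate from incomplete data. All numbers and the df rule are
# illustrative assumptions, not the paper's method.
from scipy import stats

theta_hat = 0.48   # hypothetical ML point estimate
se_hat = 0.11      # hypothetical ML standard error
n = 25             # hypothetical sample size
df = n - 1         # placeholder df; the paper derives a small-sample df for ML

# Large-sample (normal-theory) interval
z = stats.norm.ppf(0.975)
ci_normal = (theta_hat - z * se_hat, theta_hat + z * se_hat)

# t-based interval: wider in small samples, which improves coverage
t_crit = stats.t.ppf(0.975, df)
ci_t = (theta_hat - t_crit * se_hat, theta_hat + t_crit * se_hat)

print(f"normal-theory CI: ({ci_normal[0]:.3f}, {ci_normal[1]:.3f})")
print(f"t-based CI:       ({ci_t[0]:.3f}, {ci_t[1]:.3f})")
```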
