Monte Carlo Computation of the Fisher Information Matrix in Nonstandard Settings

@article{Spall2005MonteCC,
  title={Monte Carlo Computation of the Fisher Information Matrix in Nonstandard Settings},
  author={James C. Spall},
  journal={Journal of Computational and Graphical Statistics},
  year={2005},
  volume={14},
  pages={889--909}
}
The Fisher information matrix summarizes the amount of information in the data relative to the quantities of interest. There are many applications of the information matrix in modeling, systems analysis, and estimation, including confidence region calculation, input design, prediction bounds, and “noninformative” priors for Bayesian analysis. This article reviews some basic principles associated with the information matrix, presents a resampling-based method for computing the information matrix…
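The resampling-based idea described in the abstract can be sketched as follows: since the Fisher information matrix equals the negative expected Hessian of the log-likelihood, it can be estimated by averaging simultaneous-perturbation Hessian estimates over pseudo-datasets generated at the parameter value of interest. The sketch below is a minimal illustration under that assumption; the function names, default constants, and the gradient-based form of the perturbation estimate are illustrative choices, not taken verbatim from the paper.

```python
import numpy as np

def mc_fisher_info(grad_loglik, sample_data, theta, n_rep=2000, c=1e-4, rng=None):
    """Monte Carlo estimate of the Fisher information matrix at theta.

    Averages simultaneous-perturbation (SP) estimates of the log-likelihood
    Hessian over pseudo-datasets drawn at theta, using the identity
    FIM(theta) = -E[Hessian of log-likelihood].

    grad_loglik(theta, data): gradient of the log-likelihood.
    sample_data(rng): draws one pseudo-dataset under theta.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = len(theta)
    fim = np.zeros((p, p))
    for _ in range(n_rep):
        data = sample_data(rng)                  # pseudo-data drawn at theta
        delta = rng.choice([-1.0, 1.0], size=p)  # Bernoulli +/-1 perturbation
        d_grad = (grad_loglik(theta + c * delta, data)
                  - grad_loglik(theta - c * delta, data))
        h_hat = np.outer(d_grad / (2.0 * c), 1.0 / delta)
        h_hat = 0.5 * (h_hat + h_hat.T)          # symmetrize the SP Hessian estimate
        fim -= h_hat                             # accumulate negative Hessian
    return fim / n_rep
```

As a sanity check, for n i.i.d. observations from N(theta, I) the true information matrix is n times the identity; because that log-likelihood is quadratic, the diagonal of the estimate is exact for any perturbation size, while the off-diagonal noise averages toward zero as the number of replications grows.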
Improved methods for Monte Carlo estimation of the Fisher information matrix
  • J. Spall
  • Mathematics, Computer Science
  • 2008 American Control Conference
  • 2008
It is shown how certain properties associated with the likelihood function and the error in the estimates of the Hessian matrix can be exploited to improve the accuracy of the Monte Carlo-based estimate of the information matrix.
On Monte Carlo methods for estimating the Fisher information matrix in difficult problems
  • J. Spall
  • Mathematics, Computer Science
  • 2009 43rd Annual Conference on Information Sciences and Systems
  • 2009
It is shown how certain properties associated with the likelihood function and the error in the estimates of the Hessian matrix can be exploited to improve the accuracy of the Monte Carlo-based estimate of the information matrix.
Enhanced Monte Carlo Estimation of the Fisher Information Matrix with Independent Perturbations for Complex Problems
The Fisher information matrix provides a way to measure the amount of information in observed data relative to the parameters of interest. Many applications of the FIM exist in statistical modeling…
Demonstration of enhanced Monte Carlo computation of the Fisher information for complex problems
  • Xumeng Cao
  • Computer Science, Mathematics
  • 2013 American Control Conference
  • 2013
This short paper reviews a feedback-based method and an independent-perturbation approach for computing the information matrix for complex problems, where a closed form of the information matrix is not achievable.
Efficient Monte Carlo computation of Fisher information matrix using prior information
The estimator of the FIM, obtained by using the proposed algorithm, simultaneously preserves the analytically known elements and reduces the variances of the estimators of the unknown elements by capitalizing on the information contained in the known elements.
Improved Monte Carlo Estimation of the Fisher Information Matrix with Independent Perturbations
  • Xuan-Wei Wu, J. Spall
  • Computer Science
  • 2021 55th Annual Conference on Information Sciences and Systems (CISS)
  • 2021
An enhanced resampling-based method with independent perturbations to estimate the Fisher information matrix is presented; its improved accuracy is shown via a variance that is reduced by a factor of n, where n is the sample size.
Efficient computation of the Fisher information matrix in the EM algorithm
  • Lingyao Meng, J. Spall
  • Mathematics, Computer Science
  • 2017 51st Annual Conference on Information Sciences and Systems (CISS)
  • 2017
A simple Monte Carlo-based method is presented that requires only the gradient values of the function obtained from the E step and basic operations to estimate the Hessian matrix from the gradient of the conditional expectation of the complete-data log-likelihood function.
An efficient calculation of Fisher information matrix: Monte Carlo approach using prior information
The estimator of the FIM, obtained by using the proposed Monte Carlo algorithm, simultaneously preserves the analytically known elements and reduces the variances of the estimators of the unknown elements by capitalizing on the information contained in the known elements.
Method for Computation of the Fisher Information Matrix in the Expectation-Maximization Algorithm
The expectation-maximization (EM) algorithm is an iterative computational method to calculate the maximum likelihood estimators (MLEs) from the sample data. It converts a complicated one-time…
