# Monte Carlo Computation of the Fisher Information Matrix in Nonstandard Settings

@article{Spall2005MonteCC, title={Monte Carlo Computation of the Fisher Information Matrix in Nonstandard Settings}, author={James C. Spall}, journal={Journal of Computational and Graphical Statistics}, year={2005}, volume={14}, pages={889--909}}

The Fisher information matrix summarizes the amount of information in the data relative to the quantities of interest. There are many applications of the information matrix in modeling, systems analysis, and estimation, including confidence region calculation, input design, prediction bounds, and “noninformative” priors for Bayesian analysis. This article reviews some basic principles associated with the information matrix, presents a resampling-based method for computing the information matrix…
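The resampling-based idea described in the abstract can be sketched as follows: draw pseudo-data from the model at the parameter value of interest, estimate the log-likelihood Hessian for each pseudo-dataset via simultaneous perturbations, and average the negated Hessian estimates. This is a minimal illustrative sketch, not the paper's exact algorithm; the function names `sample_data` and `grad_loglik` are hypothetical stand-ins for a user-supplied model.

```python
import numpy as np

def monte_carlo_fim(theta, sample_data, grad_loglik, N=2000, c=1e-4, rng=None):
    """Monte Carlo estimate of the Fisher information matrix at theta.

    theta        : parameter vector (length p)
    sample_data  : sample_data(rng) draws one pseudo-dataset from the model at theta
    grad_loglik  : grad_loglik(theta, data) returns the score (gradient) vector
    N            : number of pseudo-datasets (Monte Carlo averaging)
    c            : small perturbation magnitude
    """
    rng = np.random.default_rng() if rng is None else rng
    p = len(theta)
    fim = np.zeros((p, p))
    for _ in range(N):
        data = sample_data(rng)
        delta = rng.choice([-1.0, 1.0], size=p)       # Bernoulli +/-1 perturbation
        g_plus = grad_loglik(theta + c * delta, data)
        g_minus = grad_loglik(theta - c * delta, data)
        dg = (g_plus - g_minus) / 2.0
        # Simultaneous-perturbation Hessian estimate, symmetrized
        h = np.outer(dg, 1.0 / (c * delta))
        h = 0.5 * (h + h.T)
        fim -= h                                      # FIM = -E[Hessian of log-lik]
    return fim / N
```

For a Gaussian model with unknown mean and unit variance on n observations, the true information is n, and the sketch recovers it; only two gradient evaluations are needed per pseudo-dataset, regardless of the dimension of theta.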

## 73 Citations

### Improved methods for Monte Carlo estimation of the Fisher information matrix

- Computer Science
- 2008 American Control Conference
- 2008

It is shown how certain properties associated with the likelihood function and the error in the estimates of the Hessian matrix can be exploited to improve the accuracy of the Monte Carlo-based estimate of the information matrix.

### On Monte Carlo methods for estimating the Fisher information matrix in difficult problems

- Computer Science
- 2009 43rd Annual Conference on Information Sciences and Systems
- 2009

It is shown how certain properties associated with the likelihood function and the error in the estimates of the Hessian matrix can be exploited to improve the accuracy of the Monte Carlo-based estimate of the information matrix.

### Enhanced Monte Carlo Estimation of the Fisher Information Matrix with Independent Perturbations for Complex Problems

- Computer Science
- 2021

This paper presents an enhanced resampling-based method with independent simultaneous perturbations to estimate the Fisher information matrix, and conducts theoretical and numerical analysis to show its accuracy via variance reduction from O(1/N) to O(1/(nN)), where n is the sample size of the data and N is a measure of the Monte Carlo averaging.

### Demonstration of enhanced Monte Carlo computation of the Fisher information for complex problems

- Computer Science
- 2013 American Control Conference
- 2013

This short paper reviews a feedback-based method and an independent perturbation approach for computing the information matrix for complex problems, where a closed form of the information matrix is not available.

### Efficient Monte Carlo computation of Fisher information matrix using prior information

- Computer Science
- Comput. Stat. Data Anal.
- 2010

The estimator of the FIM, obtained by using the proposed algorithm, simultaneously preserves the analytically known elements and reduces the variances of the estimators of the unknown elements by capitalizing on the information contained in the known elements.

### Improved Monte Carlo Estimation of the Fisher Information Matrix with Independent Perturbations

- Computer Science, Mathematics
- 2021 55th Annual Conference on Information Sciences and Systems (CISS)
- 2021

An enhanced resampling-based method with independent perturbations to estimate the Fisher information matrix is presented, showing that its variance is reduced by a factor of n, where n is the sample size.

### Efficient computation of the Fisher information matrix in the EM algorithm

- Computer Science, Mathematics
- 2017 51st Annual Conference on Information Sciences and Systems (CISS)
- 2017

A simple Monte Carlo-based method is presented that requires only the gradient values obtained from the E-step and basic operations, estimating the Hessian matrix from the gradient of the conditional expectation of the complete-data log-likelihood function.

### An efficient calculation of Fisher information matrix: Monte Carlo approach using prior information

- Computer Science
- 2007 46th IEEE Conference on Decision and Control
- 2007

The estimator of the FIM, obtained by using the proposed MC algorithm, simultaneously preserves the analytically known elements and reduces the variances of the estimators of the unknown elements by capitalizing on the information contained in the known elements.

### Method for Computation of the Fisher Information Matrix in the Expectation-Maximization Algorithm

- Computer Science, Mathematics
- 2016

A simple Monte Carlo-based method is presented that uses the simultaneous perturbation stochastic approximation method to approximate the Hessian matrix from the gradient of the conditional expectation of the complete-data log-likelihood function.

### Analysis of data-based methods for approximating Fisher information in the scalar case

- Computer Science, Mathematics
- 2015 49th Annual Conference on Information Sciences and Systems (CISS)
- 2015

In this paper, the accuracy of two major approximation methods for the FIM is compared, and numerical results are provided as illustrations of the analysis.

## References

Showing 1–10 of 24 references

### Adaptive stochastic approximation by the simultaneous perturbation method

- Computer Science
- Proceedings of the 37th IEEE Conference on Decision and Control (Cat. No.98CH36171)
- 1998

This paper presents a general adaptive SA algorithm that is based on an easy method for estimating the Hessian matrix at each iteration while concurrently estimating the primary parameters of interest.

### A maximum likelihood algorithm for the mean and covariance of nonidentically distributed observations

- Mathematics
- 1982

An iterative procedure for computing the maximum likelihood estimates of the mean and the covariance of a normal random vector, based on nonidentically distributed observations, is developed. The…

### Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information

- Mathematics
- 1978

This paper concerns normal approximations to the distribution of the maximum likelihood estimator in one-parameter families. The traditional variance approximation is 1/(n·Î), where θ̂ is the…

### Multivariate stochastic approximation using a simultaneous perturbation gradient approximation

- Computer Science, Mathematics
- 1992

The paper presents an SA algorithm based on a simultaneous perturbation gradient approximation instead of the standard finite-difference approximation of Kiefer-Wolfowitz-type procedures; the approach can be significantly more efficient than the standard algorithms in large-dimensional problems.
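The simultaneous-perturbation gradient idea can be sketched in a few lines: perturb all coordinates at once with a random ±1 vector, so only two function evaluations are needed regardless of the dimension p, versus 2p for a two-sided finite-difference scheme. The names `f`, `theta`, and the gain `c` below are illustrative, not taken from the paper.

```python
import numpy as np

def spsa_gradient(f, theta, c, rng):
    """Two-evaluation simultaneous-perturbation estimate of grad f at theta.

    Each coordinate is perturbed simultaneously by +/-c, and dividing the
    two-sided difference by the per-coordinate perturbation yields an
    estimate whose average over many draws approximates the true gradient.
    """
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Bernoulli +/-1 vector
    return (f(theta + c * delta) - f(theta - c * delta)) / (2.0 * c * delta)
```

For a quadratic f the two-sided difference is exact in c, and averaging the estimate over many perturbation draws converges to the true gradient.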

### Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control

- Computer Science
- Technometrics
- 2004

This book is suitable for a short course due to its expository nature; the material covered is of current interest, the informal tone is pleasing to the reader, and the author provides several insightful comments.

### Filtering, predictive, and smoothing Cramér-Rao bounds for discrete-time nonlinear dynamic systems

- Engineering, Mathematics
- Autom.
- 2001

### Mathematical Statistics: Basic Ideas and Selected Topics

- Mathematics
- 1977

(NOTE: Each chapter concludes with Problems and Complements, Notes, and References.) 1. Statistical Models, Goals, and Performance Criteria. Data, Models, Parameters, and Statistics. Bayesian Models.…

### Data Analysis by Resampling: Concepts and Applications

- Business
- Technometrics
- 2001

This book is found to be a comprehensive discussion of methods that can be used to perform both machine and process capability studies, and should be a worthwhile addition to the libraries of readers, particularly statisticians.

### Antithetic coupling of two Gibbs sampler chains

- Mathematics
- 2000

Two coupled Gibbs sampler chains, both with invariant probability density π, are run in parallel so that the chains are negatively correlated. We define an asymptotically unbiased estimator of the…

### Linear Statistical Inference and Its Applications

- Mathematics
- 1966

"C. R. Rao would be found in almost any statistician's list of five outstanding workers in the world of Mathematical Statistics today. His book represents a comprehensive account of the main body of…