The Relation Between Bayesian Fisher Information and Shannon Information for Detecting a Change in a Parameter

@article{Clarkson2019TheRB,
  title={The Relation Between Bayesian Fisher Information and Shannon Information for Detecting a Change in a Parameter},
  author={Eric Clarkson},
  journal={Journal of the Optical Society of America. A, Optics, image science, and vision},
  year={2019},
  volume={36},
  number={7},
  pages={1209--1214}
}
  • E. Clarkson
  • Published 31 January 2019
  • Computer Science
  • Journal of the Optical Society of America. A, Optics, image science, and vision
We derive a connection between the performance of statistical estimators and the performance of the ideal observer on related detection tasks. Specifically, we show how the task-specific Shannon information for the task of detecting a change in a parameter is related to the Fisher information and to the Bayesian Fisher information. We have previously shown that this Shannon information is related via an integral transform to the minimum probability of error on the same task. We then outline a… 
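As a hedged numerical illustration of the relationship the abstract describes (a toy Gaussian model with illustrative function names, not the paper's own derivation): for an observation N(θ, σ²), the Fisher information I = 1/σ² fixes the ideal-observer minimum probability of error for detecting a shift δ in the mean, via Pe = Φ(−δ/(2σ)) = Q(d′/2) with d′ = δ√I.

```python
import math

def fisher_info_gaussian(sigma):
    # Fisher information for the mean of a N(theta, sigma^2) observation
    return 1.0 / sigma**2

def min_prob_error(delta, sigma):
    # Ideal-observer minimum probability of error (equal priors) for
    # deciding between N(theta, sigma^2) and N(theta + delta, sigma^2):
    #   Pe = Phi(-delta / (2 sigma))
    return 0.5 * math.erfc(delta / (2.0 * sigma * math.sqrt(2.0)))
```

As δ → 0 the error approaches the chance level 0.5, which is the regime where a lowest-order (Fisher-information) expansion of detection performance applies.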

Probability of error for detecting a change in a parameter and Bayesian Fisher information.

  • E. Clarkson
  • Mathematics, Computer Science
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2020
This work derives an inequality that relates this minimum probability of error to the Bayesian version of the Fisher information, and discovers that an important intermediary in the calculation is the total variation of the posterior probability distribution function for the parameter given the data.

Bayesian Fisher Information and Detection of a Small Change in a Parameter

  • E. Clarkson
  • Computer Science
    2020 54th Annual Conference on Information Sciences and Systems (CISS)
  • 2020
This work determines the lowest order approximation of the average of the Bayesian Risk for the detection task using the Ziv-Zakai inequality.

Bayesian Fisher Information, Shannon Information, and ROC Analysis for Classification Tasks

  • E. Clarkson
  • Computer Science
    2020 54th Asilomar Conference on Signals, Systems, and Computers
  • 2020

Inequalities and Approximations for Fisher Information in the Presence of Nuisance Parameters

This work marginalizes over the nuisance parameters to produce a conditional PDF for the data that depends only on the parameters of interest, and uses this approach to develop inequalities and approximations for the FIM when the data are affected by nuisance parameters.

Quantifying the Loss of Information from Binning List-Mode Data

  • E. Clarkson
  • Physics
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2020
This work shows that the binning operation results in a loss of Fisher information, provides a computational method for quantifying the information loss, and finds that the loss depends on three factors.
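A minimal sketch of the kind of information loss described above, under an assumed exponential toy model rather than the paper's list-mode imaging model: binning an exponential observation into two bins can only reduce the Fisher information about the rate (a data-processing inequality for Fisher information).

```python
import math

def fi_exponential(lam):
    # Fisher information for the rate lam from a full exponential observation
    return 1.0 / lam**2

def fi_binned(lam, t):
    # Fisher information after binning into the two events {X <= t}, {X > t};
    # the binned datum is Bernoulli with p = 1 - exp(-lam * t)
    p = 1.0 - math.exp(-lam * t)   # P(X <= t)
    dp = t * math.exp(-lam * t)    # d p / d lam
    return dp**2 / (p * (1.0 - p))
```

For any threshold t, `fi_binned` falls below `fi_exponential`, quantifying the loss from collapsing the continuous datum into bins.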

Single-Atom Quantum Probes for Ultracold Gases Boosted by Nonequilibrium Spin Dynamics

Quantum probes are atomic-sized devices mapping information of their environment to quantum mechanical states. By improving measurements and at the same time minimizing perturbation of the…

References

SHOWING 1-10 OF 16 REFERENCES

Shannon information and ROC analysis in imaging.

A new ROC curve is described, the Shannon information receiver operator curve (SIROC), that is derived from the SI expression for a binary classification task, and it is shown that the ideal-observer ROC curve and the SIROC have many properties in common, and are equivalent descriptions of the optimal performance of an observer on the task.

Fisher information and surrogate figures of merit for the task-based assessment of image quality.

  • E. Clarkson, Fangfang Shen
  • Computer Science
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2010
A new and improved derivation of the Fisher information approximation for ideal-observer detectability is provided and applications of this approximation to imaging mixture models show a relation with the pure detection and pure estimation tasks for the same signals.

Shannon information and receiver operating characteristic analysis for multiclass classification in imaging.

It is shown how Shannon information is mathematically related to receiver operating characteristic (ROC) analysis for multiclass classification problems in imaging and that both hypersurfaces are convex and satisfy other geometrical relationships via the Legendre transform.

Asymptotic ideal observers and surrogate figures of merit for signal detection with list-mode data.

  • E. Clarkson
  • Computer Science
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2012
The asymptotic form for the likelihood ratio is derived for list-mode data generated by an imaging system viewing a possible signal in a randomly generated background to derive surrogate figures of merit, quantities that are correlated with ideal-observer performance on detection tasks, but are easier to compute.

Using Fisher information to approximate ideal-observer performance on detection tasks for lumpy-background images.

  • Fangfang Shen, E. Clarkson
  • Computer Science
    Journal of the Optical Society of America. A, Optics, image science, and vision
  • 2006
This work develops approximations to the ideal-observer detectability as a function of signal parameters involving the Fisher information matrix, which is normally used in parameter estimation problems.

Applications of the van Trees inequality: a Bayesian Cramér-Rao bound

We use a Bayesian version of the Cramér-Rao lower bound due to van Trees to give an elementary proof that the limiting distribution of any regular estimator cannot have a variance less than the…
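A hedged numerical sketch of the van Trees (Bayesian Cramér-Rao) bound in a conjugate Gaussian model, with illustrative names not taken from the cited paper: for a N(0, τ²) prior and a single N(θ, σ²) observation, the bound is 1/(E[I(θ)] + J_prior), and the posterior-mean estimator meets it with equality.

```python
def van_trees_bound(sigma, tau):
    # van Trees bound: Bayes MSE >= 1 / (E[I(theta)] + J_prior)
    data_info = 1.0 / sigma**2   # Fisher info of N(theta, sigma^2), constant in theta
    prior_info = 1.0 / tau**2    # Fisher info of the N(0, tau^2) prior density
    return 1.0 / (data_info + prior_info)

def posterior_mean_mse(sigma, tau):
    # Bayes MSE of the posterior-mean estimator in this conjugate model
    return (sigma**2 * tau**2) / (sigma**2 + tau**2)
```

The equality here is special to the Gaussian-Gaussian case; in general the bound is a strict lower limit on the Bayes MSE.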

Extended Ziv-Zakai lower bound for vector parameter estimation

The Bayesian Ziv-Zakai bound on the mean square error (MSE) in estimating a uniformly distributed continuous random variable is extended for arbitrarily distributed continuous random vectors and for…
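A small numerical sketch of the scalar, uniform-prior Ziv-Zakai bound under an assumed Gaussian observation model (illustrative names, not the vector extension of the cited paper): for θ ~ Uniform[0, A] observed in N(θ, σ²) noise, the bound (without valley-filling) is MSE ≥ (1/A) ∫₀ᴬ h (A − h) Q(h/(2σ)) dh, where Q(·) is the minimum detection error probability between hypotheses separated by h.

```python
import math

def q_func(x):
    # Gaussian tail probability Q(x) = P(N(0,1) > x)
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ziv_zakai_bound(A, sigma, n=2000):
    # Ziv-Zakai MSE bound for theta ~ Uniform[0, A] in N(theta, sigma^2) noise:
    #   MSE >= (1/A) * integral_0^A  h * (A - h) * Q(h / (2 sigma)) dh
    total = 0.0
    dh = A / n
    for i in range(n):
        h = (i + 0.5) * dh  # midpoint rule
        total += h * (A - h) * q_func(h / (2.0 * sigma)) * dh
    return total / A
```

As σ grows the bound approaches the prior variance A²/12 (the data become uninformative), and it shrinks as the noise decreases.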

Some Lower Bounds on Signal Parameter Estimation

New bounds are presented for the maximum accuracy with which parameters of signals embedded in white noise can be estimated; the bounds are independent of estimator bias and explicitly include the dependence on the a priori interval.

Elements of Information Theory

The author examines the role of entropy, inequality, and randomness in the design and construction of codes.

Objective assessment of image quality. III. ROC metrics, ideal observers, and likelihood-generating functions.

All moments of both the likelihood and the log likelihood under both hypotheses can be derived from this one function, and the AUC can be expressed, to an excellent approximation, in terms of the likelihood-generating function evaluated at the origin.