Generalized estimators, slope, efficiency, and Fisher information bounds

@article{Vos2022GeneralizedES,
  title={Generalized estimators, slope, efficiency, and {F}isher information bounds},
  author={Paul Vos},
  journal={Information Geometry},
  year={2022}
}
  • Published 7 August 2022
  • Information Geometry
Point estimators may not exist, need not be unique, and their distributions are not parameter invariant. Generalized estimators provide distributions that are parameter invariant, unique, and exist when point estimates do not. Comparing point estimators using variance is less useful when estimators are biased. A squared slope Λ is defined that can be used to compare both point and generalized estimators and is unaffected by bias. Fisher information I and variance are fundamentally different…

Maximum likelihood estimators uniformly minimize distribution variance among distribution unbiased estimators in exponential families

We employ a parameter-free distribution estimation framework where estimators are random distributions and utilize the Kullback-Leibler (KL) divergence as a loss function. Wu and Vos [J. Statist.

Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information

SUMMARY This paper concerns normal approximations to the distribution of the maximum likelihood estimator in one-parameter families. The traditional variance approximation is 1/(nI(θ̂)), where θ̂ is the…

Interval Estimation for a Binomial Proportion

We revisit the problem of interval estimation of a binomial proportion. The erratic behavior of the coverage probability of the standard Wald confidence interval has previously been remarked on in…
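The erratic coverage the abstract refers to can be seen directly: for a fixed n and p, the exact coverage of the Wald interval is a finite sum of binomial probabilities over the outcomes whose interval happens to contain p. A minimal sketch (not from the paper; the function name and the chosen n, p are illustrative):

```python
import math

def wald_coverage(n: int, p: float, z: float = 1.96) -> float:
    """Exact coverage probability of the nominal 95% Wald interval
    phat +/- z*sqrt(phat*(1-phat)/n), computed by summing the binomial
    pmf over outcomes k whose interval contains the true p."""
    cover = 0.0
    for k in range(n + 1):
        phat = k / n
        half = z * math.sqrt(phat * (1 - phat) / n)
        if phat - half <= p <= phat + half:
            cover += math.comb(n, k) * p**k * (1 - p) ** (n - k)
    return cover

# Coverage oscillates with n and p instead of staying near 0.95:
print(wald_coverage(30, 0.2))
```

Plotting this function over p for fixed n reproduces the oscillating, often below-nominal coverage curves discussed in the binomial-interval literature.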

Theory of Statistical Estimation

  • R. Fisher
  • Philosophy
    Mathematical Proceedings of the Cambridge Philosophical Society
  • 1925
It has been pointed out to me that some of the statistical ideas employed in the following investigation have never received a strictly logical definition and analysis, and it is desirable to set out for criticism the manner in which the logical foundations of these ideas may be established.

The theory and applications of statistical inference functions

1: Introduction.- 2: The Space of Inference Functions: Ancillarity, Sufficiency and Projection.- 2.1 Basic definitions.- 2.2 Projections and product sets.- 2.3 Ancillarity, sufficiency and projection

STATISTICAL METHODS AND SCIENTIFIC INDUCTION

SUMMARY The attempt to reinterpret the common tests of significance used in scientific research as though they constituted some kind of acceptance procedure and led to "decisions" in Wald's sense,…

Frequentist statistical inference without repeated sampling

Frequentist inference typically is described in terms of hypothetical repeated sampling but there are advantages to an interpretation that uses a single random sample. Contemporary examples are given

Variance of the Median of Samples from a Cauchy Distribution

Abstract Exact values of the variances of the medians of small samples from a Cauchy distribution are given. They are compared with the values obtained by using the formula for asymptotic variance.
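The asymptotic formula the abstract compares against is Var(median) ≈ 1/(4n f(m)²), which for the standard Cauchy (density 1/π at its median 0) gives π²/(4n). A Monte Carlo sketch of that comparison (not the paper's exact calculation; sample size and replication count are illustrative):

```python
import math
import random
import statistics

random.seed(0)

def median_variance_mc(n: int, reps: int = 20000) -> float:
    """Monte Carlo estimate of Var(sample median) for samples of size n
    from a standard Cauchy, drawn by inverse-CDF: tan(pi*(U - 1/2))."""
    meds = []
    for _ in range(reps):
        sample = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
        meds.append(statistics.median(sample))
    return statistics.variance(meds)

n = 101
asymptotic = math.pi**2 / (4 * n)  # 1/(4 n f(0)^2) with f(0) = 1/pi
print(median_variance_mc(n), asymptotic)
```

Note that the raw Cauchy sample has no finite variance at all; it is the median, not the mean, whose variance the formula approximates.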

A closed-form formula for the Kullback-Leibler divergence between Cauchy distributions

The formula shows that the Kullback-Leibler divergence between Cauchy densities is always finite and symmetric.
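As I recall it, the closed form for Cauchy distributions with locations l₁, l₂ and scales s₁, s₂ is log(((s₁+s₂)² + (l₁−l₂)²)/(4 s₁ s₂)); the symmetry and finiteness claimed in the abstract are immediate from this expression. A minimal sketch under that assumption (function name illustrative):

```python
import math

def kl_cauchy(l1: float, s1: float, l2: float, s2: float) -> float:
    """KL divergence between Cauchy(l1, s1) and Cauchy(l2, s2),
    using the closed form log(((s1+s2)^2 + (l1-l2)^2) / (4*s1*s2)).
    Swapping the two distributions leaves the value unchanged."""
    return math.log(((s1 + s2) ** 2 + (l1 - l2) ** 2) / (4 * s1 * s2))

# Symmetric in its arguments, and zero when the distributions coincide:
print(kl_cauchy(0.0, 1.0, 2.0, 3.0), kl_cauchy(2.0, 3.0, 0.0, 1.0))
```

The argument of the log is at least 1 by the AM-GM inequality ((s₁+s₂)² ≥ 4s₁s₂), so the divergence is always finite and nonnegative, equaling zero exactly when l₁ = l₂ and s₁ = s₂.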