Conditioning, Likelihood, and Coherence: A Review of Some Foundational Concepts

@article{Robins2000ConditioningLA,
  title={Conditioning, Likelihood, and Coherence: A Review of Some Foundational Concepts},
  author={James M. Robins and Larry A. Wasserman},
  journal={Journal of the American Statistical Association},
  year={2000},
  volume={95},
  pages={1340--1346}
}
  • J. Robins, L. Wasserman
  • Published 1 December 2000
  • Mathematics
  • Journal of the American Statistical Association
Citations
Comment on: Reflections on the Probability Space Induced by Moment Conditions with Implications for Bayesian Inference
This note comments on Ronald Gallant's (2015) reflections on the construction of Bayesian prior distributions from moment conditions. The main conclusion is that the paper does not deliver a…
A frequentist framework of inductive reasoning
A betting game establishes a sense in which confidence measures, confidence distributions in the form of probability measures, are the only reliable inferential probability distributions. …
Some comments about A. Ronald Gallant's "Reflections on the Probability Space Induced by Moment Conditions with Implications for Bayesian Inference"
This note comments on Ronald Gallant's (2015) reflections on the construction of Bayesian prior distributions from moment conditions. The main conclusion is that the paper does not deliver a…
The interplay of Bayesian and frequentist analysis
This article embarks upon a rather idiosyncratic walk through some of the fundamental philosophical and pedagogical issues at stake in the debate over whether the Bayesian or frequentist paradigm is superior.
An Error in the Argument from Conditionality and Sufficiency to the Likelihood Principle
It is not uncommon to see statistics texts argue that in frequentist theory one is faced with the following dilemma: either to deny the appropriateness of conditioning on the precision of the tool…
Discussion of "On Bayesian estimation of marginal structural models".
This paper presents a simplified version of Saarela et al.'s work on integrating propensity scores into a Bayesian framework, capturing its main points.
Identifiability, exchangeability and confounding revisited
This article revisits the original article from the perspective of a quarter century after it was first drafted and relates it to subsequent developments on confounding, ignorability, and collapsibility.
The Scaled Uniform Model Revisited
  • M. Mandel
  • Computer Science, Mathematics
  • The American Statistician
  • 2019
The scaled uniform model is used here to demonstrate the importance and usefulness of the conditionality principle, which is probably the most basic and least familiar of the three.
Statistical inference: Paradigms and controversies in historic perspective
1. Early Bayesian inference and its revival: Inverse probability – Non-informative priors – “Objective” Bayes (1763), Laplace (1774), Jeffreys (1931), Bernardo (1975). 2. Fisherian inference: Evidence…
Calibrated Bayes
The lack of an agreed inferential basis for statistics makes life “interesting” for academic statisticians, but at the price of negative implications for the status of statistics in industry, …

References

Showing 1-10 of 75 references.
Consistency of Bayes Estimates for Nonparametric Regression: A Review
This paper reviews some recent studies of frequentist properties of Bayes estimates. In nonparametric regression, natural priors can lead to inconsistent estimators; although in some problems, such…
R.A. Fisher and the making of maximum likelihood 1912-1922
In 1922 R. A. Fisher introduced the method of maximum likelihood. He first presented the numerical procedure in 1912. This paper considers Fisher's changing justifications for the method, the…
Testing Statistical Hypotheses
Contents include: The General Decision Problem; The Probability Background; Uniformly Most Powerful Tests; Unbiasedness: Theory and First Applications; Unbiasedness: Applications to Normal Distributions; …
What Did Fisher Mean by "Inverse Probability" in 1912-1922?
The method of maximum likelihood was introduced by R. A. Fisher in 1912, but not until 1922 under that name. This paper seeks to elucidate what Fisher understood by the phrase "inverse probability," …
Conditional Confidence Statements and Confidence Estimators
Procedures are given for assessing the conclusiveness of a decision in terms of a (chance) conditional confidence coefficient Γ that has frequentist interpretability analogous to that of a…
De Finetti's Coherence and Statistical Inference
A definition of coherent inference is given, in accordance with a theory of conditional probability derived from De Finetti's coherence principle, and a critical comparison is carried out…
On principles and arguments to likelihood
Birnbaum (1962a) argued that the conditionality principle (C) and the sufficiency principle (S) implied the likelihood principle (L); he then argued (Birnbaum 1972) that C and a mathematical…
Posterior Consistency of Dirichlet Mixtures in Density Estimation
A Dirichlet mixture of normal densities is a useful choice for a prior distribution on densities in the problem of Bayesian density estimation. In recent years, efficient Markov chain Monte Carlo…
A simple ancillarity paradox
For the problem of estimating the mean of a univariate normal distribution with known variance, the maximum likelihood estimator (MLE) is best invariant, minimax, and admissible under squared-error…