Updating Subjective Probability

@article{Diaconis1982UpdatingSP,
  title={Updating Subjective Probability},
  author={Persi Diaconis and Sandy L. Zabell},
  journal={Journal of the American Statistical Association},
  year={1982},
  volume={77},
  pages={822-830}
}
  • P. Diaconis, S. Zabell
  • Published 1 December 1982
  • Mathematics
  • Journal of the American Statistical Association
Abstract
Jeffrey's rule for revising a probability P to a new probability P* based on new probabilities P*(E_i) on a partition {E_i}, i = 1, ..., n, is P*(A) = Σ_i P(A | E_i) P*(E_i). Jeffrey's rule is applicable if it is judged that P*(A | E_i) = P(A | E_i) for all A and i. This article discusses some of the mathematical properties of this rule, connecting it with sufficient partitions and maximum entropy updating of contingency tables. The main results concern simultaneous revision on two partitions…
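The rule above is easy to apply on a finite outcome space: each cell E_i of the partition keeps its internal conditional structure, and its total mass is rescaled to the new value P*(E_i). A minimal numerical sketch in Python (not from the paper; the arrays, numbers, and variable names are purely illustrative):

# Jeffrey's rule P*(A) = sum_i P(A | E_i) P*(E_i) on a toy discrete space.
import numpy as np

p = np.array([0.10, 0.20, 0.30, 0.25, 0.15])   # prior P over 5 atomic outcomes
cell = np.array([0, 0, 1, 1, 1])               # outcome k lies in cell E_{cell[k]}
q_new = np.array([0.7, 0.3])                   # revised probabilities P*(E_i)

# Rescale each cell's atoms so the cell's total mass becomes q_new[i];
# within each cell the conditional probabilities P(. | E_i) are unchanged.
p_cell = np.array([p[cell == i].sum() for i in range(q_new.size)])
p_star = p * (q_new / p_cell)[cell]

# Any event A (a subset of outcomes) then satisfies P*(A) = sum_i P(A | E_i) P*(E_i).
A = np.array([True, False, True, False, True])
lhs = p_star[A].sum()
rhs = sum(q_new[i] * p[A & (cell == i)].sum() / p_cell[i] for i in range(q_new.size))
assert np.isclose(lhs, rhs)
print(p_star, p_star.sum())                    # a proper probability vector, sum 1

When the new probabilities put all mass on a single cell, the update reduces to ordinary Bayesian conditioning on that cell.

Citations
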
Belief revision generalized: A joint characterization of Bayes' and Jeffrey's rules
A general framework for representing belief-revision rules is presented and used to characterize Bayes' rule as a classical example and Jeffrey's rule as a non-classical one.
An Information-Based Model for Subjective Probability
SYNOPTIC ABSTRACT: Intuitive subjective probability is commonly assumed to represent an individual's comparative probability judgments of the events in some algebra. Conditions under which these…
On the Revision of Probabilistic Beliefs using Uncertain Evidence
This work revisits the problem of revising probabilistic beliefs using uncertain evidence, focusing on Jeffrey's rule of probability kinematics and Pearl's method of virtual evidence; the two methods are analyzed and unified. (A sketch relating the two methods appears after the last entry below.)
Bayesian Rules of Updating
This paper discusses the Bayesian updating rules of ordinary and Jeffrey conditionalisation. Their justification has been a topic of interest for the last quarter century, and several strategies…
Jeffrey's Rule, Passage of Experience, and Neo-Bayesianism
A technically convenient assumption underlying most of probabilistic epistemology is that the state of beliefs of a rational agent can be represented by a coherent probability function P, defined over…
Conditioning on uncertain event: Extensions to Bayesian inference
In this paper the alternative procedure for updating probabilities (that is, to calculate the posterior distribution from the prior distribution) proposed by Richard Jeffrey is considered, which…
Revisiting the Problem of Belief Revision with Uncertain Evidence
We revisit the problem of revising probabilistic beliefs using uncertain evidence, and report results on four major issues relating to this problem: How to specify uncertain evidence? How to revise a…
Updating, supposing, and maxent
Conclusion: The philosophical controversy concerning the logical status of MAXENT may be in large measure due to the conflation of two distinct logical roles: (1) a general inductive principle for…
Probability Update: Conditioning vs. Cross-Entropy
It is argued that, contrary to the suggestions in the literature, it is possible to use simple conditionalization in this case and thereby obtain answers that agree fully with intuition; this contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers.
Epistemology probabilized
Here is a framework for judgment in terms of a continuum of "subjective" probabilities, a framework in which probabilistic judgments need not stand on a foundation of certainties. In place of…
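
As noted in the entries above on revising beliefs with uncertain evidence, Jeffrey's rule can be compared with Pearl's method of virtual evidence, where the new information is supplied as likelihood ratios λ_i for an imagined observation rather than as posterior probabilities on the partition. A minimal sketch (illustrative only, not code from those papers) of the usual conversion λ_i = P*(E_i) / P(E_i), under which the two updates coincide:

# Jeffrey's rule vs. Pearl's virtual evidence on a toy distribution.
import numpy as np

rng = np.random.default_rng(0)
p = rng.random(6); p /= p.sum()          # prior P over 6 atomic outcomes
cell = np.array([0, 0, 1, 1, 2, 2])      # partition {E_0, E_1, E_2}
q = np.array([0.5, 0.3, 0.2])            # target posteriors P*(E_i)

p_cell = np.array([p[cell == i].sum() for i in range(3)])

# Jeffrey's rule: rescale each cell so its total mass becomes q[i].
p_jeffrey = p * (q / p_cell)[cell]

# Virtual evidence: posterior proportional to lambda_i * P(outcome),
# with the likelihood ratios lambda_i chosen as q[i] / P(E_i).
lam = q / p_cell
p_virtual = p * lam[cell]
p_virtual /= p_virtual.sum()

assert np.allclose(p_jeffrey, p_virtual)  # the two updates agree
print(p_jeffrey)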

References

Showing 1-10 of 43 references
Toward an Optimization Procedure for Applying Minimum Change Principles in Probability Kinematics
“Probability kinematics” is Richard Jeffrey’s term for the study of how a rational agent ought to revise his beliefs in response to inputs from experience. Typically, we have P0 representing a…
Two Theories of Probability
  • G. Shafer
  • Computer Science
  • PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association
  • 1978
The theory of belief functions differs from the Bayesian theory in that it uses certain non-additive set functions in the place of additive probability distributions and in that it generalizes the…
A Note on Jeffrey Conditionalization
Bayesian decision theory can be viewed as the core of psychological theory for idealized agents. To get a complete psychological theory for such agents, you have to supplement it with input and…
Rational Belief and Probability Kinematics
A general form is proposed for epistemological theories, the relevant factors being: the family of epistemic judgments, the epistemic state, the epistemic commitment (governing change of state), and…
Probability kinematics: A constrained optimization problem
  • S. May
  • Mathematics, Computer Science
  • J. Philos. Log.
  • 1976
The following hypothesis is made: given an appropriate measure of nearness for subjective probability functions, the probability function the individual should adopt is the one closest to his original which is consistent with the new information.
Jeffrey's Rule of Conditioning
  • G. Shafer
  • Mathematics
  • Philosophy of Science
  • 1981
Richard Jeffrey's generalization of Bayes' rule of conditioning follows, within the theory of belief functions, from Dempster's rule of combination and the rule of minimal extension. Both Jeffrey's…
A Mathematical Theory of Evidence
This book develops an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions.
Conditionalization, Observation, and Change of Preference
This chapter discusses the justification of condition (ii) from the point of view of a frequency interpretation of probability or reasonable degree of belief, and discusses the connection between change of belief and change of preference.
Slightly More Realistic Personal Probability
  • I. Hacking
  • Computer Science
  • Philosophy of Science
  • 1967
A person required to risk money on a remote digit of π would, in order to comply fully with the theory [of personal probability], have to compute that digit, though this would really be wasteful if…
Bayesian Conditionalisation and the Principle of Minimum Information
  • P. M. Williams
  • Mathematics
  • The British Journal for the Philosophy of Science
  • 1980
The use of the principle of minimum information, or equivalently the principle of maximum entropy, has been advocated by a number of authors over recent years, both in statistical physics as well as…
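
The minimum-information principle discussed in this last reference ties back to the maximum-entropy updating mentioned in the abstract: among all distributions Q satisfying the new constraints Q(E_i) = P*(E_i), Jeffrey's rule yields the one closest to the prior P in the sense of the Kullback-Leibler divergence D(Q || P). A small numerical check of this claim (illustrative numbers, not taken from either paper):

# Among distributions Q with Q(E_i) = q_i, the Jeffrey update minimizes D(Q || P).
import numpy as np

rng = np.random.default_rng(1)
p = rng.random(8); p /= p.sum()             # prior P over 8 atomic outcomes
cell = np.array([0, 0, 0, 1, 1, 2, 2, 2])   # partition {E_0, E_1, E_2}
q = np.array([0.2, 0.5, 0.3])               # constraints Q(E_i) = q_i

def kl(a, b):                               # Kullback-Leibler divergence D(a || b)
    return float(np.sum(a * np.log(a / b)))

p_cell = np.array([p[cell == i].sum() for i in range(3)])
q_jeffrey = p * (q / p_cell)[cell]          # Jeffrey's rule update

for _ in range(1000):                       # random alternatives meeting the same constraints
    w = rng.random(8)
    q_alt = np.empty(8)
    for i in range(3):
        mask = cell == i
        q_alt[mask] = q[i] * w[mask] / w[mask].sum()
    assert kl(q_jeffrey, p) <= kl(q_alt, p) + 1e-12
print("minimal D(Q || P):", kl(q_jeffrey, p))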