Bayesian Neural Networks with Soft Evidence
@article{Yu2020BayesianNN, title={Bayesian Neural Networks with Soft Evidence}, author={Edward Yu}, journal={ArXiv}, year={2020}, volume={abs/2010.09570} }
Bayes's rule deals with hard evidence: it tells us how to calculate the probability of event $A$ occurring given that event $B$ has occurred. Soft evidence, on the other hand, involves a degree of uncertainty about whether event $B$ has actually occurred. Jeffrey's rule of conditioning provides a way to update beliefs in the case of soft evidence. We provide a framework to learn a probability distribution on the weights of a neural network trained using soft evidence by way of two simple…
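To make the distinction concrete, here is a minimal sketch on a toy discrete joint distribution; the numbers and variable names are illustrative assumptions, not taken from the paper. Hard evidence conditions on an observed value of $B$ directly (Bayes's rule), while soft evidence specifies a revised marginal $q$ over $B$ and Jeffrey's rule mixes the conditionals accordingly: $P'(A) = \sum_i q_i \, P(A \mid B_i)$.

```python
# Minimal sketch: Bayes's rule (hard evidence) vs. Jeffrey's rule (soft evidence)
# on a toy 2x2 joint distribution. All numbers below are illustrative assumptions.
import numpy as np

joint = np.array([[0.30, 0.10],   # P(A, B): rows are A = 0/1, columns are B = 0/1
                  [0.20, 0.40]])

prior_B = joint.sum(axis=0)            # P(B)
cond_A_given_B = joint / prior_B       # P(A | B), one column per value of B

# Hard evidence (Bayes's rule): B = 1 is observed with certainty.
posterior_hard = cond_A_given_B[:, 1]

# Soft evidence (Jeffrey's rule): we now believe B = 1 only with probability 0.7.
q = np.array([0.3, 0.7])               # revised marginal over B
posterior_soft = cond_A_given_B @ q    # P'(A) = sum_i q_i * P(A | B_i)

print("Bayes   :", posterior_hard)     # [0.2, 0.8]
print("Jeffrey :", posterior_soft)     # [0.32, 0.68]
```

Jeffrey's update reduces to Bayes's rule when $q$ places all of its mass on a single value of $B$.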
References
Bayesian Network Reasoning with Uncertain Evidences
- Computer Science · Int. J. Uncertain. Fuzziness Knowl. Based Syst.
- 2010
Efficient algorithms for Bayesian network reasoning with consistent and inconsistent uncertain evidence are developed, and their convergence is analyzed.
Belief revision generalized: A joint characterization of Bayes' and Jeffrey's rules
- Economics · J. Econ. Theory
- 2016
Weight Uncertainty in Neural Networks
- Computer Science · ArXiv
- 2015
This work introduces a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop, and shows how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems.
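As a rough illustration of the Bayes by Backprop idea summarised above, the layer below maintains a diagonal-Gaussian variational posterior over its weights, samples them with the reparameterisation trick, and exposes the KL term that is added to the data loss during training. The prior scale, the softplus parameterisation of the standard deviations, and the initialisation are assumptions of this sketch, not details taken from that paper.

```python
# Hedged sketch of a "Bayes by Backprop"-style layer: weights are sampled from a
# diagonal-Gaussian variational posterior via the reparameterisation trick, and
# the KL divergence to a zero-mean Gaussian prior is stored so it can be added
# to the negative log-likelihood. Prior scale and parameterisation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, in_features, out_features, prior_sigma=1.0):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))
        self.prior_sigma = prior_sigma
        self.kl = torch.tensor(0.0)

    def _kl(self, mu, sigma):
        # KL( N(mu, sigma^2) || N(0, prior_sigma^2) ), summed over all parameters.
        return (torch.log(self.prior_sigma / sigma)
                + (sigma ** 2 + mu ** 2) / (2 * self.prior_sigma ** 2) - 0.5).sum()

    def forward(self, x):
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)  # reparameterisation trick
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        self.kl = self._kl(self.w_mu, w_sigma) + self._kl(self.b_mu, b_sigma)
        return F.linear(x, w, b)

# Training would minimise: data negative log-likelihood + layer.kl (suitably scaled per batch).
```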
The Mathematics of Changing One's Mind, via Jeffrey's or via Pearl's Update Rule
- Computer Science · J. Artif. Intell. Res.
- 2019
This account is based on a novel channel-based approach to Bayesian probability, and describes Jeffrey's and Pearl's update rules as different ways of updating with soft evidence, highlighting their differences, similarities and applications.
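To illustrate the contrast drawn there, the following toy sketch (with assumed numbers, not taken from either paper) applies Jeffrey's rule, which fixes the revised marginal over $B$, next to Pearl's virtual-evidence update, which weights the joint by likelihood ratios and renormalises.

```python
# Toy contrast between Jeffrey's rule and Pearl's virtual-evidence update.
# The joint distribution, the revised marginal q, and the likelihood ratios lam
# are illustrative assumptions.
import numpy as np

joint = np.array([[0.30, 0.10],   # P(A, B): rows are A = 0/1, columns are B = 0/1
                  [0.20, 0.40]])

# Jeffrey's rule: the revised marginal over B is specified directly.
q = np.array([0.3, 0.7])
jeffrey = (joint / joint.sum(axis=0)) @ q

# Pearl's rule: soft evidence arrives as likelihood ratios for a virtual observation.
lam = np.array([1.0, 2.0])
weighted = joint * lam
pearl = weighted.sum(axis=1) / weighted.sum()

print("Jeffrey:", jeffrey)   # [0.32, 0.68]
print("Pearl  :", pearl)     # [0.333..., 0.666...]
```

The two updates generally disagree because Pearl's rule lets the prior marginal over $B$ modulate how far beliefs move, whereas Jeffrey's rule overrides that marginal with $q$.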
On the Revision of Probabilistic Beliefs using Uncertain Evidence
- Economics · IJCAI
- 2003
A Framework for Iterated Belief Revision Using Possibilistic Counterparts to Jeffrey's Rule
- Philosophy · Fundam. Informaticae
- 2010
This paper analyses the expressive power of two possibilistic counterparts to Jeffrey's rule for modeling belief revision in intelligent agents, and shows that this rule can be used to recover several existing approaches proposed in knowledge base revision, such as adjustment, natural belief revision, drastic belief revision, and the revision of an epistemic state by another epistemic state.
Jeffrey-like rules of conditioning for the Dempster-Shafer theory of evidence
- Psychology · Int. J. Approx. Reason.
- 1989
On Calibration of Modern Neural Networks
- Computer Science · ICML
- 2017
It is discovered that modern neural networks, unlike those from a decade ago, are poorly calibrated, and on most datasets, temperature scaling -- a single-parameter variant of Platt Scaling -- is surprisingly effective at calibrating predictions.
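As a quick illustration of the temperature-scaling idea described in that summary, the snippet below divides the logits by a single scalar $T$ before the softmax; the logits and the value of $T$ are illustrative assumptions (in practice $T$ is fitted on a held-out validation set).

```python
# Minimal sketch of temperature scaling: one scalar T > 0 rescales the logits
# before the softmax. The logits and T below are illustrative assumptions.
import numpy as np

def softmax(z):
    z = z - z.max()            # for numerical stability
    e = np.exp(z)
    return e / e.sum()

logits = np.array([3.0, 1.0, 0.2])   # raw network outputs for one input
T = 1.5                              # temperature, fitted on a validation set

print("uncalibrated:", softmax(logits))
print("calibrated  :", softmax(logits / T))
```

Because dividing all logits by the same constant does not change their ordering, temperature scaling softens (or sharpens) the predicted probabilities without affecting the predicted class.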
Obtaining Well Calibrated Probabilities Using Bayesian Binning
- Computer Science · AAAI
- 2015
A new non-parametric calibration method called Bayesian Binning into Quantiles (BBQ) is presented which addresses key limitations of existing calibration methods and can be readily combined with many existing classification algorithms.
Jeffrey’s rule of conditioning in a possibilistic framework
- Computer Science · Annals of Mathematics and Artificial Intelligence
- 2011
This paper addresses the existence and uniqueness of the solutions computed using the possibilistic counterparts of the so-called kinematics properties underlying Jeffrey's rule of conditioning, and provides precise conditions under which a unique revised possibility distribution exists.