• Corpus ID: 224707196

Bayesian Neural Networks with Soft Evidence

@article{Yu2020BayesianNN,
  title={Bayesian Neural Networks with Soft Evidence},
  author={Edward Yu},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.09570}
}
  • Edward Yu
  • Published 19 October 2020
  • Computer Science
  • ArXiv
Bayes's rule deals with hard evidence, that is, we can calculate the probability of event $A$ occurring given that event $B$ has occurred. Soft evidence, on the other hand, involves a degree of uncertainty about whether event $B$ has actually occurred or not. Jeffrey's rule of conditioning provides a way to update beliefs in the case of soft evidence. We provide a framework to learn a probability distribution on the weights of a neural network trained using soft evidence by way of two simple…
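To make the distinction concrete, the two update rules for a single binary event $B$ can be written out as below; this is the standard formulation and is not drawn from the paper itself.

```latex
% Hard evidence: B is known to have occurred, so beliefs update by Bayes's rule.
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}

% Soft evidence: we only learn a new probability q = P'(B) for B.
% Jeffrey's rule of conditioning mixes the two hard-conditioned posteriors:
P'(A) = q\, P(A \mid B) + (1 - q)\, P(A \mid \neg B)
```

When $q = 1$, Jeffrey's rule reduces to ordinary Bayesian conditioning on $B$.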


References

SHOWING 1-10 OF 26 REFERENCES

Bayesian Network Reasoning with Uncertain Evidences

Efficient algorithms for Bayesian network reasoning with consistent and inconsistent uncertain evidence are developed, and their convergence is analyzed.

Weight Uncertainty in Neural Networks

This work introduces a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop, and shows how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems.
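A minimal sketch of the Bayes by Backprop idea summarized above: each weight gets a diagonal Gaussian variational posterior, trained by minimizing the negative ELBO with the reparameterization trick. The single-layer architecture, prior, optimizer, and toy data below are illustrative assumptions, not the referenced paper's exact setup.

```python
# Minimal sketch of Bayes by Backprop: a diagonal Gaussian posterior over the
# weights of one linear layer, trained by minimizing NLL + KL(q || prior)
# with the reparameterization trick.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, in_features, out_features, prior_std=1.0):
        super().__init__()
        # Variational parameters: a mean and a softplus-parameterized std per weight.
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -3.0))
        self.prior_std = prior_std

    def forward(self, x):
        w_std, b_std = F.softplus(self.w_rho), F.softplus(self.b_rho)
        # Reparameterization trick: w = mu + std * eps with eps ~ N(0, I).
        w = self.w_mu + w_std * torch.randn_like(w_std)
        b = self.b_mu + b_std * torch.randn_like(b_std)
        # KL divergence between the Gaussian posterior and the Gaussian prior.
        self.kl = self._kl(self.w_mu, w_std) + self._kl(self.b_mu, b_std)
        return F.linear(x, w, b)

    def _kl(self, mu, std):
        p = self.prior_std
        return (torch.log(p / std) + (std**2 + mu**2) / (2 * p**2) - 0.5).sum()

# Toy usage: binary classification on random data, whole dataset in one batch.
layer = BayesianLinear(10, 1)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
x, y = torch.randn(64, 10), torch.randint(0, 2, (64, 1)).float()
for _ in range(200):
    opt.zero_grad()
    nll = F.binary_cross_entropy_with_logits(layer(x), y, reduction='sum')
    loss = nll + layer.kl  # negative ELBO for a single full-data batch
    loss.backward()
    opt.step()
```

At test time, predictions are averaged over several sampled weight draws, which is what provides the predictive uncertainty mentioned in the summary.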

The Mathematics of Changing One's Mind, via Jeffrey's or via Pearl's Update Rule

  • B. Jacobs
  • Computer Science
    J. Artif. Intell. Res.
  • 2019
This account is based on a novel channel-based approach to Bayesian probability and describes Jeffrey's and Pearl's update rules as two different ways of updating with soft evidence, highlighting their differences, similarities and applications.
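For readers who have not seen the contrast, the two rules can be stated side by side for a binary event $B$; the formulations below are the standard ones and are not taken from this reference.

```latex
% Jeffrey's rule: the new marginal q = P'(B) is prescribed directly.
P'_{J}(A) = q\, P(A \mid B) + (1 - q)\, P(A \mid \neg B)

% Pearl's rule (virtual evidence): the uncertain observation enters as a
% likelihood ratio \lambda_1 : \lambda_2 for a virtual event \eta with
% P(\eta \mid B) = \lambda_1 and P(\eta \mid \neg B) = \lambda_2.
P'_{P}(A) = P(A \mid \eta)
          = \frac{\lambda_1\, P(A, B) + \lambda_2\, P(A, \neg B)}
                 {\lambda_1\, P(B) + \lambda_2\, P(\neg B)}
```

The two coincide when the likelihood ratio is chosen so that $P(B \mid \eta) = q$, but in general they encode different readings of what the soft evidence asserts.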

A Framework for Iterated Belief Revision Using Possibilistic Counterparts to Jeffrey's Rule

This paper analyses the expressive power of two possibilistic counterparts to Jeffrey's rule for modeling belief revision in intelligent agents and shows that this rule can be used to recover several existing approaches to knowledge base revision, such as adjustment, natural belief revision, drastic belief revision, and the revision of an epistemic state by another epistemic state.

Jeffrey-like rules of conditioning for the Dempster-Shafer theory of evidence

On Calibration of Modern Neural Networks

It is discovered that modern neural networks, unlike those from a decade ago, are poorly calibrated, and on most datasets, temperature scaling -- a single-parameter variant of Platt Scaling -- is surprisingly effective at calibrating predictions.
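Temperature scaling as summarized here is simple enough to show directly: a single scalar $T > 0$ is fit on held-out validation logits by minimizing the negative log-likelihood, and test probabilities become softmax(logits / T). The optimizer and iteration count below are illustrative assumptions (the original work fits $T$ with a different optimizer), and the random data is a stand-in for a trained model's outputs.

```python
# Minimal sketch of temperature scaling: fit one scalar T on validation data.
import torch
import torch.nn.functional as F

def fit_temperature(val_logits, val_labels, steps=200, lr=0.01):
    """Return a scalar temperature fit by minimizing validation NLL."""
    log_t = torch.zeros(1, requires_grad=True)  # optimize log T so T stays positive
    opt = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(val_logits / log_t.exp(), val_labels)
        loss.backward()
        opt.step()
    return log_t.exp().item()

# Usage: calibrated probabilities are softmax(logits / T); accuracy is unchanged
# because dividing by a positive scalar does not alter the argmax.
val_logits = torch.randn(500, 10) * 3  # stand-in for a trained model's logits
val_labels = torch.randint(0, 10, (500,))
T = fit_temperature(val_logits, val_labels)
probs = F.softmax(val_logits / T, dim=1)
```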

Obtaining Well Calibrated Probabilities Using Bayesian Binning

A new non-parametric calibration method called Bayesian Binning into Quantiles (BBQ) is presented which addresses key limitations of existing calibration methods and can be readily combined with many existing classification algorithms.
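BBQ itself averages over many quantile-based binning models scored in a Bayesian way; the sketch below implements only the single equal-frequency-binning building block, so it should be read as a simplified illustration rather than the full method. All names and the toy data are assumptions.

```python
# Simplified binning-based calibration: one equal-frequency binning of scores,
# with each bin mapped to the empirical frequency of positives inside it.
import numpy as np

def fit_quantile_bins(scores, labels, n_bins=10):
    """Fit bin edges and per-bin calibrated probabilities on validation data."""
    edges = np.quantile(scores, np.linspace(0, 1, n_bins + 1))
    edges[0], edges[-1] = 0.0, 1.0
    bin_ids = np.clip(np.digitize(scores, edges[1:-1]), 0, n_bins - 1)
    calibrated = np.array([
        labels[bin_ids == b].mean() if np.any(bin_ids == b) else 0.5
        for b in range(n_bins)
    ])
    return edges, calibrated

def apply_bins(scores, edges, calibrated):
    """Map new scores to the calibrated probability of their bin."""
    bin_ids = np.clip(np.digitize(scores, edges[1:-1]), 0, len(calibrated) - 1)
    return calibrated[bin_ids]

# Usage with stand-in data: overconfident scores vs. true binary labels.
rng = np.random.default_rng(0)
scores = rng.uniform(size=1000)
labels = (rng.uniform(size=1000) < scores**2).astype(float)
edges, cal = fit_quantile_bins(scores, labels)
calibrated_scores = apply_bins(scores, edges, cal)
```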

Jeffrey’s rule of conditioning in a possibilistic framework

This paper addresses the existence and uniqueness of the solutions computed using the possibilistic counterparts of the so-called kinematics properties underlying Jeffrey's rule of conditioning, and provides precise conditions under which a unique revised possibility distribution exists.