Quantifying Uncertainty in Deep Learning Approaches to Radio Galaxy Classification

@inproceedings{Mohan2022QuantifyingUI,
  title={Quantifying Uncertainty in Deep Learning Approaches to Radio Galaxy Classification},
  author={Devina Mohan and Anna M. M. Scaife and Fiona Porter and Mike Walmsley and Micah Bowles},
  year={2022}
}
In this work we use variational inference to quantify the degree of uncertainty in deep-learning model predictions of radio galaxy classification. We show that the level of model posterior variance for individual test samples is correlated with human uncertainty when labelling radio galaxies. We explore model performance and uncertainty calibration for a variety of weight priors and suggest that a sparse prior produces better-calibrated uncertainty estimates.
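The posterior variance described above can be illustrated with a minimal NumPy sketch: draw several stochastic forward passes (standing in for weight samples from an approximate posterior), average the class probabilities, and report the per-class variance and predictive entropy. The logits below are synthetic stand-ins, not the authors' model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Stand-in for T stochastic forward passes of a variational network:
# each pass draws weights from the approximate posterior, so the logits
# for one input differ across passes. (Hypothetical values.)
T, n_classes = 100, 2                       # e.g. FRI vs FRII
logits = rng.normal(loc=[2.0, 0.0], scale=0.8, size=(T, n_classes))

probs = softmax(logits)                     # (T, n_classes)
mean_prob = probs.mean(axis=0)              # posterior predictive mean
var_prob = probs.var(axis=0)                # posterior variance per class

# Predictive entropy: overall uncertainty of the averaged prediction.
pred_entropy = -(mean_prob * np.log(mean_prob + 1e-12)).sum()
```

A high `var_prob` or `pred_entropy` for a test sample would flag it as one where human labellers are also likely to disagree.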

Using Bayesian Deep Learning to Infer Planet Mass from Gaps in Protoplanetary Disks

TLDR
A Bayesian deep-learning network, "DPNNet-Bayesian," is introduced that predicts planet mass from disk gaps and also provides the uncertainties associated with the prediction when applied to unseen observations.

Probabilistic learning for pulsar classification

TLDR
It is shown how a convolutional-neural-network-based classifier trained via Bayesian Active Learning by Disagreement (BALD) performs with a relatively small training dataset, and that, with an optimized number of training examples, the model generalizes relatively well and produces the best uncertainty calibration.
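BALD selects the candidates on which stochastic forward passes disagree most: it scores each example by the mutual information between the prediction and the model weights, i.e. the entropy of the averaged prediction minus the average entropy of the individual predictions. A short sketch of that score, with hypothetical probability arrays rather than any real pulsar data:

```python
import numpy as np

def entropy(p, axis=-1):
    """Shannon entropy in nats; small epsilon guards log(0)."""
    return -(p * np.log(p + 1e-12)).sum(axis=axis)

def bald_score(probs):
    """BALD acquisition for one candidate example.

    probs: (T, n_classes) class probabilities from T stochastic
    forward passes (e.g. MC dropout or posterior weight samples).
    """
    mean_p = probs.mean(axis=0)
    total = entropy(mean_p)                    # predictive entropy
    expected = entropy(probs, axis=-1).mean()  # expected per-pass entropy
    return total - expected                    # mutual information

# Passes that disagree give a high score; passes that agree give ~0,
# even when each individual prediction is itself uncertain.
disagree = np.array([[0.9, 0.1], [0.1, 0.9]])
agree = np.array([[0.6, 0.4], [0.6, 0.4]])
```

Active learning then labels the highest-scoring pool examples first, which is how BALD stretches a small labelling budget.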
