A Review of Uncertainty Quantification in Deep Learning: Techniques, Applications and Challenges

@article{Abdar2021ARO,
  title={A Review of Uncertainty Quantification in Deep Learning: Techniques, Applications and Challenges},
  author={Moloud Abdar and Farhad Pourpanah and Sadiq Hussain and Dana Rezazadegan and Li Liu and Mohammad Ghavamzadeh and Paul W. Fieguth and Xiaochun Cao and Abbas Khosravi and U. Rajendra Acharya and Vladimir Makarenkov and Saeid Nahavandi},
  journal={Inf. Fusion},
  year={2021},
  volume={76},
  pages={243-297}
}
Recent advances and clinical applications of deep learning in medical image analysis
A Survey of Uncertainty in Deep Neural Networks
TLDR
A comprehensive introduction to the most crucial sources of uncertainty in neural networks is given, and their separation into reducible model uncertainty and irreducible data uncertainty is presented.
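The reducible/irreducible split described above is commonly computed in practice via an entropy decomposition over stochastic forward passes. A minimal sketch under that common convention (the function names and toy probabilities are my own, not taken from the survey):

```python
import numpy as np

def entropy(p, axis=-1, eps=1e-12):
    """Shannon entropy in nats, with a small eps for numerical safety."""
    return -np.sum(p * np.log(p + eps), axis=axis)

def decompose_uncertainty(probs):
    """probs: (n_samples, n_classes) class probabilities from stochastic passes.

    total     = entropy of the mean prediction
    aleatoric = mean per-sample entropy (irreducible data uncertainty)
    epistemic = total - aleatoric       (reducible model uncertainty)
    """
    probs = np.asarray(probs, dtype=float)
    total = entropy(probs.mean(axis=0))
    aleatoric = entropy(probs, axis=-1).mean()
    return total, aleatoric, total - aleatoric

# Samples that agree leave epistemic uncertainty near zero;
# disagreement between samples raises it.
agree = [[0.9, 0.1], [0.9, 0.1]]
disagree = [[0.9, 0.1], [0.1, 0.9]]
print(decompose_uncertainty(agree)[2] < decompose_uncertainty(disagree)[2])  # True
```

The epistemic term here is the mutual information between the prediction and the model parameters, which is why it vanishes when all stochastic passes agree.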
The information of attribute uncertainties: what convolutional neural networks can learn about errors in input data
TLDR
It is shown how Convolutional Neural Networks (CNNs) are able to learn about the context and patterns of signal and noise, leading to improvements in the performance of classification methods.
Recent advances and applications of deep learning methods in materials science
TLDR
A high-level overview of deep learning methods followed by a detailed discussion of recent developments of deep learning in atomistic simulation, materials imaging, spectral analysis, and natural language processing is presented.
Identifying Incorrect Classifications with Balanced Uncertainty
TLDR
The distributional imbalance is proposed to model the imbalance in uncertainty estimation as two kinds of distribution biases, and the Balanced True Class Probability framework is proposed, which learns an uncertainty estimator with a novel Distributional Focal Loss (DFL) objective.
Confidence Aware Neural Networks for Skin Cancer Detection
TLDR
This work presents three different methods for quantifying uncertainties for skin cancer detection from images and comprehensively evaluates and compares performance of these DNNs using novel uncertainty-related metrics.
UncertaintyFuseNet: Robust Uncertainty-aware Hierarchical Feature Fusion with Ensemble Monte Carlo Dropout for COVID-19 Detection
TLDR
The obtained results demonstrate the effectiveness of the proposed fusion for COVID-19 detection on CT scan and X-ray datasets, and the proposed UncertaintyFuseNet model is significantly robust to noise and performs well on previously unseen data.
Improving MC-Dropout Uncertainty Estimates with Calibration Error-based Optimization
TLDR
This study proposes two new loss functions by combining cross entropy with Expected Calibration Error (ECE) and Predictive Entropy (PE) and shows that the new proposed loss functions lead to having a calibrated MC-Dropout method.
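The combined objective above pairs cross-entropy with a calibration penalty. A minimal sketch of the Expected Calibration Error term it builds on (the equal-width binning scheme and variable names are my illustrative assumptions, not this paper's code):

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: bin predictions by confidence, then take the weighted average
    gap between mean confidence and empirical accuracy per bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)  # half-open bins
        if not mask.any():
            continue
        avg_conf = confidences[mask].mean()
        accuracy = correct[mask].mean()
        ece += mask.mean() * abs(avg_conf - accuracy)
    return ece

# A perfectly calibrated toy model: 80% confident, right 80% of the time.
conf = np.full(10, 0.8)
hits = np.array([1, 1, 1, 1, 1, 1, 1, 1, 0, 0])
print(round(expected_calibration_error(conf, hits), 4))  # 0.0
```

In the paper's setting this scalar would be added (suitably weighted) to the cross-entropy loss, penalizing models whose confidence drifts away from their accuracy.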
A Review of Generative Adversarial Networks in Cancer Imaging: New Applications, New Solutions
TLDR
The potential of GANs to address a number of key challenges of cancer imaging, including data scarcity and imbalance, domain and dataset shifts, data access and privacy, data annotation and quantification, as well as cancer detection, tumour profiling and treatment planning are assessed.
...
...

References

SHOWING 1-10 OF 602 REFERENCES
The Power of Ensembles for Active Learning in Image Classification
TLDR
It is found that ensembles perform better and lead to better-calibrated predictive uncertainties, which are the basis for many active learning algorithms, whereas Monte-Carlo Dropout uncertainties perform worse.
Uncertainty estimation in deep learning with application to spoken language assessment
TLDR
Prior Networks combine the advantages of ensemble and single-model approaches to estimating uncertainty and are evaluated on a range of classification datasets, where they are shown to outperform baseline approaches on the task of detecting out-of-distribution inputs.
Deeply uncertain: comparing methods of uncertainty quantification in deep learning algorithms
TLDR
A comparison of methods for uncertainty quantification in deep learning algorithms is presented in the context of simulated experimental measurements of a single pendulum, a prototypical physical system for studying measurement and analysis techniques.
Uncertainty in Deep Learning
TLDR
This work develops tools to obtain practical uncertainty estimates in deep learning, casting recent deep learning tools as Bayesian models without changing either the models or the optimisation, and develops the theory for such tools.
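This line of work casts dropout at test time as approximate Bayesian inference: keep dropout active during prediction and treat the spread of repeated stochastic passes as model uncertainty. A minimal numpy sketch under that reading (the toy one-layer network, dropout rate, and sample count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy one-hidden-layer regressor with fixed ("pretrained") weights.
W1, b1 = rng.normal(size=(1, 32)), np.zeros(32)
W2, b2 = rng.normal(size=(32, 1)), np.zeros(1)

def forward(x, p_drop=0.5):
    """One stochastic pass: dropout is deliberately kept ON at test time."""
    h = np.maximum(x @ W1 + b1, 0.0)        # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop     # fresh Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)           # inverted-dropout scaling
    return h @ W2 + b2

def mc_predict(x, n_samples=200):
    """Average many stochastic passes; their spread approximates
    the model (epistemic) uncertainty at x."""
    samples = np.stack([forward(x) for _ in range(n_samples)])
    return samples.mean(axis=0), samples.std(axis=0)

mean, std = mc_predict(np.array([[0.3]]))
print(mean.shape, std.shape)  # (1, 1) (1, 1)
```

No change to the model or its optimization is needed, which is the practical appeal: uncertainty estimates come from the same dropout layers used for regularization during training.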
A Systematic Comparison of Bayesian Deep Learning Robustness in Diabetic Retinopathy Tasks
TLDR
A new BDL benchmark with a diverse set of tasks is proposed, inspired by a real-world medical imaging application on diabetic retinopathy diagnosis, and a systematic comparison of well-tuned BDL techniques on the various tasks concludes that some current techniques which solve benchmarks such as UCI 'overfit' their uncertainty to the dataset and underperform on this benchmark.
Uncertainty Quantification in Deep Residual Neural Networks
TLDR
The problem of uncertainty quantification in deep residual networks is addressed with a regularization technique called stochastic depth, which produces well-calibrated softmax probabilities with only minor changes to the network's structure.
Neural Network-Based Uncertainty Quantification: A Survey of Methodologies and Applications
TLDR
The purpose of this survey paper is to comprehensively study neural network-based methods for the construction of prediction intervals (PIs), covering how PIs are constructed, optimized, and applied for decision-making in the presence of uncertainties.
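One common way to construct the prediction intervals this survey covers is from the disagreement of an ensemble. A minimal sketch (the Gaussian assumption, the 1.96 factor for a 95% interval, and the toy predictions are my illustrative choices, not the survey's method):

```python
import numpy as np

def ensemble_interval(predictions, z=1.96):
    """Pointwise prediction interval from an ensemble's predictions,
    assuming roughly Gaussian disagreement across members.

    predictions: (n_models, n_points) array-like of point predictions.
    Returns (lower, upper) bounds per point; z=1.96 gives ~95% coverage
    under the Gaussian assumption.
    """
    preds = np.asarray(predictions, dtype=float)
    mean = preds.mean(axis=0)
    std = preds.std(axis=0, ddof=1)  # sample std across ensemble members
    return mean - z * std, mean + z * std

# Five hypothetical models predicting the same three points.
preds = [[1.0, 2.0, 3.0],
         [1.1, 2.2, 2.9],
         [0.9, 1.9, 3.1],
         [1.2, 2.1, 3.0],
         [0.8, 1.8, 3.0]]
lo, hi = ensemble_interval(preds)
print(np.all(lo < hi))  # True
```

Interval width then directly reflects where the ensemble members disagree, which is the property decision-making applications exploit.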
On the Effect of Inter-observer Variability for a Reliable Estimation of Uncertainty of Medical Image Segmentation
TLDR
The results highlight the negative effect of fusion methods applied in deep learning, and show that the learned observers' uncertainty can be combined with current standard Monte Carlo dropout Bayesian neural networks to characterize uncertainty of model's parameters.
Benchmarking Bayesian Deep Learning with Diabetic Retinopathy Diagnosis
TLDR
This work proposes a new Bayesian deep learning benchmark, inspired by a real-world medical imaging application on diabetic retinopathy diagnosis, and provides a comprehensive comparison of well-tuned BDL techniques on the benchmark, including Monte Carlo dropout, mean-field variational inference, an ensemble of deep models, an ensemble of dropout models, as well as a deterministic (deep) model.
Predictive Uncertainty Quantification with Compound Density Networks
TLDR
This paper increases the mixture model's flexibility by replacing the fixed mixing weights with an adaptive, input-dependent distribution, and introduces variational Bayesian inference to train compound density networks (CDNs), which yield better uncertainty estimates on out-of-distribution data and are more robust to adversarial examples than previous approaches.
...
...