Corpus ID: 235694131

On the Practicality of Deterministic Epistemic Uncertainty

@inproceedings{Postels2022OnTP,
  title={On the Practicality of Deterministic Epistemic Uncertainty},
  author={Janis Postels and Mattia Segu and Tao Sun and Luc Van Gool and Fisher Yu and Federico Tombari},
  booktitle={ICML},
  year={2022}
}
A set of novel approaches for estimating epistemic uncertainty in deep neural networks with a single forward pass has recently emerged as a valid alternative to Bayesian Neural Networks. On the premise of informative representations, these deterministic uncertainty methods (DUMs) achieve strong performance on detecting out-of-distribution (OOD) data while adding negligible computational costs at inference time. However, it remains unclear whether DUMs are well calibrated and can seamlessly… 
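
Calibration, one of the properties the abstract questions, is commonly quantified with the expected calibration error (ECE). As a reference point, here is a minimal NumPy sketch of the standard equal-width-binning ECE; the function name and the 15-bin default are illustrative choices, not taken from the paper.

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    """Equal-width-binning ECE: average |accuracy - confidence| per bin,
    weighted by the fraction of samples falling into that bin."""
    conf = probs.max(axis=1)                 # top-1 confidence, shape (N,)
    correct = (probs.argmax(axis=1) == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece
```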

Simple Direct Uncertainty Quantification Technique Based on Machine Learning Regression

TLDR
A technique is proposed that estimates Bayesian dropout uncertainty using learned regression models, and it is found that, once trained, these models predict dropout uncertainty effectively and efficiently.
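
The paper's exact regression setup is not spelled out in this summary, so the following is only a hedged PyTorch sketch of the general idea: sample MC-dropout predictive variances as training targets, then fit a small regression head (`UncertaintyRegressor`, a hypothetical name) so that one deterministic pass approximates the sampled estimate.

```python
import torch
import torch.nn as nn

def enable_dropout(model):
    """Keep dropout layers stochastic while the rest of the model is in eval mode."""
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()

@torch.no_grad()
def mc_dropout_variance(model, x, T=50):
    """MC-dropout target: variance of the softmax over T stochastic passes."""
    model.eval()
    enable_dropout(model)
    probs = torch.stack([model(x).softmax(-1) for _ in range(T)])  # (T, B, C)
    return probs.var(0).sum(-1)                                    # (B,)

class UncertaintyRegressor(nn.Module):
    """Hypothetical head mapping features to the MC-dropout variance,
    so a single deterministic pass reproduces the sampled estimate."""
    def __init__(self, feat_dim):
        super().__init__()
        self.head = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(),
                                  nn.Linear(64, 1), nn.Softplus())

    def forward(self, feats):
        return self.head(feats).squeeze(-1)

def distillation_step(regressor, optimizer, feats, target_var):
    """One regression step towards the precomputed MC-dropout targets."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(regressor(feats), target_var)
    loss.backward()
    optimizer.step()
    return loss.item()
```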

Latent Discriminant deterministic Uncertainty

TLDR
This work advances a scalable and effective DUM for high-resolution semantic segmentation that relaxes the Lipschitz constraint typically hindering the practicality of such architectures.

Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model

TLDR
This work argues that the internal variable in a conditional latent variable model is another source of epistemic uncertainty for modeling the predictive distribution, and explores the limited knowledge about the hidden true model within aleatoric uncertainty estimation.

Using Explainable AI to Measure Feature Contribution to Uncertainty

TLDR
This work focuses on applying existing XAI techniques to deep neural networks to understand how features contribute to epistemic uncertainty, a measure of confidence in a prediction given the distribution of the data on which the network was trained.
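
The specific XAI technique is not named in this summary; a minimal, assumption-laden way to attribute uncertainty to input features is plain input-gradient saliency applied to an uncertainty score such as predictive entropy, sketched below in PyTorch. All names here are illustrative.

```python
import torch

def predictive_entropy(logits):
    """Entropy of the softmax distribution, a common uncertainty score."""
    p = logits.softmax(-1)
    return -(p * p.clamp_min(1e-12).log()).sum(-1)

def uncertainty_saliency(model, x, score_fn=predictive_entropy):
    """Gradient of the uncertainty score w.r.t. the input: large |gradient|
    marks input features that drive the uncertainty up or down."""
    x = x.clone().requires_grad_(True)
    score_fn(model(x)).sum().backward()
    return x.grad.abs()

# usage (hypothetical names): contribution = uncertainty_saliency(net, images)
```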

Better Uncertainty Quantification for Machine Translation Evaluation

TLDR
This paper trains the COMET metric with new heteroscedastic regression, divergence minimization, and direct uncertainty prediction objectives, and demonstrates the ability of the predictors to identify low-quality references and to reveal model uncertainty due to out-of-domain data.

SHIFT: A Synthetic Driving Dataset for Continuous Multi-Task Domain Adaptation

TLDR
This paper introduces the largest multi-task synthetic dataset for autonomous driving, SHIFT, which presents discrete and continuous shifts in cloudiness, rain and fog intensity, time of day, and vehicle and pedestrian density.

References

Showing 1-10 of 86 references

A General Framework for Uncertainty Estimation in Deep Learning

TLDR
This work proposes a novel framework for uncertainty estimation of neural networks, based on Bayesian belief networks and Monte Carlo sampling, which outperforms previous methods by up to 23% in accuracy and has several desirable properties.

Posterior Network: Uncertainty Estimation without OOD Samples via Density-Based Pseudo-Counts

TLDR
The Posterior Network (PostNet) is proposed, which uses Normalizing Flows to predict an individual closed-form posterior distribution over predicted probabilities for any input sample, and achieves state-of-the-art results in OOD detection and in uncertainty calibration under dataset shifts.
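
A heavily simplified sketch of the pseudo-count idea: a per-class density over a latent encoding (a normalizing flow in PostNet, treated here as a given tensor of log-densities) is scaled by the class counts to form Dirichlet evidence. Function and variable names are illustrative, not PostNet's API.

```python
import torch

def dirichlet_from_density(log_density, class_counts, prior=1.0):
    """PostNet-style pseudo-counts: alpha_c = prior + N_c * p(z | c), where
    p(z | c) is a learned per-class density, here a given (B, C) tensor of
    log-densities over the latent encoding z."""
    evidence = class_counts * log_density.exp()      # beta_c, shape (B, C)
    alpha = prior + evidence                         # Dirichlet parameters
    probs = alpha / alpha.sum(-1, keepdim=True)      # mean prediction
    total_evidence = evidence.sum(-1)                # small on OOD inputs
    return probs, total_evidence
```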

Bayesian Uncertainty Estimation for Batch Normalized Deep Networks

TLDR
It is shown that training a deep network using batch normalization is equivalent to approximate inference in Bayesian models, and it is demonstrated how this finding allows us to make useful estimates of the model uncertainty.
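
A hedged PyTorch sketch of one way to exploit this equivalence, in the spirit of Monte Carlo batch normalization: keep batch-norm layers in training mode at test time and recompute their statistics from fresh training batches, so the spread over passes acts as model uncertainty. The batching scheme and the `momentum = 0` trick are implementation assumptions (and a shuffling `train_loader` is assumed), not the paper's exact recipe.

```python
import torch

@torch.no_grad()
def mc_batchnorm_predict(model, x, train_loader, T=32):
    """Estimate predictive mean/variance by resampling batch-norm statistics:
    each pass recomputes BN statistics from a fresh (shuffled) training batch."""
    model.eval()
    bn_layers = [m for m in model.modules()
                 if isinstance(m, torch.nn.modules.batchnorm._BatchNorm)]
    for m in bn_layers:
        m.train()          # use batch statistics instead of running averages
        m.momentum = 0.0   # ...without overwriting the stored running averages
    preds, it = [], iter(train_loader)
    for _ in range(T):
        try:
            xb, _ = next(it)
        except StopIteration:
            it = iter(train_loader)
            xb, _ = next(it)
        batch = torch.cat([x, xb.to(x.device)], dim=0)
        preds.append(model(batch)[: x.size(0)].softmax(-1))
    for m in bn_layers:
        m.eval()
    p = torch.stack(preds)          # (T, B, C)
    return p.mean(0), p.var(0)
```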

Evaluating Scalable Bayesian Deep Learning Methods for Robust Computer Vision

TLDR
This work proposes a comprehensive evaluation framework for scalable epistemic uncertainty estimation methods in deep learning and applies this framework to provide the first properly extensive and conclusive comparison of the two current state-of-the-art scalable methods: ensembling and MC-dropout.

What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?

TLDR
A Bayesian deep learning framework combining input-dependent aleatoric uncertainty together with epistemic uncertainty is presented, which makes the loss more robust to noisy data, also giving new state-of-the-art results on segmentation and depth regression benchmarks.
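
The aleatoric half of this framework is the well-known heteroscedastic regression loss, in which the network predicts a per-sample log-variance alongside the mean; a minimal PyTorch sketch:

```python
import torch

def heteroscedastic_nll(y_pred, log_var, y_true):
    """Gaussian NLL with a learned per-sample log-variance: noisy targets get
    a large predicted variance, which attenuates their residual term, while
    the +log_var term keeps the variance from growing without bound."""
    return (0.5 * torch.exp(-log_var) * (y_true - y_pred) ** 2
            + 0.5 * log_var).mean()
```

Predicting the log-variance rather than the variance itself keeps the loss numerically stable without a positivity constraint.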

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles

TLDR
This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates.
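
A minimal sketch of how such a deep ensemble is typically used at inference: average the member softmaxes for the prediction and take the mutual information between the prediction and the choice of member as the epistemic score. This follows the common recipe rather than any code from the paper.

```python
import torch

@torch.no_grad()
def ensemble_predict(models, x):
    """Average the member softmaxes; the mutual information between the
    prediction and the choice of member is the epistemic score."""
    probs = torch.stack([m(x).softmax(-1) for m in models])        # (M, B, C)
    mean = probs.mean(0)
    total = -(mean * mean.clamp_min(1e-12).log()).sum(-1)          # predictive entropy
    aleatoric = -(probs * probs.clamp_min(1e-12).log()).sum(-1).mean(0)
    return mean, total - aleatoric                                 # mean, epistemic
```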

Simple and Principled Uncertainty Estimation with Deterministic Deep Learning via Distance Awareness

TLDR
Spectral-normalized Neural Gaussian Process (SNGP) is a simple method that improves the distance-awareness ability of modern DNNs by adding a weight normalization step during training and replacing the output layer with a Gaussian process; it outperforms the other single-model approaches.
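
A simplified PyTorch sketch of the two SNGP ingredients named above: spectral normalization on the hidden layers and a random-Fourier-feature approximation of the GP output layer. The full method additionally maintains a Laplace-approximated covariance over the output weights for predictive variance, which is omitted here; layer sizes and names are illustrative.

```python
import math
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

class RandomFeatureGP(nn.Module):
    """Random-Fourier-feature approximation of a GP output layer."""
    def __init__(self, in_dim, num_classes, num_features=1024):
        super().__init__()
        self.register_buffer("W", torch.randn(num_features, in_dim))
        self.register_buffer("b", 2 * math.pi * torch.rand(num_features))
        self.out = nn.Linear(num_features, num_classes)

    def forward(self, h):
        phi = math.sqrt(2.0 / self.W.size(0)) * torch.cos(h @ self.W.T + self.b)
        return self.out(phi)

def sngp_mlp(in_dim, hidden, num_classes):
    """Spectral normalization keeps the feature extractor roughly
    distance-preserving, so distance to the training data survives
    into the GP layer."""
    return nn.Sequential(
        spectral_norm(nn.Linear(in_dim, hidden)), nn.ReLU(),
        spectral_norm(nn.Linear(hidden, hidden)), nn.ReLU(),
        RandomFeatureGP(hidden, num_classes),
    )
```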

Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty

TLDR
It is shown that a single softmax neural net with minimal changes can beat the uncertainty predictions of Deep Ensembles and other more complex single-forward-pass uncertainty approaches, and that the feature-space density must be combined with the softmax entropy to disentangle aleatoric and epistemic uncertainty, which is crucial, e.g., for active learning.
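
A hedged sketch of the feature-density recipe this summary alludes to: fit one Gaussian per class to penultimate-layer features, use the negative log marginal density as the epistemic score, and the softmax entropy as the aleatoric score. Uniform class priors and the covariance jitter `eps` are simplifying assumptions, not details from the paper.

```python
import torch

def fit_class_gaussians(feats, labels, num_classes, eps=1e-4):
    """Fit one Gaussian per class to penultimate-layer features."""
    dists = []
    for c in range(num_classes):
        z = feats[labels == c]
        cov = torch.cov(z.T) + eps * torch.eye(z.size(1))  # jitter for stability
        dists.append(torch.distributions.MultivariateNormal(z.mean(0), cov))
    return dists

def ddu_scores(logits, feats, dists):
    """Epistemic: negative log feature density (high far from training data).
    Aleatoric: softmax entropy (high on ambiguous in-distribution inputs)."""
    log_dens = torch.stack([d.log_prob(feats) for d in dists], dim=-1)  # (B, C)
    epistemic = -torch.logsumexp(log_dens, dim=-1)   # uniform class priors assumed
    p = logits.softmax(-1)
    aleatoric = -(p * p.clamp_min(1e-12).log()).sum(-1)
    return epistemic, aleatoric
```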

Natural Posterior Network: Deep Bayesian Predictive Uncertainty for Exponential Family Distributions

TLDR
The Natural Posterior Network is proposed for fast and high-quality uncertainty estimation for any task where the target distribution belongs to the exponential family; it leverages Normalizing Flows to fit a single density on a learned low-dimensional and task-dependent latent space.

Can You Trust Your Model's Uncertainty? Evaluating Predictive Uncertainty Under Dataset Shift

TLDR
A large-scale benchmark of existing state-of-the-art methods on classification problems and the effect of dataset shift on accuracy and calibration is presented, finding that traditional post-hoc calibration does indeed fall short, as do several other previous methods.
...