A General Framework for Uncertainty Estimation in Deep Learning

@article{Loquercio2020AGF,
  title={A General Framework for Uncertainty Estimation in Deep Learning},
  author={Antonio Loquercio and Mattia Segu and Davide Scaramuzza},
  journal={IEEE Robotics and Automation Letters},
  year={2020},
  volume={5},
  pages={3153-3160}
}
Neural network predictions are unreliable when the input sample lies outside the training distribution or is corrupted by noise. Detecting such failures automatically is fundamental to integrating deep learning algorithms into robotics. Current approaches to uncertainty estimation for neural networks require changes to the network and its optimization process, typically ignore prior knowledge about the data, and tend to make over-simplifying assumptions that underestimate uncertainty. To…
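The sources of prediction uncertainty at issue here are conventionally split into epistemic (model) uncertainty and aleatoric (data) uncertainty. As a rough illustration of how both can be estimated at inference time, and not the authors' implementation, here is a minimal sketch assuming a regression network `net` that outputs a mean and a log-variance and contains dropout layers:

```python
import torch

def predictive_uncertainty(net, x, n_samples=30):
    """Estimate the predictive mean and variance, split by source.

    Assumes `net(x)` returns (mean, log_var) and contains dropout layers;
    keeping the net in train mode makes each forward pass stochastic.
    """
    net.train()  # leave dropout active at test time (MC dropout)
    means, noise_vars = [], []
    with torch.no_grad():
        for _ in range(n_samples):
            mean, log_var = net(x)
            means.append(mean)
            noise_vars.append(log_var.exp())
    means = torch.stack(means)                       # (n_samples, batch, ...)
    epistemic = means.var(dim=0)                     # model uncertainty
    aleatoric = torch.stack(noise_vars).mean(dim=0)  # data/sensor noise
    return means.mean(dim=0), epistemic, aleatoric
```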
Citations

Trust Your Robots! Predictive Uncertainty Estimation of Neural Networks with Sparse Gaussian Processes
TLDR
It is argued that the practical and principled combination of DNNs with sparse Gaussian processes can pave the way towards reliable and fast robot learning systems with uncertainty awareness.
On the Practicality of Deterministic Epistemic Uncertainty
TLDR
It is found that, while deterministic uncertainty methods (DUMs) scale to realistic vision tasks and perform well on out-of-distribution (OOD) detection, the practicality of current methods is undermined by poor calibration under realistic distributional shifts.
A Survey of Uncertainty in Deep Neural Networks
TLDR
A comprehensive introduction to the most crucial sources of uncertainty in neural networks is given, along with their separation into reducible model uncertainty and irreducible data uncertainty.
Robust Neural Regression via Uncertainty Learning
TLDR
This work proposes a simple solution by extending the time-tested iteratively reweighted least squares (IRLS) approach from generalised linear regression, using two sub-networks to parametrise the prediction and the uncertainty estimate, which enables easy handling of complex inputs and nonlinear responses.
Sparsity Increases Uncertainty Estimation in Deep Ensemble
TLDR
It is shown empirically that pruning, which makes models sparser by zeroing irrelevant parameters, increases the disagreement among ensemble members and thereby helps in making more robust predictions, so an energy-efficient compressed deep ensemble is well suited to memory-constrained, uncertainty-aware tasks.
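As a rough sketch of the mechanism described above (illustrative code, not the paper's implementation): magnitude pruning zeroes the smallest weights of each member, and disagreement can be read off as the variance of the members' predictions on the same input.

```python
import torch

def magnitude_prune_(model, sparsity=0.5):
    """Zero the smallest-magnitude fraction of each weight tensor in-place."""
    with torch.no_grad():
        for p in model.parameters():
            threshold = torch.quantile(p.abs().flatten(), sparsity)
            p.mul_((p.abs() >= threshold).float())

def ensemble_disagreement(members, x):
    """Disagreement as the variance of member predictions on one input."""
    preds = torch.stack([m(x) for m in members])  # (n_members, batch, ...)
    return preds.var(dim=0)
```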
Bayesian Deep Learning Hyperparameter Search for Robust Function Mapping to Polynomials with Noise
TLDR
This paper studies whether an appropriate neural architecture and ensemble configuration can be found to detect the signal of an n-th order polynomial contaminated with noise, and suggests the possible existence of an optimal network depth, as well as an optimal number of ensemble members, for both predictive skill and uncertainty quantification.
Uncertainty Estimation for Data-Driven Visual Odometry
TLDR
This work proposes uncertainty-aware VO (UA-VO), a novel deep neural network (DNN) architecture that computes relative pose predictions by processing sequences of images and, at the same time, provides uncertainty measures for those estimates.
Uncertainty Estimation for Deep Learning-Based Segmentation of Roads in Synthetic Aperture Radar Imagery
TLDR
A deep learning model for road extraction in SAR imagery is created and used to compare standard model outputs against the most popular methods for uncertainty estimation, MC dropout (MCD) and deep ensembles (DE); these methods are not effective indicators of segmentation quality when uncertainty is measured over all outputs, but become effective when uncertainty is measured over the set of road predictions only.
Bayesian deep learning of affordances from RGB images
TLDR
A Bayesian deep learning method is presented that predicts the affordances available in the environment directly from RGB images, based on a multiscale CNN that combines local and global information from the object and the full image.
Uncertainty-Aware Self-Supervised Learning of Spatial Perception Tasks
TLDR
Quantitative results show that the general self-supervised approach to spatial perception tasks, such as estimating an object's pose relative to the robot from onboard sensor readings, works well, and that explicitly accounting for uncertainty yields statistically significant performance improvements.

References

Showing 1-10 of 48 references
Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles
TLDR
This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high-quality predictive uncertainty estimates.
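The recipe is to train M identically structured networks from independent random initializations, each predicting a Gaussian, and to combine them as a uniform mixture. A minimal sketch, assuming each member returns a mean and a log-variance (the names are illustrative):

```python
import torch

def ensemble_predict(members, x):
    """Combine Gaussian predictions from independently trained networks.

    Each member returns (mean, log_var); the ensemble is a uniform mixture
    of Gaussians, moment-matched to a single mean and variance.
    """
    outs = [m(x) for m in members]
    means = torch.stack([mu for mu, _ in outs])        # (M, batch, ...)
    variances = torch.stack([lv.exp() for _, lv in outs])
    mean = means.mean(dim=0)
    # Mixture variance = mean member variance + spread of the member means.
    var = variances.mean(dim=0) + means.var(dim=0, unbiased=False)
    return mean, var
```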
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
TLDR
A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
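Operationally, the approximation amounts to leaving dropout active at test time and averaging several stochastic forward passes. A minimal sketch under that reading, not the paper's code:

```python
import torch

def mc_dropout(model, x, n_passes=50):
    """Monte-Carlo dropout: stochastic passes with dropout left on.

    The sample mean approximates the predictive mean; the sample variance
    approximates the model (epistemic) uncertainty.
    """
    model.train()  # keeps nn.Dropout sampling; beware BatchNorm side effects
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_passes)])
    return preds.mean(dim=0), preds.var(dim=0)
```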
What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?
TLDR
A Bayesian deep learning framework combining input-dependent aleatoric uncertainty together with epistemic uncertainty is presented, which makes the loss more robust to noisy data, also giving new state-of-the-art results on segmentation and depth regression benchmarks.
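The loss robustness comes from letting the network predict an input-dependent log-variance alongside each output, which down-weights residuals on inputs it flags as noisy. A sketch of that heteroscedastic Gaussian negative log-likelihood (my notation, not the paper's code):

```python
import torch

def heteroscedastic_nll(mean, log_var, target):
    """Gaussian NLL with a learned, input-dependent variance.

    Predicting log-variance keeps the loss numerically stable; a large
    predicted variance shrinks the squared-residual term, which is what
    makes the loss robust to noisy labels.
    """
    return (0.5 * torch.exp(-log_var) * (target - mean) ** 2
            + 0.5 * log_var).mean()
```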
Concrete Dropout
TLDR
This work proposes a new dropout variant that gives improved performance and better-calibrated uncertainties, using a continuous relaxation of dropout's discrete masks to allow automatic tuning of the dropout probability in large models and, as a result, faster experimentation cycles.
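The key device is replacing the Bernoulli dropout mask with a Concrete (Gumbel-Sigmoid) relaxation so the dropout probability becomes a trainable parameter. A hedged sketch of such a mask (parameter names are mine):

```python
import torch

def concrete_dropout(x, p_logit, temperature=0.1, eps=1e-7):
    """Continuous (Concrete) relaxation of a Bernoulli dropout mask.

    Because `p_logit` is a plain trainable tensor, gradients can tune the
    dropout probability p; at low temperature the soft mask approaches 0/1.
    """
    p = torch.sigmoid(p_logit)         # learned dropout probability
    u = torch.rand_like(x)             # fresh uniform noise per unit
    drop_prob = torch.sigmoid(
        (torch.log(p + eps) - torch.log1p(-p + eps)
         + torch.log(u + eps) - torch.log1p(-u + eps)) / temperature)
    keep = 1.0 - drop_prob             # soft "keep" mask in (0, 1)
    return x * keep / (1.0 - p + eps)  # rescale as in standard dropout
```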
Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models
TLDR
This paper proposes a new algorithm called probabilistic ensembles with trajectory sampling (PETS) that combines uncertainty-aware deep network dynamics models with sampling-based uncertainty propagation, which matches the asymptotic performance of model-free algorithms on several challenging benchmark tasks while requiring significantly fewer samples.
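A toy sketch of the trajectory-sampling idea, not the paper's implementation: each particle re-draws which ensemble member drives it at every step (epistemic uncertainty) and samples the next state from that member's predicted Gaussian (aleatoric). The dynamics-model signature and `reward_fn` below are hypothetical stand-ins:

```python
import torch

def reward_fn(states, action):
    """Hypothetical stand-in reward; replace with the task's reward."""
    return -states.pow(2).sum(dim=-1)

def score_action_sequence(models, state, action_seq, n_particles=20):
    """Propagate particles through a probabilistic ensemble (PETS-style)."""
    particles = state.expand(n_particles, -1).clone()
    total_reward = torch.zeros(n_particles)
    for action in action_seq:
        # Re-draw which member drives each particle, then sample next states.
        idx = torch.randint(len(models), (n_particles,))
        for i, model in enumerate(models):
            sel = idx == i
            if sel.any():
                mean, log_var = model(particles[sel], action)  # assumed signature
                particles[sel] = mean + log_var.exp().sqrt() * torch.randn_like(mean)
        total_reward += reward_fn(particles, action)
    return total_reward.mean()  # an MPC loop would rank candidate sequences by this
```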
Lightweight Probabilistic Deep Networks
  • Jochen Gast, S. Roth
  • 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
TLDR
This paper proposes probabilistic output layers for classification and regression that require only minimal changes to existing networks, and shows that activation uncertainties can be propagated in a practical fashion through the entire network, again with minor changes.
Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks
TLDR
This work presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients.
Assumed Density Filtering Methods for Learning Bayesian Neural Networks
TLDR
This paper rigorously compares the recently proposed assumed-density-filtering-based methods for learning Bayesian neural networks, Expectation backpropagation (EBP) and Probabilistic backpropagation (PBP), and develops several extensions, including a version of EBP for continuous regression problems and a PBP variant for binary classification.
Self-Supervised Deep Reinforcement Learning with Generalized Computation Graphs for Robot Navigation
TLDR
A generalized computation graph is proposed that subsumes value-based model-free methods and model-based methods; it is instantiated to form a navigation model that learns from raw images, is sample-efficient, and outperforms single-step and double-step double Q-learning.
Risk versus Uncertainty in Deep Learning: Bayes, Bootstrap and the Dangers of Dropout
TLDR
The implication of these results is clear: combine deep learning with Bayesian inference for the best decisions from data.