The Aleatoric Uncertainty Estimation Using a Separate Formulation with Virtual Residuals

@article{Kawashima2020TheAU,
  title={The Aleatoric Uncertainty Estimation Using a Separate Formulation with Virtual Residuals},
  author={Takumi Kawashima and Qing Yu and Akari Asai and Daiki Ikami and Kiyoharu Aizawa},
  journal={2020 25th International Conference on Pattern Recognition (ICPR)},
  year={2020},
  pages={1438-1445}
}
We propose a new optimization framework for aleatoric uncertainty estimation in regression problems. Existing methods can quantify the error in target estimation, but they tend to underestimate it. To obtain the predictive uncertainty inherent in an observation, we propose a new separable formulation for estimating a signal and its uncertainty, avoiding the effect of overfitting. By decoupling target estimation and uncertainty estimation, we also control the balance between signal…
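
For context, the sketch below (assumed PyTorch; not code released with the paper) shows the standard joint Gaussian negative log-likelihood used by the existing methods the abstract refers to, together with an illustrative decoupled variant in the spirit of the separable formulation. The paper's actual construction with virtual residuals is not recoverable from the truncated abstract, so the decoupled version here is only an interpretation.

import torch

def joint_gaussian_nll(mu, log_var, y):
    # Standard joint formulation: one loss trains both the predicted target mu
    # and the predicted log-variance log_var, which tends to underestimate
    # uncertainty once the network starts to overfit the residuals.
    return (0.5 * torch.exp(-log_var) * (y - mu) ** 2 + 0.5 * log_var).mean()

def decoupled_losses(mu, log_var, y):
    # Illustrative decoupling (names and details are assumptions, not the
    # paper's method): the target is trained with a plain regression loss,
    # and the uncertainty head is fit to detached residuals so its gradient
    # does not flow back into the target estimate.
    target_loss = ((y - mu) ** 2).mean()
    residual = (y - mu).detach()
    uncertainty_loss = (0.5 * torch.exp(-log_var) * residual ** 2 + 0.5 * log_var).mean()
    return target_loss, uncertainty_loss
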
1 Citation

A Survey of Uncertainty in Deep Neural Networks

A comprehensive introduction to the most crucial sources of uncertainty in neural networks is given, and their separation into reducible model uncertainty and irreducible data uncertainty is presented.

References

Showing 1-10 of 26 references

Multi-Task Learning based on Separable Formulation of Depth Estimation and its Uncertainty

This work formulates regression with uncertainty estimation as a multi-task learning problem, introduces a new uncertainty loss function inspired by variational representations of robust estimation, and presents an optimization framework for uncertainty estimation in regression.

Sampling-Free Epistemic Uncertainty Estimation Using Approximated Variance Propagation

This work presents a sampling-free approach for computing the epistemic uncertainty of a neural network and applies this approach to large-scale visual tasks to demonstrate the advantages of the method compared to sampling-based approaches, in terms of both the quality of the uncertainty estimates and computational overhead.
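
As a rough illustration of the underlying idea (a simplified sketch, not the paper's full method), variance can be pushed through a linear layer analytically instead of by Monte Carlo sampling; the snippet below assumes a diagonal covariance and independent inputs.

import torch

def propagate_linear(mean, var, weight, bias):
    # Moment propagation through y = W x + b under a diagonal-Gaussian,
    # independent-inputs assumption: E[y] = W E[x] + b, Var[y] = W^2 Var[x].
    out_mean = mean @ weight.t() + bias
    out_var = var @ (weight ** 2).t()
    return out_mean, out_var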

Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection

A robust real-time LiDAR 3D object detector that leverages heteroscedastic aleatoric uncertainties to significantly improve its detection performance, surpassing a baseline that does not explicitly estimate uncertainties by up to nearly 9% in Average Precision.

Uncertainty Estimates for Optical Flow with Multi-Hypotheses Networks

A new network architecture is introduced that enforces complementary hypotheses and provides uncertainty estimates efficiently within a single forward pass, without the need for sampling or ensembles, and demonstrates high-quality uncertainty estimates that clearly improve over previous confidence measures on optical flow.
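
One common way to enforce complementary hypotheses, shown below as an illustrative sketch (not necessarily this paper's exact objective), is a winner-takes-all style loss in which only the best of K predicted hypotheses receives the gradient for each sample.

import torch

def winner_takes_all_loss(hypotheses, y):
    # hypotheses: (K, N, D) predictions from K hypothesis heads; y: (N, D).
    # Only the best-fitting hypothesis per sample contributes to the loss,
    # which pushes the heads toward complementary (diverse) predictions.
    errors = ((hypotheses - y.unsqueeze(0)) ** 2).mean(dim=-1)   # (K, N)
    best = errors.min(dim=0).values                              # (N,)
    return best.mean()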

ProbFlow: Joint Optical Flow and Uncertainty Estimation

A method that jointly predicts optical flow and its underlying uncertainty, deriving a variational inference scheme based on a mean-field approximation that incorporates best practices from energy minimization, and demonstrating the flexibility of the probabilistic approach by applying it to two different energies on two benchmarks.

What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?

A Bayesian deep learning framework combining input-dependent aleatoric uncertainty with epistemic uncertainty is presented, which makes the loss more robust to noisy data and gives new state-of-the-art results on segmentation and depth regression benchmarks.
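
For reference, a minimal sketch (assumed PyTorch, not the paper's code) of how the two kinds of uncertainty are typically combined at test time in this framework: T stochastic (dropout) forward passes each produce a mean and a predicted variance, and the total predictive variance is approximated as the variance of the means (epistemic) plus the mean of the variances (aleatoric).

import torch

def combined_predictive_variance(means, variances):
    # means, variances: tensors of shape (T, ...) from T stochastic passes.
    epistemic = means.var(dim=0, unbiased=False)   # spread of the predicted means
    aleatoric = variances.mean(dim=0)              # average predicted data noise
    return epistemic + aleatoric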

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles

This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates.
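
As an illustration of the combination rule (a sketch under the usual Gaussian-output assumption, not the authors' code), each of M independently trained networks predicts a mean and variance, and the ensemble is summarized by a single Gaussian with matched first and second moments.

import torch

def ensemble_gaussian(mus, vars_):
    # mus, vars_: tensors of shape (M, ...) from M ensemble members, each
    # predicting a Gaussian. The uniform mixture is approximated by a single
    # Gaussian with the mixture's mean and variance.
    mu_star = mus.mean(dim=0)
    var_star = (vars_ + mus ** 2).mean(dim=0) - mu_star ** 2
    return mu_star, var_star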

Bayesian Uncertainty Estimation for Batch Normalized Deep Networks

It is shown that training a deep network using batch normalization is equivalent to approximate inference in Bayesian models, and it is demonstrated how this finding allows us to make useful estimates of the model uncertainty.
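
A rough sketch of the resulting test-time procedure, often called Monte Carlo batch normalization (assumed PyTorch; details such as the momentum trick and the loader interface are implementation choices, not taken from the paper): for each stochastic pass, the batch-norm statistics are refreshed from a randomly drawn training mini-batch before predicting on the test input.

import torch

def mcbn_predict(model, x, train_loader, T=20):
    # With momentum = 1.0 the running statistics become exactly the statistics
    # of the most recently seen batch.
    for m in model.modules():
        if isinstance(m, (torch.nn.BatchNorm1d, torch.nn.BatchNorm2d, torch.nn.BatchNorm3d)):
            m.momentum = 1.0
    preds = []
    with torch.no_grad():
        for _ in range(T):
            xb, _ = next(iter(train_loader))   # random training mini-batch (shuffle=True assumed)
            model.train()
            model(xb)                          # refresh batch-norm statistics from this batch
            model.eval()
            preds.append(model(x))             # predict using those statistics
    preds = torch.stack(preds)
    return preds.mean(dim=0), preds.var(dim=0, unbiased=False)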

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
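
In practice this amounts to keeping dropout active at test time and treating repeated stochastic forward passes as approximate posterior samples; the snippet below is a minimal sketch (assumed PyTorch), not the authors' implementation.

import torch

def mc_dropout_predict(model, x, T=50):
    model.eval()
    for m in model.modules():
        if isinstance(m, torch.nn.Dropout):
            m.train()                          # keep dropout stochastic at test time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(T)])
    # Mean as the prediction, sample variance as the model-uncertainty estimate.
    return samples.mean(dim=0), samples.var(dim=0, unbiased=False)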

Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image

  • Fangchang Ma, S. Karaman
  • 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018
The use of a single deep regression network to learn directly from raw RGB-D data is proposed, and the impact of the number of depth samples on prediction accuracy is explored, attaining a higher level of robustness and accuracy.