# Learn-By-Calibrating: Using Calibration As A Training Objective

```bibtex
@article{Thiagarajan2020LearnByCalibratingUC,
  title   = {Learn-By-Calibrating: Using Calibration As A Training Objective},
  author  = {Jayaraman J. Thiagarajan and Bindya Venkatesh and Deepta Rajan},
  journal = {ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  year    = {2020},
  pages   = {3632-3636}
}
```

Calibration error is commonly adopted for evaluating the quality of uncertainty estimators in deep neural networks. In this paper, we argue that such a metric is highly beneficial for training predictive models, even when we do not explicitly measure the uncertainties. This is conceptually similar to heteroscedastic neural networks that produce variance estimates for each prediction, with the key difference that we do not place a Gaussian prior on the predictions. We propose a novel algorithm…
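To make the idea concrete, here is a minimal, hypothetical sketch of an interval-calibration objective in the spirit of the abstract: the model emits a prediction interval per sample, and the loss penalizes the gap between the empirical coverage of those intervals and a desired coverage level, plus a sharpness term that discourages trivially wide intervals. This is an illustration only; the function name, the squared coverage penalty, and the `sharpness_weight` term are assumptions, not the paper's exact formulation (which is designed to be differentiable for end-to-end training).

```python
import numpy as np

def interval_calibration_loss(y_true, y_lo, y_hi,
                              target_coverage=0.9,
                              sharpness_weight=0.1):
    """Illustrative interval-calibration objective (not the paper's exact loss).

    y_true        : observed targets, shape (n,)
    y_lo, y_hi    : predicted interval bounds, shape (n,)
    Returns a scalar: (empirical coverage - target coverage)^2
                      + sharpness_weight * mean interval width.
    """
    # Fraction of targets that fall inside their predicted intervals.
    covered = ((y_true >= y_lo) & (y_true <= y_hi)).mean()
    # Penalize deviation from the desired coverage level.
    calibration_term = (covered - target_coverage) ** 2
    # Penalize wide intervals so perfect coverage cannot be bought for free.
    sharpness_term = np.mean(y_hi - y_lo)
    return calibration_term + sharpness_weight * sharpness_term
```

For example, intervals of width 2 that cover every target give a coverage of 1.0 against a 0.9 target, so the loss is (0.1)^2 + 0.1 * 2 = 0.21; shrinking the intervals until coverage matches 0.9 reduces both terms.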

## 3 Citations

Training Calibration-based Counterfactual Explainers for Deep Learning Models in Medical Image Analysis

- Computer Science
- 2021

This paper presents TraCE (Training Calibration-based Explainers), a counterfactual generation approach for deep models in medical image analysis that combines pre-trained generative models with a novel uncertainty-based interval calibration strategy to synthesize hypothesis-driven explanations.

Calibrating Healthcare AI: Towards Reliable and Interpretable Deep Predictive Models

- Computer Science, Mathematics; ArXiv
- 2020

This paper argues that these two objectives of characterizing model reliability and enabling rigorous introspection of model behavior are not necessarily disparate and proposes to utilize prediction calibration to meet both objectives.

Designing accurate emulators for scientific processes using calibration-driven deep models

- Mathematics, Computer Science; Nature Communications
- 2020

This work proposes Learn-by-Calibrating, a novel deep learning approach based on interval calibration for designing emulators that can recover the inherent noise structure without any explicit priors, and demonstrates that it produces higher-quality emulators than widely adopted loss functions, even in small-data regimes.

## References

Showing 1-10 of 15 references

Accurate Uncertainties for Deep Learning Using Calibrated Regression

- Computer Science, Mathematics; ICML
- 2018

This work proposes a simple procedure for calibrating any regression algorithm, and finds that it consistently outputs well-calibrated credible intervals while improving performance on time series forecasting and model-based reinforcement learning tasks.

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles

- Mathematics, Computer Science; NIPS
- 2017

This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates.

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Mathematics, Computer Science; ICML
- 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.

Uncertainties in Parameters Estimated with Neural Networks: Application to Strong Gravitational Lensing

- Physics
- 2017

In Hezaveh et al. 2017 we showed that deep learning can be used for model parameter estimation and trained convolutional neural networks to determine the parameters of strong gravitational lensing…

Concrete Dropout

- Computer Science, Mathematics; NIPS
- 2017

This work proposes a new dropout variant which gives improved performance and better calibrated uncertainties, and uses a continuous relaxation of dropout’s discrete masks to allow for automatic tuning of the dropout probability in large models, and as a result faster experimentation cycles.

What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?

- Computer Science; NIPS
- 2017

A Bayesian deep learning framework combining input-dependent aleatoric uncertainty together with epistemic uncertainty is presented, which makes the loss more robust to noisy data, also giving new state-of-the-art results on segmentation and depth regression benchmarks.

Practical Confidence and Prediction Intervals

- Computer Science; NIPS
- 1996

This work proposes a new method for computing prediction intervals that outperforms existing methods at extrapolation and interpolation in regimes with limited data, and yields prediction intervals whose actual confidence levels are closer to the desired confidence levels.

Leveraging uncertainty information from deep neural networks for disease detection

- Computer Science, Medicine; Scientific Reports
- 2017

Dropout-based Bayesian uncertainty measures for deep learning are evaluated in diagnosing diabetic retinopathy (DR) from fundus images; they are shown to capture uncertainty better than straightforward alternatives, and uncertainty-informed decision referral is shown to improve diagnostic performance.

Uncertainty in Deep Learning

- Computer Science
- 2016

This work develops tools to obtain practical uncertainty estimates in deep learning, casting recent deep learning tools as Bayesian models without changing either the models or the optimisation, and develops the theory for such tools.

Greedy function approximation: A gradient boosting machine.

- Mathematics
- 2001

Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions…