# Well-calibrated prediction intervals for regression problems

```bibtex
@article{Dewolf2021WellcalibratedPI,
  title   = {Well-calibrated prediction intervals for regression problems},
  author  = {Nicolas Dewolf and Bernard De Baets and Willem Waegeman},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2107.00363}
}
```

Over the last few decades, various methods have been proposed for estimating prediction intervals in regression settings, including Bayesian methods, ensemble methods, direct interval estimation methods and conformal prediction methods. An important issue is the calibration of these methods: the generated prediction intervals should have a predefined coverage level, without being overly conservative. In this work, we review the above four classes of methods from a conceptual and experimental…
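
As background for the calibration requirement described in the abstract, here is a minimal sketch (not from the paper) of split conformal prediction, the simplest construction that attains a predefined marginal coverage level. The `model` argument is assumed to be any fitted regressor exposing a `predict` method.

```python
# Illustrative sketch: split conformal prediction intervals.
# Guarantees ~(1 - alpha) marginal coverage for exchangeable data.
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_test, alpha=0.1):
    """Return (lower, upper) interval bounds for X_test."""
    # Nonconformity scores: absolute residuals on a held-out calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n = len(scores)
    # Finite-sample-corrected quantile of the calibration scores.
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    preds = model.predict(X_test)
    return preds - q, preds + q
```

The finite-sample correction `(n + 1)(1 - alpha) / n` in place of the plain `1 - alpha` quantile is what makes the marginal coverage guarantee hold exactly rather than only asymptotically.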

#### References

Showing 1–10 of 97 references.

A comparison of some conformal quantile regression methods

- Mathematics
- 2019

We compare two recently proposed methods that combine ideas from conformal inference and quantile regression to produce locally adaptive and marginally valid prediction intervals under sample…

Regression conformal prediction with random forests

- Mathematics, Computer Science
- Machine Learning
- 2014

In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors.

Nonparametric predictive distributions based on conformal prediction

- Mathematics, Computer Science
- Machine Learning
- 2018

This paper introduces and explores predictive distribution functions that always satisfy a natural property of validity in terms of guaranteed coverage for IID observations, and applies conformal prediction to derive predictive distributions that are valid under a nonparametric assumption.

Conformalized Quantile Regression

- Computer Science, Mathematics
- NeurIPS
- 2019

This paper proposes a new method, fully adaptive to heteroscedasticity, that combines conformal prediction with classical quantile regression, inheriting the advantages of both.
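
The conformalized quantile regression (CQR) construction summarized above can be sketched in a few lines. This is a hedged illustration that assumes the raw lower and upper quantile estimates are supplied as arrays; it is not the authors' reference implementation.

```python
# Sketch of conformalized quantile regression (CQR): calibrate the interval
# produced by any quantile regressor so it attains marginal coverage.
import numpy as np

def cqr_interval(q_lo_cal, q_hi_cal, y_cal, q_lo_test, q_hi_test, alpha=0.1):
    """Adjust raw quantile-regression intervals using a calibration set."""
    # Conformity score: how far y falls outside the raw quantile interval
    # (negative when y is comfortably inside it).
    scores = np.maximum(q_lo_cal - y_cal, y_cal - q_hi_cal)
    n = len(scores)
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    # Widen (or shrink, if q < 0) both ends by the calibrated margin.
    return q_lo_test - q, q_hi_test + q
```

Because the score can be negative, CQR can tighten an overly conservative quantile regressor as well as widen an overconfident one, while keeping the interval's local, heteroscedastic shape.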

Random Forest Prediction Intervals

- Computer Science
- The American Statistician
- 2019

This work proposes new random forest prediction intervals that are based on the empirical distribution of out-of-bag prediction errors and indicates that intervals constructed with the proposed method tend to be narrower than those of competing methods while still maintaining marginal coverage rates approximately equal to nominal levels.
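
The out-of-bag (OOB) approach described above can be sketched with scikit-learn's `RandomForestRegressor`, whose `oob_prediction_` attribute holds each training point's prediction from trees that did not see it. This is an illustration of the idea, not the authors' exact procedure.

```python
# Sketch: prediction intervals from the empirical distribution of
# out-of-bag (OOB) prediction errors of a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_oob_interval(X_train, y_train, X_test, alpha=0.1):
    """Return (lower, upper) bounds using OOB residual quantiles."""
    rf = RandomForestRegressor(n_estimators=200, oob_score=True,
                               bootstrap=True, random_state=0)
    rf.fit(X_train, y_train)
    # OOB errors approximate the test-time error distribution without
    # needing a separate calibration split.
    errors = y_train - rf.oob_prediction_
    lo, hi = np.quantile(errors, [alpha / 2, 1 - alpha / 2])
    preds = rf.predict(X_test)
    return preds + lo, preds + hi
```

Unlike split conformal prediction, no data is sacrificed for calibration, which is one reason the resulting intervals can be narrower at comparable coverage.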

Distribution Calibration for Regression

- Computer Science, Mathematics
- ICML
- 2019

This work introduces the novel concept of distribution calibration, demonstrates its advantages over the existing definition of quantile calibration, and proposes a post-hoc approach to improving the predictions of previously trained regression models.

Normalized nonconformity measures for regression Conformal Prediction

- Computer Science
- 2008

This paper applies conformal prediction to the k-nearest neighbours regression algorithm and proposes an extension of the typical nonconformity measure used for regression, which produces predictive regions of variable width depending on the expected accuracy of the algorithm on each example.
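
The normalized nonconformity idea described above divides each residual by a per-example difficulty estimate, so the resulting intervals widen on hard examples and narrow on easy ones. The sketch below assumes the difficulty estimates `sigma_cal` and `sigma_test` (e.g. a model of the expected absolute error) are supplied as arrays; it is a generic illustration, not the paper's k-NN-specific measure.

```python
# Sketch: conformal intervals with a normalized nonconformity measure,
# score_i = |y_i - yhat_i| / sigma_hat(x_i).
import numpy as np

def normalized_conformal_interval(preds_cal, sigma_cal, y_cal,
                                  preds_test, sigma_test, alpha=0.1):
    """Return (lower, upper) bounds whose width scales with sigma_hat(x)."""
    # Residuals normalized by the estimated difficulty of each example.
    scores = np.abs(y_cal - preds_cal) / sigma_cal
    n = len(scores)
    q = np.quantile(scores, min(1.0, np.ceil((n + 1) * (1 - alpha)) / n))
    # Interval width now varies per test point instead of being constant.
    return preds_test - q * sigma_test, preds_test + q * sigma_test
```

Marginal coverage is unchanged by the normalization; what improves is efficiency, since width is redistributed from easy regions of the input space to hard ones.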

Practical Confidence and Prediction Intervals

- Computer Science
- NIPS
- 1996

This work proposes a new method for computing prediction intervals that outperforms existing methods with respect to extrapolation and interpolation in regimes with limited data, and yields prediction intervals whose actual confidence levels are closer to the desired confidence levels.

Accelerating difficulty estimation for conformal regression forests

- Mathematics, Computer Science
- Annals of Mathematics and Artificial Intelligence
- 2017

A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed.

Quantile Regularization: Towards Implicit Calibration of Regression Models

- Computer Science, Mathematics
- ArXiv
- 2020

This work presents a method for calibrating regression models based on a novel quantile regularizer, defined as the cumulative KL divergence between two CDFs, which significantly improves calibration for regression models trained using approaches such as Dropout VI and Deep Ensembles.