Corpus ID: 235694507

Well-calibrated prediction intervals for regression problems

Nicolas Dewolf, Bernard De Baets, Willem Waegeman
Over the last few decades, various methods have been proposed for estimating prediction intervals in regression settings, including Bayesian methods, ensemble methods, direct interval estimation methods and conformal prediction methods. An important issue is the calibration of these methods: the generated prediction intervals should have a predefined coverage level, without being overly conservative. In this work, we review the above four classes of methods from a conceptual and experimental…
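The calibration requirement in the abstract — intervals that achieve a predefined coverage level without being overly conservative — is exactly what split conformal prediction delivers by construction. As a toy illustration (not from the paper; the data, model, and variable names below are all hypothetical), a split conformal interval can be sketched in a few lines of NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (hypothetical example).
n = 2000
x = rng.uniform(-3, 3, n)
y = np.sin(x) + rng.normal(scale=0.3, size=n)

# Split into a proper training set and a calibration set.
x_tr, y_tr = x[:1000], y[:1000]
x_cal, y_cal = x[1000:], y[1000:]

# Any point predictor works; here a cubic polynomial fit.
coef = np.polyfit(x_tr, y_tr, deg=3)

def predict(t):
    return np.polyval(coef, t)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

# For target coverage 1 - alpha, take the finite-sample-corrected
# empirical quantile of the scores.
alpha = 0.1
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# Prediction interval for a new point: [yhat - q, yhat + q].
x_new = 1.5
yhat = predict(x_new)
lower, upper = yhat - q, yhat + q
```

Under exchangeability, intervals built this way cover a fresh test point with probability at least 1 - alpha, regardless of how good the underlying point predictor is; only the interval width reflects model quality.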

A comparison of some conformal quantile regression methods
We compare two recently proposed methods that combine ideas from conformal inference and quantile regression to produce locally adaptive and marginally valid prediction intervals under sample…
Regression conformal prediction with random forests
In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors.
Nonparametric predictive distributions based on conformal prediction
This paper introduces and explores predictive distribution functions that always satisfy a natural property of validity in terms of guaranteed coverage for IID observations, and applies conformal prediction to derive predictive distributions that are valid under a nonparametric assumption.
Conformalized Quantile Regression
This paper proposes a new method that is fully adaptive to heteroscedasticity, which combines conformal prediction with classical quantile regression, inheriting the advantages of both.
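The idea behind conformalized quantile regression is to start from estimated lower and upper conditional quantiles and then widen (or shrink) that band by a conformal correction computed on a calibration set. A minimal sketch with synthetic heteroscedastic data — using a crude binned empirical-quantile estimator as a stand-in for a real quantile regressor; all names and data here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Heteroscedastic data: noise grows with x (hypothetical example).
n = 4000
x = rng.uniform(0, 4, n)
y = x + rng.normal(scale=0.2 + 0.3 * x, size=n)

x_tr, y_tr = x[:2000], y[:2000]
x_cal, y_cal = x[2000:], y[2000:]

# Crude quantile "regressor": empirical 5%/95% quantiles within bins of x.
edges = np.linspace(0, 4, 9)

def bin_of(t):
    return np.clip(np.digitize(t, edges) - 1, 0, len(edges) - 2)

idx = bin_of(x_tr)
qlo = np.array([np.quantile(y_tr[idx == b], 0.05) for b in range(len(edges) - 1)])
qhi = np.array([np.quantile(y_tr[idx == b], 0.95) for b in range(len(edges) - 1)])

def band(t):
    b = bin_of(t)
    return qlo[b], qhi[b]

# CQR conformity score: signed distance outside the quantile band.
lo_c, hi_c = band(x_cal)
scores = np.maximum(lo_c - y_cal, y_cal - hi_c)

alpha = 0.1
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# Conformalized interval at a point: widen the band by q on both sides
# (q can be negative, in which case the band shrinks).
lo_new, hi_new = band(np.array([3.0]))
lo_new, hi_new = lo_new - q, hi_new + q
```

Because the base band already tracks the local noise level, the resulting intervals are wide where the data are noisy and narrow where they are not, while the conformal step restores the marginal coverage guarantee.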
Random Forest Prediction Intervals
This work proposes new random forest prediction intervals that are based on the empirical distribution of out-of-bag prediction errors and indicates that intervals constructed with the proposed method tend to be narrower than those of competing methods while still maintaining marginal coverage rates approximately equal to nominal levels.
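The out-of-bag construction can be illustrated without an actual forest: any bagged ensemble yields, for each training point, predictions from the ensemble members that did not see it, and the quantile of those out-of-bag absolute errors gives an interval half-width. A toy sketch using bagged polynomial fits in place of trees (hypothetical example, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(2)

n = 1500
x = rng.uniform(-2, 2, n)
y = x**2 + rng.normal(scale=0.25, size=n)

# Bagged ensemble of cubic fits stands in for the random forest.
B = 50
models, oob_sets = [], []
for _ in range(B):
    boot = rng.integers(0, n, n)
    models.append(np.polyfit(x[boot], y[boot], deg=3))
    oob_sets.append(np.setdiff1d(np.arange(n), boot))

def predict(t):
    return np.mean([np.polyval(c, t) for c in models], axis=0)

# Out-of-bag prediction for each training point: average over the
# ensemble members whose bootstrap sample did not contain it.
oob_pred = np.zeros(n)
oob_cnt = np.zeros(n)
for c, oob in zip(models, oob_sets):
    oob_pred[oob] += np.polyval(c, x[oob])
    oob_cnt[oob] += 1
mask = oob_cnt > 0
errors = np.abs(y[mask] - oob_pred[mask] / oob_cnt[mask])

# Interval half-width: the (1 - alpha) quantile of OOB absolute errors,
# so the interval is predict(t) +/- half.
alpha = 0.1
half = np.quantile(errors, 1 - alpha)
```

Because the out-of-bag errors are computed on points the contributing members never trained on, they behave like honest held-out residuals without sacrificing any data to a separate calibration split.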
Distribution Calibration for Regression
The novel concept of distribution calibration is introduced and its advantages over the existing definition of quantile calibration are demonstrated; a post-hoc approach to improving the predictions of previously trained regression models is also proposed.
Normalized nonconformity measures for regression Conformal Prediction
This paper applies Conformal Prediction to the k-Nearest Neighbours Regression algorithm and proposes a way of extending the typical nonconformity measure used for regression, producing predictive regions of variable width depending on the expected accuracy of the algorithm on each example.
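The normalization idea is to divide each residual by a per-example difficulty estimate sigma(x), so that the conformal quantile scales the interval width locally. A minimal sketch (hypothetical data and names; for simplicity the true regression function is zero, so the point prediction is 0 and the focus is entirely on the widths), with sigma(x) estimated as the mean absolute training residual of the k nearest neighbours:

```python
import numpy as np

rng = np.random.default_rng(3)

# Heteroscedastic noise around a zero mean (hypothetical example).
n = 3000
x = rng.uniform(0, 3, n)
y = rng.normal(scale=0.1 + 0.4 * x, size=n)

x_tr, y_tr = x[:1500], y[:1500]
x_cal, y_cal = x[1500:], y[1500:]

# Difficulty estimate sigma(t): mean absolute residual of the k nearest
# training neighbours (residuals are around the zero point prediction).
k = 25
abs_res = np.abs(y_tr)

def sigma(t):
    t = np.atleast_1d(t)
    d = np.abs(x_tr[None, :] - t[:, None])
    nn = np.argsort(d, axis=1)[:, :k]
    return abs_res[nn].mean(axis=1)

# Normalized nonconformity scores on the calibration set.
scores = np.abs(y_cal) / sigma(x_cal)
alpha = 0.1
j = int(np.ceil((len(scores) + 1) * (1 - alpha)))
qn = np.sort(scores)[j - 1]

# Interval at t: [prediction - qn * sigma(t), prediction + qn * sigma(t)],
# i.e. wider where the algorithm is expected to be less accurate.
```

A single quantile qn then produces variable-width regions: the marginal coverage guarantee is unchanged, but width is reallocated from easy examples to hard ones.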
Practical Confidence and Prediction Intervals
This work proposes a new method to compute prediction intervals that is better than existing methods with regard to extrapolation and interpolation in data regimes with a limited amount of data, and yields prediction intervals whose actual confidence levels are closer to the desired confidence levels.
Accelerating difficulty estimation for conformal regression forests
A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed.
Quantile Regularization: Towards Implicit Calibration of Regression Models
This work presents a method for calibrating regression models based on a novel quantile regularizer, defined as the cumulative KL divergence between two CDFs, which significantly improves calibration for regression models trained using approaches such as Dropout VI and Deep Ensembles.