Corpus ID: 248572298

Training-conditional coverage for distribution-free predictive inference

Michael Bian and Rina Foygel Barber
The field of distribution-free predictive inference provides tools for provably valid prediction without any assumptions on the distribution of the data, and these tools can be paired with any regression algorithm to provide accurate and reliable predictive intervals. The guarantees provided by these methods are typically marginal, meaning that predictive accuracy holds on average over both the training data set and the test point that is queried. However, it may be preferable to obtain a stronger, training-conditional guarantee, where coverage holds conditionally on the observed training data set.
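The marginal coverage guarantee described in the abstract can be illustrated with a minimal split-conformal sketch. This is illustrative code, not from the paper; the function names and the toy least-squares model are placeholders standing in for "any regression algorithm":

```python
import numpy as np

def split_conformal_interval(X_train, y_train, X_cal, y_cal, x_test, fit, alpha=0.1):
    """Split conformal prediction: fit on one half, calibrate on the other.

    `fit` is any regression algorithm returning a predict function.
    The resulting interval has marginal coverage >= 1 - alpha, on average
    over the draw of the training/calibration data and the test point.
    """
    predict = fit(X_train, y_train)
    # Absolute residuals on the held-out calibration set.
    scores = np.abs(y_cal - predict(X_cal))
    n = len(scores)
    # Conformal quantile: the ceil((n + 1)(1 - alpha))-th smallest score.
    k = int(np.ceil((n + 1) * (1 - alpha)))
    qhat = np.sort(scores)[min(k, n) - 1]
    pred = predict(np.atleast_2d(x_test))[0]
    return pred - qhat, pred + qhat

# Toy example: least-squares fit on noisy linear data.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2 * X[:, 0] + rng.normal(scale=0.1, size=200)

def least_squares(Xtr, ytr):
    A = np.hstack([Xtr, np.ones((len(Xtr), 1))])
    coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return lambda Xnew: np.hstack([np.atleast_2d(Xnew),
                                   np.ones((len(np.atleast_2d(Xnew)), 1))]) @ coef

lo, hi = split_conformal_interval(X[:100], y[:100], X[100:], y[100:],
                                  np.array([0.5]), least_squares)
```

The guarantee is marginal in exactly the sense the abstract describes: averaging over many draws of the 200 data points and the test point, the interval covers the response with probability at least 1 - alpha, but a particular unlucky training set may yield much lower conditional coverage.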

The limits of distribution-free conditional predictive inference

This work aims to explore the space in between exact conditional inference guarantees and what types of relaxations of the conditional coverage property would alleviate some of the practical concerns with marginal coverage guarantees while still being possible to achieve in a distribution-free setting.

Distribution-Free Predictive Inference for Regression

A general framework for distribution-free predictive inference in regression, using conformal inference, which allows for the construction of a prediction band for the response variable using any estimator of the regression function, and a model-free notion of variable importance, called leave-one-covariate-out or LOCO inference.

Adaptive, Distribution-Free Prediction Intervals for Deep Networks

A neural network is proposed that outputs three values instead of a single point estimate and optimizes a loss function motivated by the standard quantile regression loss and provides two prediction interval methods with finite sample coverage guarantees solely under the assumption that the observations are independent and identically distributed.
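The "standard quantile regression loss" this summary refers to is the pinball loss; minimizing it pushes a predictor toward a conditional quantile, which is what makes it useful for interval endpoints. A minimal sketch (illustrative, not the paper's network; the constant-predictor search simply demonstrates the loss's minimizer):

```python
import numpy as np

def pinball_loss(y, yhat, tau):
    """Quantile (pinball) loss: under- and over-prediction are penalized
    asymmetrically, so the minimizer is the tau-quantile of the targets."""
    diff = y - yhat
    return np.mean(np.maximum(tau * diff, (tau - 1) * diff))

# Minimizing over constant predictors recovers the empirical tau-quantile.
rng = np.random.default_rng(1)
y = rng.normal(size=10_000)
grid = np.linspace(-3, 3, 601)
best = grid[np.argmin([pinball_loss(y, c, 0.9) for c in grid])]
# `best` lands near the standard-normal 0.9 quantile (about 1.28).
```

Training one network head with tau = alpha/2 and another with tau = 1 - alpha/2 yields lower and upper interval endpoints, which can then be conformally calibrated to obtain the finite-sample coverage guarantee the summary mentions.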

Discretized conformal prediction for efficient distribution‐free inference

In regression problems where there is no known true underlying model, conformal prediction methods enable prediction intervals to be constructed without any assumptions on the distribution of the data; this work discretizes the conformal procedure to make the resulting inference computationally efficient.

Distribution-Free Prediction Sets Adaptive to Unknown Covariate Shift

This paper proposes a novel flexible distribution-free method, PredSet-1Step, to construct prediction sets that can efficiently adapt to unknown covariate shift, and shows that it achieves nominal coverage in a number of experiments and on a data set concerning HIV risk prediction in a South African cohort study.

Predictive inference with the jackknife+

This paper introduces the jackknife+, a novel method for constructing predictive confidence intervals. Whereas the jackknife outputs an interval centered at the predicted response at a test point, the jackknife+ builds its interval from the leave-one-out predictions at the test point together with the leave-one-out residuals, and attains a distribution-free coverage guarantee.
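The jackknife+ construction can be sketched as follows. This is a minimal illustration under the assumption of a generic `fit` routine returning a predict function (the mean predictor in the usage example is a deliberately trivial stand-in); the method's guarantee is coverage at least 1 - 2*alpha with no distributional assumptions:

```python
import numpy as np

def jackknife_plus_interval(X, y, x_test, fit, alpha=0.1):
    """Jackknife+: for each i, refit without point i, record the
    leave-one-out residual R_i and the leave-one-out prediction at x_test,
    then take quantiles of the shifted endpoints mu_{-i}(x) -/+ R_i."""
    n = len(y)
    lo_ends, hi_ends = np.empty(n), np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        predict = fit(X[mask], y[mask])           # model trained without point i
        r_i = abs(y[i] - predict(X[i:i + 1])[0])  # leave-one-out residual
        m_i = predict(np.atleast_2d(x_test))[0]   # LOO prediction at test point
        lo_ends[i], hi_ends[i] = m_i - r_i, m_i + r_i
    k = int(np.ceil((1 - alpha) * (n + 1)))
    lo = np.sort(lo_ends)[n - min(k, n)]          # lower-endpoint quantile
    hi = np.sort(hi_ends)[min(k, n) - 1]          # upper-endpoint quantile
    return lo, hi

# Usage with a trivial constant (mean) predictor on toy data.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(60, 1))
y = X[:, 0] + rng.normal(scale=0.1, size=60)

def mean_fit(Xtr, ytr):
    m = ytr.mean()
    return lambda Xnew: np.full(len(np.atleast_2d(Xnew)), m)

lo, hi = jackknife_plus_interval(X, y, np.array([0.0]), mean_fit)
```

Unlike split conformal, no data is sacrificed to a calibration set, at the cost of n model refits.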

Distribution-Free, Risk-Controlling Prediction Sets

This work shows how to generate set-valued predictions from a black-box predictor that controls the expected loss on future test points at a user-specified level, and provides explicit finite-sample guarantees for any dataset by using a holdout set to calibrate the size of the prediction sets.

Finite-sample Efficient Conformal Prediction

This paper provides two general-purpose selection algorithms and studies the coverage and width properties of the final conformal prediction region when selecting among a family of machine learning algorithms.

Fast exact conformalization of the lasso using piecewise linear homotopy

This paper develops an exact and computationally efficient conformalization of the lasso and elastic net and provides a simpler and better-justified online lasso algorithm, which may be of independent interest.

Distribution‐free prediction bands for non‐parametric regression

A new prediction band is constructed by combining the idea of ‘conformal prediction’ with non‐parametric conditional density estimation; the proposed estimator, called COPS, always has a finite-sample coverage guarantee.