Discretized conformal prediction for efficient distribution-free inference

Wenyu Chen, Kelli-Jean Chun, Rina Foygel Barber
In regression problems where there is no known true underlying model, conformal prediction methods enable prediction intervals to be constructed without any assumptions on the distribution of the underlying data, other than that the training and test data are exchangeable. However, these methods bear a heavy computational cost: to be carried out exactly, the regression algorithm would need to be fitted infinitely many times. In practice, the conformal prediction method is run by…
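To make the refitting issue concrete, here is a minimal sketch of the *split* conformal variant, which sidesteps refitting entirely by fitting once on a training half and calibrating the interval width on a held-out half. This is not the discretized full-conformal method of the paper above; the data, the linear model, and the 90% level are all illustrative assumptions.

```python
import math
import random

# Illustrative synthetic data: y = 2x + 1 + noise (an assumption, not from
# the paper above).
random.seed(0)
n = 200
x = [random.uniform(0, 10) for _ in range(n)]
y = [2.0 * xi + 1.0 + random.gauss(0, 1) for xi in x]

# Split: first half trains the model, second half calibrates the interval.
x_tr, y_tr = x[:100], y[:100]
x_cal, y_cal = x[100:], y[100:]

# Ordinary least squares fit on the training split only (fitted ONCE).
mx = sum(x_tr) / len(x_tr)
my = sum(y_tr) / len(y_tr)
slope = (sum((a - mx) * (b - my) for a, b in zip(x_tr, y_tr))
         / sum((a - mx) ** 2 for a in x_tr))
intercept = my - slope * mx

def predict(t):
    return intercept + slope * t

# Conformity scores = absolute residuals on the calibration split.
scores = sorted(abs(b - predict(a)) for a, b in zip(x_cal, y_cal))

# 90% interval: take the ceil((n_cal + 1) * (1 - alpha))-th smallest score.
alpha = 0.1
k = math.ceil((len(scores) + 1) * (1 - alpha))  # 91 when n_cal = 100
q = scores[min(k, len(scores)) - 1]

# Distribution-free 90% prediction interval at a new point.
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
print(f"90% prediction interval at x=5: [{lo:.2f}, {hi:.2f}]")
```

Full conformal instead refits at every candidate value of the test response, which is why discretization or other approximations are needed in practice.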

Root-finding approaches for computing conformal prediction set

This work exploits the fact that conformal prediction sets are intervals whose boundaries can be efficiently approximated by classical root-finding algorithms, and investigates how this approach can overcome many limitations of previously proposed strategies.
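The root-finding idea can be sketched with classical bisection. Assume (hypothetically) a conformal p-value function p(y) that is unimodal in the candidate label y and crosses the level alpha once on each side of the point prediction; bisection then locates each interval endpoint without evaluating p on a dense grid. The toy p-value below is an illustrative stand-in, not any paper's actual conformity function.

```python
def bisect_crossing(f, lo, hi, level, tol=1e-8):
    """Find y in [lo, hi] with f(y) ~= level, assuming f(lo) and f(hi)
    bracket the level and f is monotone on this bracket."""
    flo = f(lo) - level
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if (f(mid) - level) * flo > 0:
            lo, flo = mid, f(mid) - level
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Toy p-value, peaked at the point prediction yhat (illustrative only):
# p(yhat) = 1, decaying linearly to 0 at yhat +/- scale.
yhat, scale = 3.0, 2.0
p = lambda y: max(0.0, 1.0 - abs(y - yhat) / scale)

# The conformal set {y : p(y) > alpha} is an interval; find its endpoints.
alpha = 0.1
left = bisect_crossing(p, yhat - scale, yhat, alpha)   # rising side
right = bisect_crossing(p, yhat, yhat + scale, alpha)  # falling side
print(f"conformal set: ({left:.4f}, {right:.4f})")
```

Each bisection step costs one model evaluation, so the endpoints are found to tolerance `tol` in logarithmically many refits rather than one refit per grid point.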

Conformalized Quantile Regression

This paper proposes a new method that is fully adaptive to heteroscedasticity, which combines conformal prediction with classical quantile regression, inheriting the advantages of both.
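The CQR adjustment itself is short. The sketch below assumes two pre-fitted conditional quantile estimates q_lo and q_hi (hypothetical closed-form stand-ins here) and calibrates them so that the adjusted band attains finite-sample 90% coverage; the data-generating process is an illustrative assumption.

```python
import math
import random

# Hypothetical stand-ins for fitted 5% and 95% conditional quantile curves.
random.seed(1)
q_lo = lambda x: x - 1.5
q_hi = lambda x: x + 1.5

# Illustrative calibration data around the line y = x.
x_cal = [random.uniform(0, 10) for _ in range(200)]
y_cal = [xi + random.gauss(0, 1) for xi in x_cal]

# CQR conformity score: how far y falls OUTSIDE [q_lo(x), q_hi(x)]
# (negative when y is inside the band).
scores = sorted(max(q_lo(xi) - yi, yi - q_hi(xi))
                for xi, yi in zip(x_cal, y_cal))

# Calibration quantile at level ceil((n + 1) * (1 - alpha)) / n.
alpha = 0.1
k = math.ceil((len(scores) + 1) * (1 - alpha))  # 181 when n = 200
q_adj = scores[min(k, len(scores)) - 1]

# Adjusted interval at a new point: shift both ends outward by q_adj
# (inward, if q_adj is negative and the fitted band was too wide).
x_new = 4.0
lo_new = q_lo(x_new) - q_adj
hi_new = q_hi(x_new) + q_adj
print(f"CQR 90% interval at x=4: [{lo_new:.2f}, {hi_new:.2f}]")
```

Because the band endpoints come from quantile estimates, the interval width can vary with x, which is what makes the method adaptive to heteroscedasticity.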

Computing Full Conformal Prediction Set with Approximate Homotopy

This work proposes efficient algorithms to compute the conformal prediction set using approximate solutions of (convex) regularized empirical risk minimization, via a new homotopy continuation technique for tracking the solution path as the observations change sequentially.

Conformal Bayesian Computation

Using ‘add-one-in’ importance sampling, it is shown that conformal Bayesian predictive intervals are efficiently obtained from re-weighted posterior samples of model parameters.

Exchangeability, Conformal Prediction, and Rank Tests

The main message of the paper is that, as with conformal prediction, rank tests can be used as a wrapper around any dimension-reduction algorithm.

Training-conditional coverage for distribution-free predictive inference

This work examines the training-conditional coverage properties of several distribution-free predictive inference methods and concludes that training-conditional coverage is achieved by some methods but is impossible to guarantee without further assumptions for others.

Conformal predictive distributions: an approach to nonparametric fiducial prediction

The subject of this chapter is conformal predictive distributions, which represent the only approach to nonparametric fiducial prediction available at this time.

Distribution-free conditional median inference

A method based on ideas from conformal prediction is proposed and a theoretical coverage guarantee is established; particular distributions are identified where its performance is sharp, yielding a lower bound on the length of any conditional median confidence interval.

Conditional predictive inference for high-dimensional stable algorithms

These results show that despite the serious problems of resampling procedures for inference on the unknown parameters, leave-one-out methods can be successfully applied to obtain reliable predictive inference even in high dimensions.

Conditional predictive inference for stable algorithms

These results show that despite the serious problems of resampling procedures for inference on the unknown parameters, cross validation methods can be successfully applied to obtain reliable predictive inference even in high dimensions and conditionally on the training data.

Efficiency of conformalized ridge regression

The degree to which this additional requirement of efficiency is satisfied in the case of Bayesian ridge regression is explored; it is found that, asymptotically, conformal prediction sets differ little from ridge regression prediction intervals when the standard Bayesian assumptions are satisfied.

Distribution-Free Predictive Inference for Regression

This paper develops a general framework for distribution-free predictive inference in regression, using conformal inference, which allows a prediction band for the response variable to be constructed from any estimator of the regression function. It also introduces a model-free notion of variable importance, called leave-one-covariate-out or LOCO inference.
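The LOCO idea can be sketched directly: refit the regression without covariate j and measure the increase in held-out absolute prediction error. The data, the no-intercept least-squares fits, and the median summary below are illustrative assumptions; any estimator could be plugged in.

```python
import random
from statistics import median

# Illustrative data: only x1 carries signal, x2 is pure noise.
random.seed(2)
n = 300
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
y = [3.0 * a + random.gauss(0, 1) for a in x1]
half = n // 2

def ols1(u, t):
    """Least squares (no intercept) of t on a single covariate u."""
    return sum(a * c for a, c in zip(u, t)) / sum(a * a for a in u)

def ols2(u, v, t):
    """Least squares (no intercept) of t on (u, v) via 2x2 normal equations."""
    suu = sum(a * a for a in u)
    svv = sum(b * b for b in v)
    suv = sum(a * b for a, b in zip(u, v))
    sut = sum(a * c for a, c in zip(u, t))
    svt = sum(b * c for b, c in zip(v, t))
    det = suu * svv - suv * suv
    return (sut * svv - svt * suv) / det, (svt * suu - sut * suv) / det

# Fit the full model and each leave-one-covariate-out model on the first half.
b1, b2 = ols2(x1[:half], x2[:half], y[:half])
c2 = ols1(x2[:half], y[:half])  # model without x1
c1 = ols1(x1[:half], y[:half])  # model without x2

# Held-out absolute errors on the second half.
err_full = [abs(yi - b1 * a - b2 * b)
            for a, b, yi in zip(x1[half:], x2[half:], y[half:])]
err_no1 = [abs(yi - c2 * b) for b, yi in zip(x2[half:], y[half:])]
err_no2 = [abs(yi - c1 * a) for a, yi in zip(x1[half:], y[half:])]

# LOCO importance: median increase in error when the covariate is removed.
loco1 = median(e1 - e0 for e1, e0 in zip(err_no1, err_full))
loco2 = median(e2 - e0 for e2, e0 in zip(err_no2, err_full))
print(f"LOCO importance: x1 = {loco1:.2f}, x2 = {loco2:.2f}")
```

Dropping the informative covariate x1 should visibly inflate the held-out error, while dropping the noise covariate x2 should leave it essentially unchanged.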

Fast exact conformalization of the lasso using piecewise linear homotopy

This paper develops an exact and computationally efficient conformalization of the lasso and elastic net and provides a simpler and better-justified online lasso algorithm, which may be of independent interest.

On-line predictive linear regression

This work states a general result showing that in the on-line protocol the frequency of error for the classical prediction intervals does equal the nominal significance level, up to statistical fluctuations.

Machine-Learning Applications of Algorithmic Randomness


Trimmed Conformal Prediction for High-Dimensional Models

This paper proposes a new framework, called Trimmed Conformal Prediction (TCP), based on a two-stage procedure consisting of a trimming step and a prediction step; TCP can be applied to any regression method and offers both statistical accuracy and computational gains.

Algorithmic Learning in a Random World

A selection of books covering type systems in programming languages, information theory, machine learning that takes the randomness of the world into account, and verification of real-time systems.