• Corpus ID: 235166576

Optimized conformal classification using gradient descent approximation

Anthony Bellotti
Conformal predictors are an important class of algorithms that allow predictions to be made with a user-defined confidence level. They are able to do this by outputting prediction sets, rather than simple point predictions. The conformal predictor is valid in the sense that the accuracy of its predictions is guaranteed to meet the confidence level, only assuming exchangeability in the data. Since accuracy is guaranteed, the performance of a conformal predictor is measured through the efficiency… 
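The abstract above can be illustrated with a minimal split-conformal classification sketch. The function name, the nonconformity score, and the uniform toy data are illustrative assumptions, not the paper's method:

```python
import numpy as np

def conformal_prediction_sets(cal_scores, test_scores, alpha=0.1):
    """Split-conformal classification sketch.

    cal_scores:  (n,) nonconformity scores of the true labels on a
                 held-out calibration set (e.g. 1 - softmax probability).
    test_scores: (m, n_classes) nonconformity scores of every candidate
                 label for each test example.
    Returns an (m, n_classes) boolean mask: under exchangeability the
    resulting sets contain the true label with probability >= 1 - alpha.
    """
    n = len(cal_scores)
    # Finite-sample-corrected empirical quantile of calibration scores.
    k = int(np.ceil((n + 1) * (1 - alpha))) - 1
    tau = np.sort(cal_scores)[min(k, n - 1)]
    # A label enters the set iff its score does not exceed the threshold.
    return test_scores <= tau

# Toy usage with uniform scores and a 90% coverage target.
rng = np.random.default_rng(0)
sets = conformal_prediction_sets(rng.uniform(size=200),
                                 rng.uniform(size=(5, 3)), alpha=0.1)
```

Efficiency, the performance measure the abstract mentions, then corresponds to how small these sets are on average at the chosen confidence level.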


Training Uncertainty-Aware Classifiers with Conformalized Deep Learning

A novel training algorithm is developed that can lead to smaller conformal prediction sets with higher conditional coverage, after exact calibration with hold-out data, compared to state-of-the-art alternatives.

Learning Optimal Conformal Classifiers

Algorithm B presents Python pseudo-code for ConfTr, based on the authors' Python and JAX implementation, including smooth calibration and prediction steps for THR…
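The "smooth" steps mentioned can be sketched by replacing THR's hard set-membership indicator with a sigmoid. This is a hedged illustration of the idea, not the ConfTr implementation; the temperature value and function names are made up:

```python
import numpy as np

def smooth_membership(probs, tau, temperature=0.1):
    # Hard THR includes class k iff probs[..., k] >= tau; a sigmoid of
    # the margin makes set membership differentiable in probs and tau.
    return 1.0 / (1.0 + np.exp(-(probs - tau) / temperature))

def smooth_set_size(probs, tau, temperature=0.1):
    # Differentiable surrogate for the expected prediction-set size,
    # usable as a training-time penalty.
    return smooth_membership(probs, tau, temperature).sum(axis=-1).mean()

# One confident example: the soft set size is close to 1.
size = smooth_set_size(np.array([[0.9, 0.05, 0.05]]), tau=0.5)
```

As the temperature shrinks, the sigmoid approaches the hard indicator and the surrogate approaches the true set size.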

On the Utility of Prediction Sets in Human-AI Teams

D-CP, a method to perform CP on some examples and defer the rest to human experts, is introduced, and it is proved that D-CP reduces the prediction set size of non-deferred examples; overly large prediction sets make for unhelpful AI assistants.



Conformal inference is (almost) free for neural networks trained with early stopping

A novel method that combines early stopping with conformal calibration while recycling the same hold-out data leads to models that are both accurate and able to provide exact predictive inferences, without multiple data splits or overly conservative adjustments.



Training conformal predictors

The empirical results suggest that conformal predictors trained by minimizing their observed fuzziness perform better than conformal predictors trained in the traditional way by minimizing the prediction error of the corresponding point classifier.
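Observed fuzziness, the training criterion named above, can be computed as the average total p-value assigned to the false labels. A minimal sketch, assuming conformal p-values are already available:

```python
import numpy as np

def observed_fuzziness(p_values, y_true):
    """Average sum of p-values over the false labels; lower values mean
    a more efficient conformal predictor.

    p_values: (n, n_classes) conformal p-values
    y_true:   (n,) integer true labels
    """
    p = np.asarray(p_values, dtype=float).copy()
    p[np.arange(len(y_true)), y_true] = 0.0  # exclude the true label
    return p.sum(axis=1).mean()

# Two examples: false-label p-value sums are 0.3 and 0.15.
of = observed_fuzziness([[0.9, 0.1, 0.2], [0.05, 0.8, 0.1]], [0, 1])
```

Because accuracy is already guaranteed by validity, a criterion like this targets efficiency directly rather than point-prediction error.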

Constructing normalized nonconformity measures based on maximizing predictive efficiency

Experiments are reported that show that it results in conformal predictors that provide improved predictive efficiency for regression problems on several data sets, whilst remaining reliable, and that the optimal parameter values typically differ for different confidence levels.
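A common normalized nonconformity measure of the kind discussed scales the absolute residual by a per-example difficulty estimate. The difficulty estimate `sigma_hat` and the sensitivity parameter `beta` below are illustrative placeholders, not the paper's exact construction:

```python
import numpy as np

def normalized_nonconformity(y, y_hat, sigma_hat, beta=0.5):
    # |residual| / (difficulty + beta): hard examples get wider intervals,
    # easy ones tighter. beta guards against near-zero difficulty estimates
    # and is the kind of tunable parameter such methods optimize.
    return np.abs(np.asarray(y) - np.asarray(y_hat)) / (np.asarray(sigma_hat) + beta)

# Residuals 1.0 and 0.0 with difficulties 0.5 and 1.5.
scores = normalized_nonconformity([2.0, 1.0], [1.0, 1.0], [0.5, 1.5])
```

That the best value of a parameter like `beta` can differ across confidence levels is exactly the behavior the entry above reports.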

High-Quality Prediction Intervals for Deep Learning: A Distribution-Free, Ensembled Approach

This paper derives a loss function directly from the axiom that high-quality prediction intervals should be as narrow as possible whilst capturing a specified proportion of the data; the loss requires no distributional assumption, its form follows from a likelihood principle, it can be trained with gradient descent, and model uncertainty is accounted for by ensembling.
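The two quantities such interval methods trade off can be sketched as standard coverage and width metrics. These are textbook definitions, not code from the paper:

```python
import numpy as np

def picp(y, lower, upper):
    # Prediction Interval Coverage Probability: fraction of targets
    # that fall inside their interval.
    y, lower, upper = map(np.asarray, (y, lower, upper))
    return float(np.mean((y >= lower) & (y <= upper)))

def mpiw(lower, upper):
    # Mean Prediction Interval Width: narrower is better at equal coverage.
    return float(np.mean(np.asarray(upper) - np.asarray(lower)))

# Three targets, the third falling outside its interval.
cov = picp([1.0, 2.0, 3.0], [0.0, 0.0, 4.0], [2.0, 3.0, 5.0])
width = mpiw([0.0, 0.0, 4.0], [2.0, 3.0, 5.0])
```

A quality-driven loss in this spirit penalizes width while constraining coverage to the specified proportion.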

Algorithmic Learning in a Random World

The foundational monograph on conformal prediction, presenting machine learning in a framework that takes the randomness of the world into account and equips predictions with valid measures of confidence.

Criteria of Efficiency for Conformal Prediction

It turns out that the most standard criteria of efficiency used in the literature on conformal prediction are not probabilistic, so optimal conformity measures for various criteria of efficiency in an idealised setting are studied.

Biomedical applications: Diagnostics and prognostics

  • Conformal Predictions for Reliable Machine Learning: Theory, Adaptations and Applications. Elsevier, 2014.