Learning Confidence for Out-of-Distribution Detection in Neural Networks

@article{DeVries2018LearningCF,
  title={Learning Confidence for Out-of-Distribution Detection in Neural Networks},
  author={Terrance DeVries and Graham W. Taylor},
  journal={CoRR},
  year={2018},
  volume={abs/1802.04865}
}
Modern neural networks are very powerful predictive models, but they are often incapable of recognizing when their predictions may be wrong. Closely related to this is the task of out-of-distribution detection, where a network must determine whether or not an input is outside of the set on which it is expected to safely perform. To jointly address these issues, we propose a method of learning confidence estimates for neural networks that is simple to implement and produces intuitively…
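The abstract describes learning a confidence estimate jointly with the classifier. A minimal NumPy sketch of the kind of "hinted prediction" loss the paper proposes: the softmax output is interpolated toward the one-hot target by a predicted confidence c, and a penalty of −log c discourages the network from always asking for hints. Function name, argument layout, and the λ weighting below are illustrative, not the paper's exact implementation.

```python
import numpy as np

def confidence_loss(probs, confidence, target_onehot, lam=0.1):
    """Cross-entropy on a confidence-interpolated prediction, plus a
    confidence penalty (sketch of the loss in DeVries & Taylor, 2018).

    probs          -- softmax output of the classifier, shape (num_classes,)
    confidence     -- scalar in (0, 1) predicted by a confidence branch
    target_onehot  -- one-hot ground-truth label, shape (num_classes,)
    lam            -- illustrative weight on the confidence penalty
    """
    # Interpolate the prediction toward the true label: p' = c*p + (1-c)*y
    adjusted = confidence * probs + (1.0 - confidence) * target_onehot
    # Task loss: standard cross-entropy, but on the adjusted prediction
    task_loss = -np.sum(target_onehot * np.log(adjusted))
    # Penalize low confidence so the network cannot always take the hint
    conf_loss = -np.log(confidence)
    return task_loss + lam * conf_loss
```

With confidence = 1.0 the adjusted prediction equals the raw softmax output and the penalty term vanishes, so the loss reduces to ordinary cross-entropy; at test time the learned confidence can be thresholded to flag out-of-distribution inputs.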
