• Corpus ID: 3526391

Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks

@article{Liang2018EnhancingTR,
  title={Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks},
  author={Shiyu Liang and Yixuan Li and Rayadurgam Srikant},
  journal={arXiv: Learning},
  year={2018}
}
We consider the problem of detecting out-of-distribution images in neural networks. [...] Our method is based on the observation that using temperature scaling and adding small perturbations to the input can separate the softmax score distributions between in- and out-of-distribution images, allowing for more effective detection. We show in a series of experiments that ODIN is compatible with diverse network architectures and datasets. It consistently outperforms the baseline approach by a large margin.
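The two ingredients named in the abstract, temperature scaling and a small input perturbation, can be sketched roughly as follows. This is a minimal PyTorch illustration, not the authors' code: the model handle model, the function name odin_score, and the default values of temperature and epsilon are assumptions chosen for the sketch.

import torch
import torch.nn.functional as F

def odin_score(model, x, temperature=1000.0, epsilon=0.0014):
    # Sketch of an ODIN-style score: nudge the input in the direction that
    # raises the temperature-scaled softmax score of the predicted class,
    # then take the maximum softmax probability of the perturbed input.
    x = x.clone().detach().requires_grad_(True)
    logits = model(x) / temperature
    pred = logits.argmax(dim=1)                  # no labels needed at test time
    loss = F.cross_entropy(logits, pred)         # -log softmax of predicted class
    grad, = torch.autograd.grad(loss, x)
    x_perturbed = (x - epsilon * grad.sign()).detach()
    with torch.no_grad():
        probs = F.softmax(model(x_perturbed) / temperature, dim=1)
    return probs.max(dim=1).values               # low values suggest OOD inputs

In-distribution images tend to receive higher perturbed, temperature-scaled scores than out-of-distribution ones, so a single threshold on this value separates the two.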
Generalized ODIN: Detecting Out-of-Distribution Image Without Learning From Out-of-Distribution Data
TLDR
This work builds on the popular ODIN method, proposing two strategies, a decomposed confidence score and a modified input pre-processing method, that free it from the need to tune on OoD data while improving its OoD detection performance.
Convolutional Neural Networks with Compression Complexity Pooling for Out-of-Distribution Image Detection
TLDR
This work proposes a novel out-of-distribution detection method, termed MALCOM, which neither uses any out-of-distribution samples nor retrains the model, and introduces a similarity metric based on the normalized compression distance that focuses on patterns shared between two sequences.
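For reference, the normalized compression distance mentioned here is a standard compressor-based similarity measure; a minimal sketch using zlib is below. How MALCOM encodes network activations into the compared sequences is specific to that paper and not reproduced here; the function name ncd is only illustrative.

import zlib

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: values near 0 for highly similar
    # sequences, near 1 for unrelated ones. C(.) is the compressed length.
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)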
Enhancing the Robustness of Prior Network in Out-of-Distribution Detection
TLDR
A perturbed prior network architecture is proposed, which can efficiently separate model-level uncertainty from data-level uncertainty via prior entropy, together with a concentration perturbation algorithm that adaptively adds noise to the concentration parameters so that in- and out-of-distribution images become better separable.
Unsupervised out-of-distribution detection using kernel density estimation
TLDR
An unsupervised OOD detection method based on kernel density estimation (KDE) that works with both classification and non-classification networks is proposed; it achieves results competitive with the state of the art on classification networks and improves on a segmentation network.
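A rough sketch of the general idea, fitting a KDE to in-distribution feature vectors and scoring test inputs by their log-density, is given below. It assumes scikit-learn and features already extracted as NumPy arrays; the bandwidth value and function names are placeholders, and the paper's actual per-layer or per-class setup may differ.

import numpy as np
from sklearn.neighbors import KernelDensity

def fit_kde(train_features: np.ndarray, bandwidth: float = 1.0) -> KernelDensity:
    # Fit a Gaussian KDE on features of in-distribution training data.
    return KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(train_features)

def kde_ood_scores(kde: KernelDensity, test_features: np.ndarray) -> np.ndarray:
    # Log-density under the in-distribution KDE; lower values suggest OOD inputs.
    return kde.score_samples(test_features)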
Exploring the Limits of Out-of-Distribution Detection
TLDR
It is demonstrated that large-scale pre-trained transformers can significantly improve the state-of-the-art (SOTA) on a range of near OOD tasks across different data modalities, and a new way of using just the names of outlier classes as a sole source of information without any accompanying images is explored.
An Efficient Data Augmentation Network for Out-of-Distribution Image Detection
TLDR
This paper proposes an efficient data augmentation network to detect out-of-distribution image data: a set of common geometric operations is applied to training and testing images, and the predicted probabilities of the augmented data are combined by an aggregation function into a confidence score that distinguishes in-distribution from out-of-distribution image data.
Class-wise Thresholding for Detecting Out-of-Distribution Data
TLDR
The problem of detecting out-of-distribution input data when using deep neural networks is considered, and a class-wise thresholding scheme is proposed that can be applied to most existing OoD detection algorithms and maintains similar OoD detection performance even in the presence of label shift.
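The idea of class-wise thresholding, one threshold per predicted class instead of a single global cutoff, can be sketched as follows. The quantile-based threshold choice and all names here are assumptions for illustration, not the paper's exact procedure.

import numpy as np

def classwise_thresholds(scores, preds, num_classes, quantile=0.05):
    # One threshold per predicted class, chosen from held-out in-distribution
    # scores so that roughly `quantile` of in-distribution inputs fall below it.
    thresholds = np.full(num_classes, -np.inf)
    for c in range(num_classes):
        class_scores = scores[preds == c]
        if class_scores.size > 0:
            thresholds[c] = np.quantile(class_scores, quantile)
    return thresholds

def is_ood(score, pred, thresholds):
    # Flag an input as OOD if its score falls below its predicted class's threshold.
    return score < thresholds[pred]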
Out-of-Distribution Detection Using Deep Neural Networks
  • 2019
Deep neural networks have achieved great success in classification tasks in recent years. However, one major obstacle on the path towards artificial intelligence is the inability of neural…
Why Out-of-distribution Detection in CNNs Does Not Like Mahalanobis - and What to Use Instead
TLDR
It is demonstrated that nonparametric LOF-based confidence estimation can improve current Mahalanobis-based SOTA or obtain similar performance in a simpler way.
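A minimal sketch of LOF-based confidence estimation on deep features, using scikit-learn's LocalOutlierFactor in novelty mode, is shown below; the specific layer, neighbour count, and preprocessing used in the paper are not reproduced here.

import numpy as np
from sklearn.neighbors import LocalOutlierFactor

def fit_lof(train_features: np.ndarray, n_neighbors: int = 20) -> LocalOutlierFactor:
    # novelty=True enables scoring of unseen test samples.
    return LocalOutlierFactor(n_neighbors=n_neighbors, novelty=True).fit(train_features)

def lof_scores(lof: LocalOutlierFactor, test_features: np.ndarray) -> np.ndarray:
    # Higher values indicate inputs closer to the in-distribution feature manifold.
    return lof.score_samples(test_features)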

References

SHOWING 1-10 OF 49 REFERENCES
Fully Convolutional Neural Network for Fast Anomaly Detection in Crowded Scenes
TLDR
An FCN-based architecture for anomaly detection and localization in crowded-scene videos is proposed, which includes two main components: one for feature representation and one for cascaded outlier detection.
Very Deep Convolutional Networks for Large-Scale Image Recognition
TLDR
This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
TLDR
A simple baseline that utilizes probabilities from softmax distributions is presented, its effectiveness is demonstrated across computer vision, natural language processing, and automatic speech recognition tasks, and it is shown that the baseline can sometimes be surpassed.
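The baseline described here is the maximum softmax probability score; a minimal PyTorch sketch is below (the function name msp_score and the model handle are assumptions).

import torch
import torch.nn.functional as F

@torch.no_grad()
def msp_score(model, x):
    # Maximum softmax probability: low values suggest misclassified or OOD inputs.
    return F.softmax(model(x), dim=1).max(dim=1).values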
LSUN: Construction of a Large-scale Image Dataset using Deep Learning with Humans in the Loop
TLDR
This work proposes to amplify human effort through a partially automated labeling scheme, leveraging deep learning with humans in the loop, and constructs a new image dataset, LSUN, which contains around one million labeled images for each of 10 scene categories and 20 object categories.
ImageNet classification with deep convolutional neural networks
TLDR
A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Deep Residual Learning for Image Recognition
TLDR
This work presents a residual learning framework to ease the training of networks that are substantially deeper than those used previously, and provides comprehensive empirical evidence showing that these residual networks are easier to optimize, and can gain accuracy from considerably increased depth.
Deep neural networks are easily fooled: High confidence predictions for unrecognizable images
TLDR
This work takes convolutional neural networks trained to perform well on either the ImageNet or MNIST datasets and uses evolutionary algorithms or gradient ascent to find images that the DNNs label with high confidence as belonging to each dataset class; the resulting fooling images are then used to raise questions about the generality of DNN computer vision.
On Calibration of Modern Neural Networks
TLDR
It is discovered that modern neural networks, unlike those from a decade ago, are poorly calibrated, and on most datasets, temperature scaling -- a single-parameter variant of Platt Scaling -- is surprisingly effective at calibrating predictions.
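Temperature scaling as summarized here fits a single scalar T on a held-out validation set by minimizing the negative log-likelihood of the logits divided by T. A minimal sketch follows; the optimizer choice and step count are assumptions, and fit_temperature is a placeholder name.

import torch
import torch.nn.functional as F

def fit_temperature(val_logits, val_labels, steps=200, lr=0.01):
    # Learn one scalar T > 0 minimizing validation NLL of softmax(logits / T).
    log_t = torch.zeros(1, requires_grad=True)   # optimize log T to keep T positive
    opt = torch.optim.Adam([log_t], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(val_logits / log_t.exp(), val_labels)
        loss.backward()
        opt.step()
    return log_t.exp().item()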
Learning Multiple Layers of Features from Tiny Images
TLDR
It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.
Intriguing properties of neural networks
TLDR
It is found that there is no distinction between individual high-level units and random linear combinations of high-level units, according to various methods of unit analysis, and it is suggested that it is the space, rather than the individual units, that contains the semantic information in the high layers of neural networks.