Corpus ID: 42837927

Deep Active Learning over the Long Tail

@article{Geifman2017DeepAL,
  title={Deep Active Learning over the Long Tail},
  author={Yonatan Geifman and Ran El-Yaniv},
  journal={ArXiv},
  year={2017},
  volume={abs/1711.00941}
}
This paper is concerned with pool-based active learning for deep neural networks. Motivated by coreset dataset compression ideas, we present a novel active learning algorithm that queries consecutive points from the pool using farthest-first traversals in the space of neural activation over a representation layer. We show consistent and overwhelming improvement in sample complexity over passive learning (random sampling) for three datasets: MNIST, CIFAR-10, and CIFAR-100. In addition, our… 
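
The query rule itself is compact. As a minimal sketch (not the authors' exact implementation; the function name, numpy-only setup, and Euclidean distance are assumptions), farthest-first traversal repeatedly queries the pool point whose activation vector is farthest from its nearest already-selected point:

import numpy as np

def farthest_first_traversal(embeddings, labeled_idx, budget):
    # embeddings : (n, d) activations from a representation layer
    # labeled_idx: indices of points that are already labeled
    # budget     : number of new points to query
    n = embeddings.shape[0]
    min_dist = np.full(n, np.inf)
    for j in labeled_idx:
        # distance from every pool point to its nearest selected point
        min_dist = np.minimum(min_dist, np.linalg.norm(embeddings - embeddings[j], axis=1))
    queries = []
    for _ in range(budget):
        i = int(np.argmax(min_dist))  # farthest from the current selected set
        queries.append(i)
        min_dist = np.minimum(min_dist, np.linalg.norm(embeddings - embeddings[i], axis=1))
    return queries

Each new query becomes a "center", so the cached nearest-center distances need only one update per iteration, giving O(n) work per queried point.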

Citations

Deep Active Learning with a Neural Architecture Search

TLDR
This work proposes a novel active learning strategy whereby the learning algorithm searches for effective architectures on the fly while actively learning, and shows that the proposed approach overwhelmingly outperforms active learning with fixed architectures.

Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds

TLDR
This work designs a new algorithm for batch active learning with deep neural network models that samples groups of points that are disparate and high-magnitude when represented in a hallucinated gradient space, and shows that while other approaches sometimes succeed for particular batch sizes or architectures, BADGE consistently performs as well or better, making it a versatile option for practical active learning problems.
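
In sketch form (hedged: the function names and numpy-only setup are illustrative, not the authors' code), the "hallucinated" gradient is the cross-entropy gradient with respect to the last-layer weights computed as if the model's own prediction were the true label, and the batch is chosen by k-means++ seeding over these embeddings, which favors points that are simultaneously high-magnitude (uncertain) and mutually distant (diverse):

import numpy as np

def hallucinated_gradient_embeddings(probs, penultimate):
    # probs      : (n, k) softmax outputs
    # penultimate: (n, d) penultimate-layer activations
    n = probs.shape[0]
    g = probs.copy()
    g[np.arange(n), probs.argmax(axis=1)] -= 1.0  # p - one_hot(predicted label)
    # per-example outer product, flattened to an (n, k*d) embedding
    return (g[:, :, None] * penultimate[:, None, :]).reshape(n, -1)

def kmeanspp_select(emb, budget, seed=0):
    # k-means++ seeding: sample proportional to squared distance
    # to the nearest center chosen so far
    rng = np.random.default_rng(seed)
    centers = [int(rng.integers(len(emb)))]
    d2 = np.sum((emb - emb[centers[0]]) ** 2, axis=1)
    for _ in range(budget - 1):
        i = int(rng.choice(len(emb), p=d2 / d2.sum()))
        centers.append(i)
        d2 = np.minimum(d2, np.sum((emb - emb[i]) ** 2, axis=1))
    return centers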

Gone Fishing: Neural Active Learning with Fisher Embeddings

TLDR
This paper motivates and revisits a classic, Fisher-based active selection objective, and proposes BAIT, a practical, tractable, and high-performing algorithm that makes it viable for use with neural models.

Discriminative Active Learning

TLDR
Experimental results show the proposed batch-mode active learning algorithm, Discriminative Active Learning, to be on par with state-of-the-art methods at medium and large query batch sizes, while being simple to implement and to extend to domains beyond classification tasks.

Online Active Learning with Surrogate Loss Functions

We derive a novel active learning algorithm in the streaming setting for binary classification tasks. The algorithm leverages weak labels to minimize the number of label requests, and trains a model…

Deep Active Learning: Unified and Principled Method for Query and Training

TLDR
A unified and principled method for both the querying and training processes in deep batch active learning is proposed, with theoretical insights derived from modeling the interactive active learning procedure as distribution matching under the Wasserstein distance.

Active Learning Through a Covering Lens

TLDR
This work proposes ProbCover, a new active learning algorithm for the low-budget regime that seeks to maximize probability coverage, and describes a dual view of the formulation from which strategies suited to the high-budget regime can be derived, related to existing methods like Coreset.
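
A toy reading of the coverage objective (the fixed radius, greedy loop, and O(n^2) distance matrix below are illustrative simplifications): each candidate "covers" every pool point within a ball of radius delta in representation space, and selection greedily picks whichever point covers the most still-uncovered points.

import numpy as np

def probcover_style_select(embeddings, budget, delta):
    # delta: coverage radius, a tuned hyperparameter
    dists = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    covers = dists <= delta                       # (n, n) coverage graph
    uncovered = np.ones(len(embeddings), dtype=bool)
    selected = []
    for _ in range(budget):
        gains = (covers & uncovered).sum(axis=1)  # uncovered points each candidate covers
        i = int(np.argmax(gains))
        selected.append(i)
        uncovered &= ~covers[i]                   # mark its ball as covered
    return selected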

ALLSH: Active Learning Guided by Local Sensitivity and Hardness

TLDR
This work proposes to retrieve unlabeled samples with a local sensitivity and hardness-aware acquisition function that generates data copies through local perturbations and selects data points whose predictive likelihoods diverge the most from their copies.
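
As a sketch (all names here are illustrative; in the paper the perturbed copies come from a separate augmentation or paraphrasing pipeline), the acquisition score is a divergence between the model's predictive distribution on a point and on its locally perturbed copy, and the most sensitive points are queried:

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # row-wise KL(p || q) between predictive distributions
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def sensitivity_select(probs_original, probs_perturbed, k):
    # higher score = prediction moves more under a local perturbation
    scores = kl_divergence(probs_original, probs_perturbed)
    return np.argsort(-scores)[:k]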

Diffusion-based Deep Active Learning

TLDR
This work proposes a versatile and efficient criterion that automatically switches from exploration to refinement when the distribution has been sufficiently mapped, relying on diffusion of the existing label information over a graph constructed from the neural network's hidden representation of the data set.
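
The diffusion step admits a compact sketch (the graph construction, damping constant, and iteration count here are assumptions): one-hot label mass is repeatedly propagated over a row-normalized similarity graph built on the network's hidden representations, with labeled points clamped to their known labels.

import numpy as np

def diffuse_labels(W, labels, labeled_idx, alpha=0.9, iters=50):
    # W          : (n, n) row-normalized similarity graph over hidden features
    # labels     : (n,) integer labels, valid only at labeled_idx
    # labeled_idx: indices of points with known labels
    n, k = W.shape[0], int(labels[labeled_idx].max()) + 1
    Y = np.zeros((n, k))
    Y[labeled_idx, labels[labeled_idx]] = 1.0    # one-hot seed mass
    F = Y.copy()
    for _ in range(iters):
        F = alpha * (W @ F) + (1 - alpha) * Y    # propagate and re-inject seeds
        F[labeled_idx] = Y[labeled_idx]          # clamp known labels
    return F  # rows far from one-hot mark poorly mapped regions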
...

References

Showing 1-10 of 23 references

Coresets for Nonparametric Estimation - the Case of DP-Means

TLDR
This work shows the existence of coresets for DP-Means - a prototypical nonparametric clustering problem - and provides a practical construction algorithm that allows one to efficiently trade off computation time against approximation error, and thus scale DP-Means to large datasets.

Deep Bayesian Active Learning with Image Data

TLDR
This paper develops an active learning framework for high dimensional data, a task which has been extremely challenging so far, with very sparse existing literature, and demonstrates its active learning techniques with image data, obtaining a significant improvement on existing active learning approaches.

Efficient and Parsimonious Agnostic Active Learning

We develop a new active learning algorithm for the streaming setting satisfying three important properties: 1) it provably works for any classifier representation and classification problem, including…

Improving generalization with active learning

TLDR
A formalism for active concept learning called selective sampling is described and it is shown how it may be approximately implemented by a neural network.

Maximum Margin Coresets for Active and Noise Tolerant Learning

TLDR
This work provides a direct algorithm and analysis for constructing large margin coresets and shows various applications including a novel coreset based analysis of large margin active learning and a polynomial time algorithm for agnostic learning in the presence of outlier noise.

Cost-Effective Active Learning for Deep Image Classification

TLDR
This paper proposes a novel active learning (AL) framework capable of building a competitive classifier with an optimal feature representation from a limited amount of labeled training instances in an incremental learning manner, and incorporates deep convolutional neural networks into AL.
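
The incremental scheme can be caricatured in a few lines (hedged: the max-probability confidence measure and the threshold are illustrative; the paper studies several uncertainty criteria): the least confident points are sent to the oracle, while the most confident ones are pseudo-labeled with the model's own predictions at no labeling cost.

import numpy as np

def ceal_style_split(probs, k, threshold):
    # probs: (n, c) softmax outputs on the unlabeled pool
    confidence = probs.max(axis=1)
    query_idx = np.argsort(confidence)[:k]              # least confident -> oracle
    pseudo_idx = np.where(confidence >= threshold)[0]   # most confident -> pseudo-label
    pseudo_labels = probs[pseudo_idx].argmax(axis=1)
    return query_idx, pseudo_idx, pseudo_labels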

Active Learning via Perfect Selective Classification

TLDR
A reduction of active learning to selective classification that preserves fast rates is shown and exponential target-independent label complexity speedup is derived for actively learning general (non-homogeneous) linear classifiers when the data distribution is an arbitrary high dimensional mixture of Gaussians.

Selective Classification for Deep Neural Networks

TLDR
A method to construct a selective classifier given a trained neural network, which allows a user to set a desired risk level; the classifier then rejects instances as needed to guarantee the desired risk (with high probability).
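
A toy version of the resulting decision rule (illustrative only: the paper thresholds via a high-probability generalization bound, not the raw empirical risk used here) picks the most permissive confidence threshold whose accepted calibration examples still meet the user's target risk:

import numpy as np

def confidence_threshold(conf, correct, target_risk):
    # conf   : (n,) confidence scores on a held-out calibration set
    # correct: (n,) booleans, whether each prediction was correct
    order = np.argsort(-conf)                      # most confident first
    err = np.cumsum(~correct[order]) / np.arange(1, len(conf) + 1)
    ok = np.where(err <= target_risk)[0]
    if len(ok) == 0:
        return np.inf                              # reject everything
    return conf[order[ok[-1]]]                     # accept points with conf >= this value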

Online Choice of Active Learning Algorithms

TLDR
Using an ensemble containing two of the best-known active learning algorithms and a new algorithm, the resulting active learning master algorithm is empirically shown to consistently perform almost as well as, and sometimes outperform, the best algorithm in the ensemble on a range of classification problems.

Very Deep Convolutional Networks for Large-Scale Image Recognition

TLDR
This work investigates the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting using an architecture with very small convolution filters, which shows that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.