Corpus ID: 221761345

'Less Than One'-Shot Learning: Learning N Classes From M<N Samples

@article{Sucholutsky2020LessTO,
  title={'Less Than One'-Shot Learning: Learning N Classes From M<N Samples},
  author={Ilia Sucholutsky and Matthias Schonlau},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.08449}
}
  • Ilia Sucholutsky, Matthias Schonlau
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Deep neural networks require large training sets but suffer from high computational cost and long training times. Training on much smaller training sets while maintaining nearly the same accuracy would be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. One-shot learning is an extreme form of few-shot learning where the model must learn a new class from a single example. We propose the 'less than one'-shot learning task, in which a model must learn N new classes given only M < N examples, and show that this is achievable with the help of soft labels.
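The soft-label idea at the core of 'less than one'-shot learning can be illustrated with a distance-weighted soft-label nearest-neighbor classifier in the spirit of the paper's SLaPkNN. The sketch below is illustrative only: the function name, the inverse-distance weighting, and the toy prototype layout are assumptions rather than the authors' code. It shows how M = 2 prototypes whose labels spread probability mass over several classes can carve out N = 3 decision regions.

```python
import numpy as np

def soft_label_knn_predict(X, prototypes, soft_labels, k=2, eps=1e-9):
    """Distance-weighted soft-label kNN (a sketch in the spirit of SLaPkNN).

    prototypes  : (M, d) prototype locations
    soft_labels : (M, N) rows are probability distributions over N classes
    Returns hard class predictions for the query points in X, shape (len(X),).
    """
    # Euclidean distances from each query to each prototype: shape (Q, M)
    dists = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=-1)
    # Indices and distances of the k nearest prototypes per query
    nearest = np.argsort(dists, axis=1)[:, :k]
    near_d = np.take_along_axis(dists, nearest, axis=1)
    # Inverse-distance weights (an illustrative choice, not necessarily the
    # paper's exact variant) applied to the neighbors' soft label vectors
    weights = 1.0 / (near_d + eps)
    scores = (weights[:, :, None] * soft_labels[nearest]).sum(axis=1)
    return scores.argmax(axis=1)

# Toy 'less than one'-shot setup: M = 2 prototypes separating N = 3 classes.
prototypes = np.array([[0.0, 0.0], [1.0, 0.0]])
soft_labels = np.array([
    [0.6, 0.4, 0.0],  # mostly class 0, some mass on class 1
    [0.0, 0.4, 0.6],  # mostly class 2, some mass on class 1
])
queries = np.array([[-0.5, 0.0], [0.5, 0.0], [1.5, 0.0]])
print(soft_label_knn_predict(queries, prototypes, soft_labels))  # -> [0 1 2]
```

Because both prototypes put part of their label mass on class 1, the region between them is won by that shared class, yielding three decision regions from only two labeled points.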
    3 Citations

    Flexible Dataset Distillation: Learn Labels Instead of Images (4 citations, highly influenced)
    Optimal 1-NN Prototypes for Pathological Geometries
    SecDD: Efficient and Secure Method for Remotely Training Neural Networks

    References

    Showing 1-10 of 32 references
    Matching Networks for One Shot Learning (1,973 citations)
    Generalizing from a Few Examples: A Survey on Few-Shot Learning (62 citations)
    Prototypical Networks for Few-shot Learning (1,628 citations)
    Active Learning for Convolutional Neural Networks: A Core-Set Approach (281 citations)
    One-shot learning of object categories (1,849 citations)
    Pruning training sets for learning of object categories (121 citations)
    Soft-Label Dataset Distillation and Text Dataset Distillation (8 citations)
    Human-level concept learning through probabilistic program induction (1,384 citations)
    Core Vector Machines: Fast SVM Training on Very Large Data Sets (918 citations)
    Dataset Distillation (46 citations)