A Simple Neural Attentive Meta-Learner

@inproceedings{2017ASN,
  title={A Simple Neural Attentive Meta-Learner},
  author={Nikhil Mishra and Mostafa Rohaninejad and Xi Chen and Pieter Abbeel},
  year={2017}
}
  • Published 2017
Deep neural networks excel in regimes with large amounts of data, but tend to struggle when data is scarce or when they need to adapt quickly to changes in the task. In response, recent work in meta-learning proposes training a meta-learner on a distribution of similar tasks, in the hopes of generalization to novel but related tasks by learning a high-level strategy that captures the essence of the problem it is asked to solve. However, many recent meta-learning approaches are extensively hand-designed …
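
For a concrete picture of the episodic setup the abstract describes, here is a minimal sketch (not the paper's proposed SNAIL architecture): a generic black-box meta-learner is trained across a distribution of sinusoid-regression tasks, conditioning on a small support set to predict query targets. The task family, module names, and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal episodic meta-learning sketch (illustrative only, not SNAIL).
import torch
import torch.nn as nn

def sample_sine_task(k_support=5, k_query=5):
    """Sample one task from the distribution: y = A * sin(x + phase)."""
    amp = torch.rand(1) * 4.0 + 0.1
    phase = torch.rand(1) * 3.1416
    x = torch.rand(k_support + k_query, 1) * 10.0 - 5.0
    y = amp * torch.sin(x + phase)
    return (x[:k_support], y[:k_support]), (x[k_support:], y[k_support:])

class BlackBoxMetaLearner(nn.Module):
    """Encodes the support set into a task embedding, then predicts query y."""
    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                     nn.Linear(hidden, hidden))
        self.head = nn.Sequential(nn.Linear(hidden + 1, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1))

    def forward(self, sx, sy, qx):
        # Task embedding: mean over encoded (x, y) support pairs.
        task = self.encoder(torch.cat([sx, sy], dim=-1)).mean(0, keepdim=True)
        task = task.expand(qx.size(0), -1)
        return self.head(torch.cat([task, qx], dim=-1))

model = BlackBoxMetaLearner()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):  # meta-training: each step draws a fresh task
    (sx, sy), (qx, qy) = sample_sine_task()
    loss = nn.functional.mse_loss(model(sx, sy, qx), qy)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At meta-test time the same model is handed the support set of an unseen task and queried directly, with no further gradient updates; the high-level adaptation strategy lives in the learned weights.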
Highly Cited
This paper has 33 citations.

Statistics

[Chart: Citations per Year, 2017–2018]

Citation Velocity: 22

Averaging 22 citations per year over the last 2 years.
