On the Limits of Learning to Actively Learn Semantic Representations

@article{Koshorek2019OnTL,
  title={On the Limits of Learning to Actively Learn Semantic Representations},
  author={Omri Koshorek and Gabriel Stanovsky and Yichu Zhou and Vivek Srikumar and Jonathan Berant},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.02228}
}
One of the goals of natural language understanding is to develop models that map sentences into meaning representations. However, training such models requires expensive annotation of complex structures, which hinders their adoption. Learning to actively learn (LTAL) is a recent paradigm for reducing the amount of labeled data by learning a policy that selects which samples should be labeled. In this work, we examine LTAL for learning semantic representations, such as QA-SRL. We show that even…
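
To make the selection step concrete, here is a minimal sketch of a generic pool-based active learning loop using entropy-based uncertainty sampling. LTAL, as described in the abstract, differs in that the selection policy is itself learned rather than hand-crafted; the synthetic classification data and logistic-regression learner below are hypothetical stand-ins, not the QA-SRL setup studied in the paper.

```python
# Minimal pool-based active learning loop with entropy-based uncertainty
# sampling. This illustrates the "select which samples to label" step that
# LTAL replaces with a learned policy; the task and model here are
# hypothetical stand-ins for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for an annotation pool (real LTAL work targets
# structured outputs such as QA-SRL, not binary classification).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = list(rng.choice(len(X), size=20, replace=False))  # seed set
unlabeled = [i for i in range(len(X)) if i not in set(labeled)]

BUDGET, BATCH = 200, 20
model = LogisticRegression(max_iter=1000)

while len(labeled) < BUDGET:
    model.fit(X[labeled], y[labeled])

    # Score the unlabeled pool: predictive entropy as an uncertainty heuristic.
    probs = model.predict_proba(X[unlabeled])
    entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

    # Select the most uncertain examples to "annotate" next.
    picked = np.argsort(-entropy)[:BATCH]
    newly_labeled = [unlabeled[i] for i in picked]

    labeled.extend(newly_labeled)
    unlabeled = [i for i in unlabeled if i not in set(newly_labeled)]

print(f"Labeled {len(labeled)} examples; accuracy on full pool: "
      f"{model.score(X, y):.3f}")
```

In this sketch the acquisition function (entropy) is fixed; an LTAL approach would instead train a policy to score candidate samples, which is the design the paper puts under scrutiny.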
