Deep Contextualized Word Representations
- Matthew E. Peters, Mark Neumann, Luke Zettlemoyer
- Computer Science · North American Chapter of the Association for…
- 15 February 2018
A new type of deep contextualized word representation is introduced that models both complex characteristics of word use and how these uses vary across linguistic contexts, allowing downstream models to combine different types of semi-supervised signals.
QuAC: Question Answering in Context
- Eunsol Choi, He He, Luke Zettlemoyer
- Computer Science · Conference on Empirical Methods in Natural…
- 21 August 2018
QuAC poses challenges not found in existing machine comprehension datasets: its questions are often more open-ended, unanswerable, or only meaningful within the dialog context, as shown in a detailed qualitative evaluation.
Deep Unordered Composition Rivals Syntactic Methods for Text Classification
- Mohit Iyyer, Varun Manjunatha, Jordan L. Boyd-Graber, Hal Daumé
- Computer Science · Annual Meeting of the Association for…
- 1 July 2015
This work presents a simple deep neural network that competes with, and in some cases outperforms, syntactically-aware models on sentiment analysis and factoid question answering tasks, while taking only a fraction of the training time.
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing
- A. Kumar, Ozan Irsoy, R. Socher
- Computer Science · International Conference on Machine Learning
- 24 June 2015
The dynamic memory network (DMN), a neural network architecture which processes input sequences and questions, forms episodic memories, and generates relevant answers, is introduced.
Adversarial Example Generation with Syntactically Controlled Paraphrase Networks
- Mohit Iyyer, J. Wieting, Kevin Gimpel, Luke Zettlemoyer
- Computer Science · North American Chapter of the Association for…
- 17 April 2018
A combination of automated and human evaluations shows that SCPNs generate paraphrases that follow their target specifications without decreasing paraphrase quality compared to baseline (uncontrolled) paraphrase systems.
Search-based Neural Structured Learning for Sequential Question Answering
- Mohit Iyyer, Wen-tau Yih, Ming-Wei Chang
- Computer Science · Annual Meeting of the Association for…
- 8 May 2017
This work proposes a novel dynamic neural semantic parsing framework, trained with a weakly supervised reward-guided search, that effectively leverages sequential context to outperform state-of-the-art QA systems designed to answer highly complex questions.
Reformulating Unsupervised Style Transfer as Paraphrase Generation
- Kalpesh Krishna, J. Wieting, Mohit Iyyer
- Computer Science · Conference on Empirical Methods in Natural…
- 12 October 2020
This paper reformulates unsupervised style transfer as a paraphrase generation problem, and presents a simple methodology based on fine-tuning pretrained language models on automatically generated paraphrase data that significantly outperforms state-of-the-art style transfer systems on both human and automatic evaluations.
Pathologies of Neural Models Make Interpretations Difficult
- Shi Feng, Eric Wallace, Alvin Grissom II, Mohit Iyyer, Pedro Rodriguez, Jordan L. Boyd-Graber
- Computer Science · Conference on Empirical Methods in Natural…
- 20 April 2018
This work uses input reduction, which iteratively removes the least important word from the input, to expose pathological behaviors of neural models: the remaining words appear nonsensical to humans, yet are not the words that interpretation methods identify as important.
Unsupervised Latent Tree Induction with Deep Inside-Outside Recursive Auto-Encoders
- Andrew Drozdov, Pat Verga, Mohit Yadav, Mohit Iyyer, A. McCallum
- Computer Science · North American Chapter of the Association for…
- 3 April 2019
DIORA is introduced: a fully unsupervised method for discovering syntax that simultaneously learns representations for the constituents of the induced tree, and that outperforms previously reported results for unsupervised binary constituency parsing on the benchmark WSJ dataset.
Exploring and Predicting Transferability across NLP Tasks
- Tu Vu, Tong Wang, Mohit Iyyer
- Computer Science · Conference on Empirical Methods in Natural…
- 2 May 2020
The results show that transfer learning is more beneficial than previously thought, especially when target task data is scarce, and can improve performance even when the source task is small or differs substantially from the target task.
...