Unsupervised Commonsense Question Answering with Self-Talk

@article{Shwartz2020UnsupervisedCQ,
  title={Unsupervised Commonsense Question Answering with Self-Talk},
  author={Vered Shwartz and Peter West and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.05483}
}
Natural language understanding involves reading between the lines with implicit background knowledge. Current systems either rely on pre-trained language models as the sole implicit source of world knowledge, or resort to external knowledge bases (KBs) to incorporate additional relevant knowledge. We propose an unsupervised framework based on \emph{self-talk} as a novel alternative to multiple-choice commonsense tasks. Inspired by inquiry-based discovery learning (Bruner, 1961), our approach…
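The self-talk procedure outlined in the abstract can be sketched as follows. This is a minimal sketch, not the authors' implementation: the question prefixes are hypothetical stand-ins for the paper's dataset-specific templates, and `generate`/`score` are caller-supplied stubs for a pretrained language model (continuing a prompt, and returning a plausibility score for a string, respectively).

```python
# Hypothetical clarification-question prefixes; the paper uses
# dataset-specific templates (e.g. "What is the definition of ...").
QUESTION_PREFIXES = ["What is the definition of", "What is the purpose of"]

def self_talk_answer(context, question, choices, generate, score):
    """Pick an answer choice after augmenting the context with
    self-generated clarification questions and answers.

    `generate(prompt)` continues a prompt with an LM; `score(text)`
    returns an LM plausibility score. Both are assumptions here, standing
    in for a real pretrained model.
    """
    clarifications = []
    for prefix in QUESTION_PREFIXES:
        # Ask the LM to complete a clarification question about the context...
        q = generate(context + " " + prefix)
        # ...then ask it to answer its own question.
        a = generate(context + " " + q)
        clarifications.append(a)
    # Append the self-generated answers as extra background knowledge.
    augmented = context + " " + " ".join(clarifications)
    # Choose the candidate the LM finds most plausible in the augmented context.
    return max(choices, key=lambda c: score(augmented + " " + question + " " + c))
```

With a real model, `generate` would be sampled continuations and `score` a (length-normalized) log-probability; the loop above is the unsupervised part — no task-specific training is involved.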
34 Citations
Commonsense Reasoning for Natural Language Processing
Self-Supervised Knowledge Triplet Learning for Zero-shot Question Answering
Knowledge-driven Self-supervision for Zero-shot Commonsense Question Answering
Thinking Aloud: Dynamic Context Generation Improves Zero-Shot Reasoning Performance of GPT-2
Learning to Rationalize for Nonmonotonic Reasoning with Distant Supervision
Dimensions of Commonsense Knowledge
Document-Level Event Argument Extraction by Conditional Generation
Enriching a Model's Notion of Belief using a Persistent Memory
Flexible Operations for Natural Language Deduction

References

Showing 1-10 of 76 references
Dynamic Knowledge Graph Construction for Zero-shot Commonsense Question Answering
Language Models as Knowledge Bases?
Commonsense for Generative Multi-Hop Question Answering Tasks
CommonsenseQA: A Question Answering Challenge Targeting Commonsense Knowledge
Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning
Language Models are Unsupervised Multitask Learners
Commonsense Knowledge Mining from Pretrained Models
Explain Yourself! Leveraging Language Models for Commonsense Reasoning
Improving Language Understanding by Generative Pre-Training