Cracking the Contextual Commonsense Code: Understanding Commonsense Reasoning Aptitude of Deep Contextual Representations

@article{Da2019CrackingTC,
  title={Cracking the Contextual Commonsense Code: Understanding Commonsense Reasoning Aptitude of Deep Contextual Representations},
  author={Jeff Da and Jungo Kasai},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.01157}
}
Pretrained deep contextual representations have advanced the state-of-the-art on various commonsense NLP tasks, but we lack a concrete understanding of the capability of these models. Thus, we investigate and challenge several aspects of BERT's commonsense representation abilities. First, we probe BERT's ability to classify various object attributes, demonstrating that BERT shows a strong ability in encoding various commonsense features in its embedding space, but is still deficient in many…
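The probing setup the abstract describes can be illustrated with a minimal sketch: freeze a pretrained BERT model, extract the contextual embedding of a target object noun, and fit a linear probe to predict a commonsense attribute. The attribute ("edible"), the toy sentences, and the labels below are illustrative assumptions, not the authors' actual probing tasks, datasets, or classifier configuration.

```python
# Minimal attribute-probing sketch (assumed setup, not the paper's exact method):
# extract frozen BERT embeddings for a target word and train a linear probe.
import torch
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Final-layer BERT embedding of `word` in `sentence`, averaged over its wordpieces."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, 768)
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    # Locate the word's wordpiece span inside the encoded sentence.
    for i in range(len(ids) - len(word_ids) + 1):
        if ids[i:i + len(word_ids)] == word_ids:
            return hidden[i:i + len(word_ids)].mean(dim=0)
    raise ValueError(f"{word!r} not found in {sentence!r}")

# Toy probing data: does the object carry the attribute "edible"? (assumed labels)
examples = [
    ("She put the apple on the table.", "apple", 1),
    ("He sliced the banana for breakfast.", "banana", 1),
    ("The hammer was left in the garage.", "hammer", 0),
    ("A rock rolled down the hill.", "rock", 0),
]

X = torch.stack([embed_word(s, w) for s, w, _ in examples]).numpy()
y = [label for _, _, label in examples]

# Linear probe: high accuracy suggests the frozen features linearly encode the attribute.
probe = LogisticRegression(max_iter=1000).fit(X, y)
print("train accuracy:", probe.score(X, y))
```

Attributes that such a linear probe fails to recover from frozen representations would point to the kind of deficiencies the abstract alludes to.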
10 Citations (partial list)
  • Revisiting the Prepositional-Phrase Attachment Problem Using Explicit Commonsense Knowledge
  • Measuring and Improving Consistency in Pretrained Language Models
  • SemGloVe: Semantic Co-occurrences for GloVe from BERT
  • Understanding in Artificial Intelligence
  • Probing Neural Language Models for Human Tacit Assumptions

References

(partial list; the paper cites 33 references)
  • Linguistic Knowledge and Transferability of Contextual Representations
  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  • SWAG: A Large-Scale Adversarial Dataset for Grounded Commonsense Inference
  • ERNIE: Enhanced Language Representation with Informative Entities
  • A Corpus and Cloze Evaluation for Deeper Understanding of Commonsense Stories
  • Do Neural Language Representations Learn Physical Commonsense?
  • ATOMIC: An Atlas of Machine Commonsense for If-Then Reasoning
  • SciBERT: Pretrained Contextualized Embeddings for Scientific Text
  • Deep contextualized word representations