COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs
@inproceedings{Hwang2021COMETATOMIC2O,
  title     = {COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs},
  author    = {Jena D. Hwang and Chandra Bhagavatula and Ronan Le Bras and Jeff Da and Keisuke Sakaguchi and Antoine Bosselut and Yejin Choi},
  booktitle = {AAAI},
  year      = {2021}
}
Recent years have brought about a renewed interest in commonsense representation and reasoning in the field of natural language understanding. The development of new commonsense knowledge graphs (CSKG) has been central to these advances as their diverse facts can be used and referenced by machine learning models for tackling new and challenging tasks. At the same time, there remain questions about the quality and coverage of these resources due to the massive scale required to comprehensively…
78 Citations
Benchmarking Commonsense Knowledge Base Population with an Effective Evaluation Dataset
- Computer Science · EMNLP
- 2021
Reasoning over commonsense knowledge bases (CSKB) whose elements are in the form of free-text is an important yet hard task in NLP. While CSKB completion only fills the missing links within the…
Analyzing Commonsense Emergence in Few-shot Knowledge Models
- Computer Science · AKBC
- 2021
The results show that commonsense knowledge models can rapidly adapt from limited examples, indicating that KG fine-tuning serves to learn an interface to encoded knowledge learned during pretraining.
Understanding Few-Shot Commonsense Knowledge Models
- Computer Science · ArXiv
- 2021
This work investigates training commonsense knowledge models in a few-shot setting with limited tuples per commonsense relation in the graph, and finds that human quality ratings for knowledge produced by a few-shot trained system can come within 6% of knowledge produced by fully supervised systems.
Commonsense Knowledge in Word Associations and ConceptNet
- Computer Science · CoNLL
- 2021
An in-depth comparison of two large-scale resources of general knowledge: ConceptNet, an engineered relational database, and SWOW, a knowledge graph derived from crowd-sourced word associations. The comparison shows empirically that both resources improve downstream task performance on commonsense reasoning benchmarks over text-only baselines.
Commonsense Reasoning: How do Neuro-Symbolic and Neuro-only Approaches Compare?
- Computer Science · CIKM Workshops
- 2021
This paper sets out to compare a Neuro-Symbolic model with mainstream Neuro-only models when they are tasked with solving commonsense reasoning problems, and indicates that there is no clear advantage to either approach.
Improving Unsupervised Commonsense Reasoning Using Knowledge-Enabled Natural Language Inference
- Computer Science · EMNLP
- 2021
This work shows the effectiveness of using a common framework, Natural Language Inference (NLI), to solve diverse commonsense reasoning tasks by leveraging transfer learning from large NLI datasets and injecting crucial knowledge from commonsense sources such as ATOMIC 2020 and ConceptNet.
Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference
- Computer Science · Findings
- 2022
Commonsense inference poses a unique challenge to reason and generate the physical, social, and causal conditions of a given event. Existing approaches to commonsense inference utilize commonsense…
GreaseLM: Graph REASoning Enhanced Language Models for Question Answering
- Computer Science · ArXiv
- 2022
This work proposes GREASELM, a new model that fuses encoded representations from pretrained LMs and graph neural networks over multiple layers of modality interaction operations, allowing language context representations to be grounded by structured world knowledge, and allowing linguistic nuances in the context to inform the graph representations of knowledge.
Shortcutted Commonsense: Data Spuriousness in Deep Learning of Commonsense Reasoning
- Computer Science · EMNLP
- 2021
A study of several prominent benchmarks that involve commonsense reasoning, along with a number of key stress experiments, seeking insight into whether the models are learning transferable generalizations intrinsic to the problem at stake or merely exploiting incidental shortcuts in the data items.
References
Showing 1–10 of 46 references
COMET: Commonsense Transformers for Automatic Knowledge Graph Construction
- Computer Science · ACL
- 2019
This investigation reveals promising results when implicit knowledge from deep pre-trained language models is transferred to generate explicit knowledge in commonsense knowledge graphs, suggesting that generative commonsense models for automatic commonsense KB completion could soon be a plausible alternative to extractive methods.
KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning
- Computer Science · EMNLP
- 2019
This paper proposes a textual inference framework for answering commonsense questions, which effectively utilizes external, structured commonsense knowledge graphs to perform explainable inferences.
TransOMCS: From Linguistic Graphs to Commonsense Knowledge
- Computer Science · ArXiv
- 2020
Experimental results demonstrate the transferability of linguistic knowledge to commonsense knowledge and the effectiveness of the proposed approach in terms of quantity, novelty, and quality.
Commonsense Knowledge Mining from Pretrained Models
- Computer Science · EMNLP
- 2019
This work develops a method for generating commonsense knowledge using a large, pre-trained bidirectional language model, which can be used to rank a triple's validity by the estimated pointwise mutual information between the two entities.
Language Models as Knowledge Bases?
- Computer Science · EMNLP
- 2019
An in-depth analysis of the relational knowledge already present (without fine-tuning) in a wide range of state-of-the-art pretrained language models finds that BERT contains relational knowledge competitive with traditional NLP methods that have some access to oracle knowledge.
ATOMIC: An Atlas of Machine Commonsense for If-Then Reasoning
- Computer Science · AAAI
- 2019
Experimental results demonstrate that multitask models that incorporate the hierarchical structure of if-then relation types lead to more accurate inference compared to models trained in isolation, as measured by both automatic and human evaluation.
Commonsense Knowledge Base Completion
- Computer Science · ACL
- 2016
This work develops neural network models for scoring tuples over arbitrary phrases and evaluates them by their ability to distinguish true held-out tuples from false ones, finding strong performance from a bilinear model that uses a simple additive architecture to model phrases.
Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning
- Computer Science · ACL
- 2020
This paper introduces a new scoring method that casts a plausibility ranking task in a full-text format and leverages the masked language modeling head tuned during the pre-training phase; it requires less annotated data than the standard classifier approach to reach equivalent performance.
PIQA: Reasoning about Physical Commonsense in Natural Language
- Computer Science · AAAI
- 2020
The task of physical commonsense reasoning and a corresponding benchmark dataset, Physical Interaction: Question Answering (PIQA), are introduced, along with an analysis of the dimensions of knowledge that existing models lack, which offers significant opportunities for future research.
Commonsense Evidence Generation and Injection in Reading Comprehension
- Computer Science · SIGDIAL
- 2020
A Commonsense Evidence Generation and Injection framework for reading comprehension, named CEGI, which injects two kinds of auxiliary commonsense evidence into comprehensive reading to equip the machine with the ability of rational thinking.