Commonsense Knowledge Base Completion with Structural and Semantic Context
@inproceedings{Malaviya2020CommonsenseKB,
  title     = {Commonsense Knowledge Base Completion with Structural and Semantic Context},
  author    = {Chaitanya Malaviya and Chandra Bhagavatula and Antoine Bosselut and Yejin Choi},
  booktitle = {AAAI},
  year      = {2020}
}
Automatic KB completion for commonsense knowledge graphs (e.g., ATOMIC and ConceptNet) poses unique challenges compared to the much-studied conventional knowledge bases (e.g., Freebase). Commonsense knowledge graphs use free-form text to represent nodes, resulting in orders of magnitude more nodes than conventional KBs (∼18x more nodes in ATOMIC than in Freebase (FB15K-237)). Importantly, this implies significantly sparser graph structures, a major challenge for existing KB…
61 Citations
Learning from Missing Relations: Contrastive Learning with Commonsense Knowledge Graphs for Commonsense Inference
- Computer Science · FINDINGS
- 2022
This paper focuses on addressing missing relations in commonsense knowledge graphs, and proposes a novel contrastive learning framework called SOLAR, which outperforms the state-of-the-art commonsense transformer on commonsense inference with ConceptNet by 1.84%.
StATIK: Structure and Text for Inductive Knowledge Graph Completion
- Computer Science · NAACL-HLT
- 2022
StATIK uses language models to extract semantic information from text descriptions while using message passing neural networks to capture structural information, and achieves state-of-the-art results on three challenging inductive benchmarks.
Effective use of BERT in graph embeddings for sparse knowledge graph completion
- Computer Science · SAC
- 2022
A BERT-based method (BERT-ConvE) is proposed that exploits transfer learning from BERT in combination with the convolutional network model ConvE; it is suitable for sparse graphs, as demonstrated by empirical studies on the ATOMIC and sparsified-FB15k-237 datasets.
Robust Knowledge Graph Completion with Stacked Convolutions and a Student Re-Ranking Network
- Computer Science · ACL
- 2021
A deep convolutional network is developed that utilizes textual entity representations; it is demonstrated that this model outperforms recent KG completion methods in this challenging setting and that its performance improvements stem primarily from its robustness to sparsity.
Neural-Symbolic Commonsense Reasoner with Relation Predictors
- Computer Science · ACL
- 2021
A neural-symbolic reasoner capable of reasoning over large-scale dynamic CKGs is presented; the logic rules for reasoning over CKGs are learned by the model during training, which helps generalize prediction to newly introduced events.
Inductive Learning on Commonsense Knowledge Graph Completion
- Computer Science · 2021 International Joint Conference on Neural Networks (IJCNN)
- 2021
A novel learning framework named InductivE is developed that densifies CKGs by adding edges among semantically related entities, providing more supportive information for unseen entities and leading to better generalization ability of entity embeddings for unseen entities.
DualTKB: A Dual Learning Bridge between Text and Knowledge Base
- Computer Science · EMNLP
- 2020
This work investigates the impact of weak supervision by creating a weakly supervised dataset and shows that even a slight amount of supervision can significantly improve the model performance and enable better-quality transfers.
Object-Action Association Extraction from Knowledge Graphs
- Computer Science · SEMANTiCS
- 2021
This paper proposes a novel method for extracting and evaluating relations between objects and actions from knowledge graphs, such as ConceptNet and WordNet, taking into consideration semantic similarity methods.
Benchmarking Commonsense Knowledge Base Population with an Effective Evaluation Dataset
- Computer Science · EMNLP
- 2021
Reasoning over commonsense knowledge bases (CSKB) whose elements are in the form of free-text is an important yet hard task in NLP. While CSKB completion only fills the missing links within the…
Semantic Categorization of Social Knowledge for Commonsense Question Answering
- Computer Science · SUSTAINLP
- 2021
This work proposes to categorize the semantics needed for these QA tasks, using SocialIQA as an example, and further trains neural QA models to incorporate such social knowledge categories and relation information from a knowledge base.
References
SHOWING 1-10 OF 32 REFERENCES
Convolutional 2D Knowledge Graph Embeddings
- Computer Science · AAAI
- 2018
ConvE, a multi-layer convolutional network model for link prediction, is introduced, and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across all datasets.
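As a rough illustration (a sketch of the general idea, not the paper's implementation, with made-up dimensions), ConvE reshapes the head-entity and relation embeddings into 2D grids, stacks them, and runs 2D convolutions over the result; the input construction can be sketched as:

```python
import numpy as np

def conve_input(e_head, e_rel, width=4):
    """Sketch of ConvE's input construction: reshape the head-entity
    and relation embeddings into 2D grids and stack them, forming the
    'image' that the 2D convolutional layers then operate on.
    Dimensions here are illustrative, not the paper's."""
    grid_head = e_head.reshape(-1, width)
    grid_rel = e_rel.reshape(-1, width)
    return np.concatenate([grid_head, grid_rel], axis=0)

# Two 8-dimensional embeddings become a 4x4 input grid.
x = conve_input(np.arange(8.0), np.arange(8.0))
print(x.shape)  # prints (4, 4)
```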
Translating Embeddings for Modeling Multi-relational Data
- Computer Science · NIPS
- 2013
TransE is proposed, a method that models relationships by interpreting them as translations operating on the low-dimensional embeddings of the entities; extensive experiments show that TransE significantly outperforms state-of-the-art methods in link prediction on two knowledge bases.
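The translation idea can be made concrete with a minimal sketch (illustrative only; the vectors are toy data, not learned embeddings):

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE energy for a triple (h, r, t): the distance between the
    translated head h + r and the tail t. Lower = more plausible."""
    return np.linalg.norm(h + r - t, ord=norm)

# Toy vectors: a tail that equals head + relation scores 0 (a perfect fit).
h = np.array([0.1, 0.2, 0.3])
r = np.array([0.4, -0.1, 0.0])
t = h + r
print(transe_score(h, r, t))  # prints 0.0
```

Training then pushes observed triples toward low energy and corrupted triples toward high energy via a margin-based ranking loss.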
COMET: Commonsense Transformers for Automatic Knowledge Graph Construction
- Computer Science · ACL
- 2019
This investigation reveals promising results when implicit knowledge from deep pre-trained language models is transferred to generate explicit knowledge in commonsense knowledge graphs, and suggests that using generative commonsense models for automatic commonsense KB completion could soon be a plausible alternative to extractive methods.
Language Models as Knowledge Bases?
- Computer Science · EMNLP
- 2019
An in-depth analysis of the relational knowledge already present (without fine-tuning) in a wide range of state-of-the-art pretrained language models finds that BERT contains relational knowledge competitive with traditional NLP methods that have some access to oracle knowledge.
End-to-end Structure-Aware Convolutional Networks for Knowledge Base Completion
- Computer Science · AAAI
- 2019
This work proposes a novel end-to-end Structure-Aware Convolutional Network (SACN) that takes the benefit of GCN and ConvE together, and demonstrates the effectiveness of the proposed SACN on standard FB15k-237 and WN18RR datasets.
Commonsense mining as knowledge base completion? A study on the impact of novelty
- Computer Science · ArXiv
- 2018
A simple baseline method is shown to outperform the previous state of the art at predicting more novel triples, and novelty of predicted triples with respect to the training set is proposed as an important factor in interpreting results.
ConceptNet 5: A Large Semantic Network for Relational Knowledge
- Computer Science · The People's Web Meets NLP
- 2013
The latest iteration of ConceptNet is presented, ConceptNet 5, with a focus on its fundamental design decisions and ways to interoperate with it.
Modeling Relational Data with Graph Convolutional Networks
- Computer Science · ESWC
- 2018
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
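For reference, the DistMult decoder mentioned here scores a triple with a trilinear product over the embeddings (a minimal sketch with toy vectors; the R-GCN encoder that produces the entity embeddings is omitted):

```python
import numpy as np

def distmult_score(h, r, t):
    """DistMult: score(h, r, t) = sum_i h_i * r_i * t_i,
    i.e. a bilinear form with a diagonal relation matrix."""
    return float(np.sum(h * r * t))

h = np.array([1.0, 2.0])
r = np.array([1.0, 1.0])
t = np.array([3.0, 1.0])
print(distmult_score(h, r, t))  # prints 5.0
```

Note the score is symmetric in h and t, one reason richer encoders or decoders are often layered on top.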
Commonsense Knowledge Base Completion
- Computer Science · ACL
- 2016
This work develops neural network models for scoring tuples on arbitrary phrases and evaluates them by their ability to distinguish true held-out tuples from false ones and finds strong performance from a bilinear model using a simple additive architecture to model phrases.
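A bilinear model over additively composed phrase vectors can be sketched as follows (names and data are hypothetical; in practice both the word embeddings and the per-relation matrix would be learned from training tuples):

```python
import numpy as np

def phrase_vec(words, emb):
    """Additive phrase model: average the word embeddings."""
    return np.mean([emb[w] for w in words], axis=0)

def bilinear_score(u1, M_r, u2):
    """Bilinear tuple score: u1^T M_r u2, with one matrix M_r per relation."""
    return float(u1 @ M_r @ u2)

# Hypothetical toy vocabulary and relation matrix, randomly initialized here.
rng = np.random.default_rng(0)
emb = {w: rng.normal(size=4) for w in ["drink", "coffee", "stay", "awake"]}
M_causes = rng.normal(size=(4, 4))  # matrix for a hypothetical 'Causes' relation
s = bilinear_score(phrase_vec(["drink", "coffee"], emb), M_causes,
                   phrase_vec(["stay", "awake"], emb))
```

Because phrases are composed from word vectors, the scorer can rate tuples over arbitrary free-form phrases, not just a fixed entity vocabulary.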
Commonsense Knowledge Mining from Pretrained Models
- Computer Science · EMNLP
- 2019
This work develops a method for generating commonsense knowledge using a large, pre-trained bidirectional language model that can be used to rank a triple’s validity by the estimated pointwise mutual information between the two entities.
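The pointwise-mutual-information criterion used for ranking can be sketched in isolation (this is just the standard PMI definition; how the probabilities are estimated from the pretrained language model is omitted):

```python
import math

def pmi(p_joint, p_x, p_y):
    """Pointwise mutual information: log( p(x, y) / (p(x) * p(y)) ).
    Positive when x and y co-occur more often than chance."""
    return math.log(p_joint / (p_x * p_y))

print(pmi(0.25, 0.5, 0.5))      # prints 0.0 -- independent events
print(pmi(0.5, 0.5, 0.5) > 0)   # prints True -- positively associated
```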