How Knowledge Graph and Attention Help? A Qualitative Analysis into Bag-level Relation Extraction

@inproceedings{Hu2021HowKG,
  title={How Knowledge Graph and Attention Help? A Qualitative Analysis into Bag-level Relation Extraction},
  author={Zikun Hu and Yixin Cao and Lifu Huang and Tat-Seng Chua},
  booktitle={ACL/IJCNLP},
  year={2021}
}
Knowledge Graphs (KG) and attention mechanisms have been demonstrated to be effective in introducing and selecting useful information for weakly supervised methods. However, only qualitative analysis and ablation study are provided as evidence. In this paper, we contribute a dataset and propose a paradigm to quantitatively evaluate the effect of attention and KG on bag-level relation extraction (RE). We find that (1) higher attention accuracy may lead to worse performance as it may harm the model's…
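For readers unfamiliar with the setup being analyzed, the sketch below illustrates bag-level RE with selective attention, where a per-relation query scores each sentence in a bag, optionally mixed with a TransE-style prior built from KG entity embeddings. This is a minimal illustration under stated assumptions, not the authors' released code; the encoder, dimensions, and the kg_query construction are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveAttentionBag(nn.Module):
    """Bag-level aggregator in the style of selective attention (Lin et al., 2016)."""

    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # One learnable query vector per candidate relation.
        self.relation_queries = nn.Embedding(num_relations, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def forward(self, sent_reprs, relation_id, kg_query=None):
        # sent_reprs: (bag_size, hidden) -- sentences encoded upstream, e.g. by a PCNN.
        query = self.relation_queries(relation_id)   # (hidden,)
        if kg_query is not None:
            # KG-enhanced variant (an assumption about the general recipe):
            # add an entity-pair prior derived from KG embeddings.
            query = query + kg_query
        scores = sent_reprs @ query                  # (bag_size,) per-sentence scores
        alpha = F.softmax(scores, dim=0)             # attention weights over the bag
        bag_repr = alpha @ sent_reprs                # (hidden,) weighted bag vector
        return self.classifier(bag_repr), alpha

# Usage: a bag of 4 sentences mentioning the same entity pair.
model = SelectiveAttentionBag(hidden_dim=128, num_relations=53)
bag = torch.randn(4, 128)
# TransE-style prior: relation ~ tail - head (hypothetical entity embeddings).
head, tail = torch.randn(128), torch.randn(128)
logits, alpha = model(bag, torch.tensor(7), kg_query=tail - head)

The paper's quantitative paradigm asks, in effect, how accurately weights like alpha select correctly labeled sentences, and how adding a KG-derived prior such as kg_query changes bag-level performance.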


Citations

Are Missing Links Predictable? An Inferential Benchmark for Knowledge Graph Completion
Yixin Cao, Xiang Ji, Xin Lv, Juan-Zi Li, Yonggang Wen, Hanwang Zhang. ACL/IJCNLP, 2021.
TLDR: This work presents InferWiki, a Knowledge Graph Completion (KGC) dataset that improves upon existing benchmarks in inferential ability, assumptions, and patterns, and includes various inference patterns (e.g., reasoning path length and types) for comprehensive evaluation.

References

Showing 1-10 of 22 references
Neural Knowledge Acquisition via Mutual Attention Between Knowledge Graph and Text
TLDR: Proposes a general joint representation learning framework for knowledge acquisition on two tasks, knowledge graph completion and relation extraction from text; models trained under this joint framework significantly outperform other baselines.
Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction
TLDR: Proposes a lightweight neural framework that addresses distantly supervised relation extraction and alleviates defects of the earlier selective attention framework, achieving new state-of-the-art performance on both AUC and top-n precision metrics.
Improving Distantly Supervised Relation Extraction using Word and Entity Based Attention
TLDR: Proposes two novel word attention models for distantly supervised relation extraction: a Bi-directional Gated Recurrent Unit (Bi-GRU) based word attention model and a combination model that merges multiple complementary models by weighted voting for improved relation extraction.
Neural Relation Extraction with Selective Attention over Instances
TLDR: A sentence-level attention-based model for relation extraction that employs convolutional neural networks to embed the semantics of sentences and dynamically reduces the weights of noisy instances.
Long-tail Relation Extraction via Knowledge Graph Embeddings and Graph Convolution Networks
TLDR: Proposes to leverage implicit relational knowledge among class labels from knowledge graph embeddings, learn explicit relational knowledge with graph convolution networks, and integrate this knowledge into the relation extraction model through a coarse-to-fine knowledge-aware attention mechanism.
Improving Distantly-Supervised Relation Extraction with Joint Label Embedding
TLDR: A novel multi-layer attention-based model that improves relation extraction with joint label embedding, exploiting both structural information from knowledge graphs and textual information from entity descriptions to learn label embeddings through gating integration, while suppressing the imposed noise with an attention mechanism.
Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions
TLDR: Proposes a sentence-level attention model that selects valid instances, making full use of supervision information from knowledge bases, and extracts entity descriptions from Freebase and Wikipedia pages to supply background knowledge for the task.
Distant Supervision Relation Extraction with Intra-Bag and Inter-Bag Attentions
TLDR: Presents a neural relation extraction method for handling the noisy training data generated by distant supervision, achieving better relation extraction accuracy than state-of-the-art methods on the evaluation dataset.
Relation Extraction with Explanation
TLDR: Annotates a test set with ground-truth sentence-level explanations to evaluate the quality of explanations afforded by relation extraction models, and demonstrates that replacing entity mentions with their fine-grained entity types improves both extraction accuracy and explanation quality.
Modeling Relations and Their Mentions without Labeled Text
TLDR: A novel approach to distant supervision that alleviates the precision loss caused by noisy patterns by using a factor graph and applying constraint-driven semi-supervision to train the model without any knowledge of which sentences express the relations in the training KB.