GDPNet: Refining Latent Multi-View Graph for Relation Extraction

@inproceedings{Xue2021GDPNetRL,
  title={GDPNet: Refining Latent Multi-View Graph for Relation Extraction},
  author={Fuzhao Xue and Aixin Sun and Hao Zhang and Eng Siong Chng},
  booktitle={AAAI},
  year={2021}
}
Relation Extraction (RE) predicts the relation type between two entities mentioned in a piece of text, e.g., a sentence or a dialogue. When the given text is long, it is challenging to identify indicative words for the relation prediction. Recent advances on the RE task come from BERT-based sequence modeling and graph-based modeling of relationships among the tokens in the sequence. In this paper, we propose to construct a latent multi-view graph to capture various possible relationships… 
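The core idea in the abstract, inducing several soft token-to-token graphs ("views") from token representations rather than relying on a single parser-given structure, can be sketched as below. This is a hypothetical, minimal illustration using per-view attention projections; the function name, projections, and shapes are assumptions, and GDPNet's actual construction and its subsequent refinement step differ.

```python
import numpy as np

def latent_multi_view_graph(token_states, num_views=3, seed=0):
    """Induce several soft adjacency matrices ("views") over a token
    sequence via scaled dot-product attention.

    token_states: (n_tokens, dim) array of token representations.
    Returns: (num_views, n_tokens, n_tokens) row-stochastic adjacencies.
    """
    rng = np.random.default_rng(seed)
    n, d = token_states.shape
    views = []
    for _ in range(num_views):
        # Each view gets its own query/key projections (randomly
        # initialized here; learned in a real model).
        Wq = rng.standard_normal((d, d)) / np.sqrt(d)
        Wk = rng.standard_normal((d, d)) / np.sqrt(d)
        scores = (token_states @ Wq) @ (token_states @ Wk).T / np.sqrt(d)
        # Row-wise softmax -> a soft, fully connected latent graph.
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        views.append(e / e.sum(axis=-1, keepdims=True))
    return np.stack(views)
```

Each view is a row-stochastic matrix, so a downstream graph module can treat row i as a distribution over which tokens token i attends to; refining the graph then amounts to sharpening or pruning these soft edges.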


Modeling Multi-Granularity Hierarchical Features for Relation Extraction
TLDR
Experimental results show that the proposed novel method to extract multi-granularity features based solely on the original input sentences outperforms existing state-of-the-art models that even use external knowledge.
Position Enhanced Mention Graph Attention Network for Dialogue Relation Extraction
TLDR
Dialogue Relation Extraction (DRE) is a new kind of relation extraction task from multi-turn dialogues where speaker specific relations are implicitly mixed together in both a local utterance window and a speaker context.
D-REX: Dialogue Relation Extraction with Explanations
TLDR
This work proposes a model-agnostic framework, D-REX, a policy-guided semi-supervised algorithm that optimizes for explanation quality and relation extraction simultaneously, and frames relation extraction as a re-ranking task and includes relation- and entity-specific explanations as an intermediate step of the inference process.
Graph Based Network with Contextualized Representations of Turns in Dialogue
TLDR
This paper proposes the TUrn COntext awaRE Graph Convolutional Network (TUCORE-GCN) modeled by paying attention to the way people understand dialogues, and proposes a novel approach which treats the task of emotion recognition in conversations (ERC) as a dialogue-based RE.
GRAPHCACHE: Message Passing as Caching for Sentence-Level Relation Extraction
TLDR
Inspired by the classical caching technique in computer systems, GRAPHCACHE is developed to update the property representations in an online manner; it yields effectiveness gains on RE and enables efficient message passing across all sentences in the dataset.
Consistent Inference for Dialogue Relation Extraction
TLDR
This paper designs mask mechanisms to refine utterance-aware and speaker-aware representations respectively from the global dialogue representation for the utterance distinction, and proposes a gate mechanism to aggregate such bi-grained representations.
None Class Ranking Loss for Document-Level Relation Extraction
TLDR
A new multi-label loss is proposed that encourages large margins of label confidence scores between each pre-defined class and the none class, which enables capturing label correlations and context-dependent thresholding for label prediction, and provides robustness against positive-negative imbalance and mislabeled data that can appear in real-world RE datasets.
TREND: Trigger-Enhanced Relation-Extraction Network for Dialogues
TLDR
A multi-tasking BERT-based model is proposed that learns to identify triggers for improving relation extraction and achieves state-of-the-art results on the benchmark datasets.
Biographical: A Semi-Supervised Relation Extraction Dataset
TLDR
Biographical, the first semi-supervised dataset for RE, is developed and demonstrated the effectiveness of the dataset by training a state-of-the-art neural model to classify relation pairs, and evaluating it on a manually annotated gold standard set.
KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction
TLDR
A Knowledge-aware Prompt-tuning approach with synergistic optimization (KnowPrompt) is proposed that injects latent knowledge contained in relation labels into prompt construction with learnable virtual type words and answer words.

References

Showing 1-10 of 41 references
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
TLDR
An extension of graph convolutional networks tailored for relation extraction is proposed, which pools information over arbitrary dependency structures efficiently in parallel; a novel pruning strategy is applied to the input trees, keeping only words immediately around the shortest path between the two entities among which a relation might hold.
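The pruning strategy this TLDR describes, keeping only tokens within a small distance k of the shortest dependency path between the two entities, can be sketched as follows. This is a standalone illustration operating on token indices and an edge list (a real pipeline would obtain the edges from a dependency parser); the function name and interface are assumptions, not the paper's code.

```python
from collections import deque

def prune_tree(edges, n, e1, e2, k=1):
    """Return the set of token indices within distance k of the
    shortest dependency path between entity tokens e1 and e2.

    edges: list of (head, dependent) index pairs over n tokens.
    """
    adj = [[] for _ in range(n)]
    for h, d in edges:
        adj[h].append(d)
        adj[d].append(h)

    def bfs(src):
        # Breadth-first search: distances and predecessors from src.
        dist, prev = {src: 0}, {src: None}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v], prev[v] = dist[u] + 1, u
                    q.append(v)
        return dist, prev

    _, prev = bfs(e1)
    # Recover the shortest path e2 -> ... -> e1.
    path, node = [], e2
    while node is not None:
        path.append(node)
        node = prev[node]
    # Keep every token within k hops of some token on the path.
    keep = set()
    for p in path:
        dp, _ = bfs(p)
        keep |= {v for v, dv in dp.items() if dv <= k}
    return keep
```

With k=0 this keeps exactly the shortest-path tokens; larger k admits nearby modifiers (e.g., negation words) that pure path pruning would discard, which is the trade-off the paper's strategy targets.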
Enriching Pre-trained Language Model with Entity Information for Relation Classification
TLDR
This paper proposes a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task and achieves significant improvement over the state-of-the-art method on the SemEval-2010 task 8 relational dataset.
Attention Guided Graph Convolutional Networks for Relation Extraction
TLDR
Attention Guided Graph Convolutional Networks is proposed, a novel model which directly takes full dependency trees as inputs and can be understood as a soft-pruning approach that automatically learns how to selectively attend to the relevant sub-structures useful for the relation extraction task.
Relation Extraction with Convolutional Network over Learnable Syntax-Transport Graph
TLDR
This work learns to transform the dependency tree into a weighted graph by considering the syntax dependencies of the connected nodes and persisting the structure of the original dependency tree, and refers to this graph as a syntax-transport graph.
A Walk-based Model on Entity Graphs for Relation Extraction
TLDR
A novel graph-based neural network model for relation extraction that treats multiple pairs in a sentence simultaneously and considers interactions among them and achieves performance comparable to the state-of-the-art systems on the ACE 2005 dataset without using any external tools.
Encoding Sentences with Graph Convolutional Networks for Semantic Role Labeling
TLDR
A version of graph convolutional networks (GCNs), a recent class of neural networks operating on graphs, suited to model syntactic dependency graphs, is proposed, observing that GCN layers are complementary to LSTM ones.
Learning Latent Forests for Medical Relation Extraction
TLDR
This work proposes a novel model which treats the dependency structure as a latent variable and induces it from the unstructured text in an end-to-end fashion and is able to significantly outperform state-of-the-art systems without relying on any direct tree supervision or pre-training.
Connecting the Dots: Document-level Neural Relation Extraction with Edge-oriented Graphs
TLDR
This work proposes an edge-oriented graph neural model that utilises different types of nodes and edges to create a document-level graph and enables to learn intra- and inter-sentence relations using multi-instance learning internally.
Position-aware Attention and Supervised Data Improve Slot Filling
TLDR
An effective new model is proposed that combines an LSTM sequence model with a form of entity position-aware attention better suited to relation extraction, and TACRED, a large supervised relation extraction dataset obtained via crowdsourcing and targeted towards TAC KBP relations, is built.
Learning to Represent Knowledge Graphs with Gaussian Embedding
TLDR
The experimental results demonstrate that the KG2E method can effectively model the (un)certainties of entities and relations in a KG, and it significantly outperforms state-of-the-art methods (including TransH and TransR).