Corpus ID: 215415893

Efficient long-distance relation extraction with DG-SpanBERT

@article{Chen2020EfficientLR,
  title={Efficient long-distance relation extraction with DG-SpanBERT},
  author={Jun Chen and R. Hoehndorf and Mohamed Elhoseiny and Xiangliang Zhang},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.03636}
}
In natural language processing, relation extraction seeks to rationally understand unstructured text. Here, we propose a novel SpanBERT-based graph convolutional network (DG-SpanBERT) that extracts semantic features from a raw sentence using the pre-trained language model SpanBERT and pools latent features with a graph convolutional network. Our DG-SpanBERT model inherits SpanBERT's advantage of learning rich lexical features from a large-scale corpus. It also has the ability to capture long…
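The core operation the abstract describes, pooling contextual token features with a graph convolution over a dependency graph, can be illustrated with a minimal single-layer GCN. This is a toy sketch with random stand-in features and weights, not the paper's implementation:

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution step: average each token's features with its
    dependency neighbors', then apply a linear map and ReLU.
    H: (n, d) token features, A: (n, n) adjacency, W: (d, d_out) weights."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # row degrees for normalization
    H_prop = (A_hat @ H) / deg              # mean-pool over neighbors
    return np.maximum(H_prop @ W, 0.0)      # linear map + ReLU

# Toy example: 4 tokens with 8-dim "SpanBERT-like" features and a
# chain-shaped dependency graph 0-1-2-3 (all names/shapes hypothetical).
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
W = rng.normal(size=(8, 8))
H_out = gcn_layer(H, A, W)
print(H_out.shape)  # (4, 8)
```

Stacking such layers lets information flow between tokens that are distant in the sentence but close in the dependency graph, which is the intuition behind using a GCN for long-distance relation extraction.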

Citations

GDPNet: Refining Latent Multi-View Graph for Relation Extraction
This paper proposes constructing a latent multi-view graph to capture various possible relationships among tokens and refining it to select important words for relation prediction; GDPNet achieves the best performance on dialogue-level RE and performance comparable to the state of the art on sentence-level RE.
Structured Prediction as Translation between Augmented Natural Languages
We propose a new framework, Translation between Augmented Natural Languages (TANL), to solve many structured prediction language tasks, including joint entity and relation extraction and nested named…
Re-TACRED: Addressing Shortcomings of the TACRED Dataset
Re-TACRED, a completely re-annotated version of the TACRED dataset that enables reliable evaluation of relation extraction models, is released.
Zero-Shot Information Extraction as a Unified Text-to-Triple Translation
A suite of information extraction tasks is cast into a text-to-triple translation framework that transfers non-trivially to most tasks and is often competitive with fully supervised methods without any task-specific training.

References

Showing 1–10 of 22 references
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction
An extension of graph convolutional networks tailored for relation extraction is proposed, which pools information over arbitrary dependency structures efficiently in parallel; a novel pruning strategy keeps only the words immediately around the shortest path between the two entities among which a relation might hold.
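The pruning idea above, keeping only tokens within K hops of the shortest dependency path between the two entities, can be sketched with a small breadth-first search. The toy dependency graph and token indices here are hypothetical, not taken from the paper:

```python
from collections import deque

def shortest_path(adj, src, dst):
    """BFS shortest path between two tokens in an (undirected) dependency graph."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

def prune(adj, path, k):
    """Keep tokens within k hops of the path (k=0 keeps the path alone)."""
    keep, frontier = set(path), set(path)
    for _ in range(k):
        frontier = {v for u in frontier for v in adj[u]} - keep
        keep |= frontier
    return keep

# Hypothetical 6-token sentence; entity tokens are 0 and 5.
adj = {0: [2], 1: [2], 2: [0, 1, 3], 3: [2, 5], 4: [5], 5: [3, 4]}
path = shortest_path(adj, 0, 5)
print(path)                         # [0, 2, 3, 5]
print(sorted(prune(adj, path, 1)))  # [0, 1, 2, 3, 4, 5]
```

With k=0 only the path itself survives; k=1 recovers modifiers hanging off the path, which is the trade-off the path-centric pruning strategy tunes.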
Cross-Sentence N-ary Relation Extraction with Graph LSTMs
A general relation extraction framework based on graph long short-term memory networks (graph LSTMs) that can be easily extended to cross-sentence n-ary relation extraction is explored, demonstrating its effectiveness with both conventional supervised learning and distant supervision.
Enriching Pre-trained Language Model with Entity Information for Relation Classification
This paper proposes a model that both leverages the pre-trained BERT language model and incorporates information about the target entities to tackle the relation classification task, achieving a significant improvement over the state-of-the-art method on the SemEval-2010 Task 8 relational dataset.
Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification
Experimental results on the SemEval-2010 relation classification task show that the AttBLSTM method outperforms most existing methods, using only word vectors.
Matching the Blanks: Distributional Similarity for Relation Learning
This paper builds on extensions of Harris’ distributional hypothesis to relations, as well as recent advances in learning text representations (specifically, BERT), to build task-agnostic relation representations solely from entity-linked text.
A Dependency-Based Neural Network for Relation Classification
A new structure, termed the augmented dependency path (ADP), is proposed, composed of the shortest dependency path between two entities and the subtrees attached to that path; a dependency-based neural network (DepNN) is developed to exploit the semantic representation behind the ADP.
Relation Classification via Multi-Level Attention CNNs
A novel convolutional neural network architecture is proposed that enables end-to-end learning from task-specific labeled data, forgoing the need for external knowledge such as explicit dependency structures, and outperforms previous state-of-the-art methods.
Position-aware Attention and Supervised Data Improve Slot Filling
An effective new model is proposed, combining an LSTM sequence model with a form of entity position-aware attention that is better suited to relation extraction; the work also builds TACRED, a large supervised relation extraction dataset obtained via crowdsourcing and targeted toward TAC KBP relations.
Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths
This paper presents SDP-LSTM, a novel neural network that classifies the relation between two entities in a sentence by leveraging the shortest dependency path (SDP) between them; multichannel recurrent neural networks with long short-term memory (LSTM) units pick up heterogeneous information along the SDP.
Simple BERT Models for Relation Extraction and Semantic Role Labeling
This work is the first to successfully apply BERT in this manner for relation extraction and semantic role labeling, and its models provide strong baselines for future research.