Supervised Neural Models Revitalize the Open Relation Extraction

@article{Jia2018SupervisedNM,
  title={Supervised Neural Models Revitalize the Open Relation Extraction},
  author={Shengbin Jia and Yang Xiang and Xiaojun Chen},
  journal={ArXiv},
  year={2018},
  volume={abs/1809.09408}
}
Open relation extraction (ORE) remains a challenging task: obtaining a semantic representation by discovering arbitrary relation tuples from unstructured text. Perhaps due to limited data, previous extractors have used unsupervised or semi-supervised methods based on pattern matching, which depend heavily on manual work or on syntactic parsers and are inefficient or prone to error cascading. Their development has encountered bottlenecks. Although a few people have tried to use neural-network-based models to…

Citations
Language-consistent Open Relation Extraction from Multilingual Text Corpora
An extensive evaluation on 5 languages shows that LOREM outperforms state-of-the-art monolingual and cross-lingual open relation extractors and generalizes to languages other than those it is trained on.
Semi-supervised Open Domain Information Extraction with Conditional VAE
Open information extraction (OpenIE) is the task of extracting open-domain assertions from natural language sentences. However, the lack of annotated data hurts the performance of current models and…
MCVAE: Margin-based Conditional Variational Autoencoder for Relation Classification and Pattern Generation
A generative model, called conditional variational autoencoder (CVAE), is proposed, which leads to a margin-based CVAE (MCVAE) that can significantly enhance the classification ability and automatically generate semantically meaningful patterns that describe the given relations.
LSOIE: A Large-Scale Dataset for Supervised Open Information Extraction
This work introduces a new dataset by converting the QA-SRL 2.0 dataset to a large-scale OIE dataset, LSOIE, which is 20 times larger than the next largest human-annotated OIE dataset.
Improving Open Information Extraction via Iterative Rank-Aware Learning
This work finds that the extraction likelihood, a confidence measure used by current supervised open IE systems, is not well calibrated when comparing the quality of assertions extracted from different sentences, and proposes an additional binary classification loss to calibrate the likelihood to make it more globally comparable.
LOREM: Language-consistent Open Relation Extraction from Unstructured Text
LOREM does not rely on language-specific knowledge or external NLP tools such as translators or PoS-taggers, and exploits information and structures that are consistent over different languages, which not only allows the model to be easily extended to new languages with limited training effort, but also provides a boost in performance for a given single language.
Open Relation Extraction: Relational Knowledge Transfer from Supervised Data to Unsupervised Data
This work proposes Relational Siamese Networks (RSNs) to learn similarity metrics of relations from labeled data of pre-defined relations, and then transfer the relational knowledge to identify novel relations in unlabeled data.
CaRB: A Crowdsourced Benchmark for Open IE
CaRB is the first crowdsourced Open IE dataset; it also makes substantive changes to the matching code and metrics, and finds that for one pair of Open IE systems the CaRB framework yields results contradictory to OIE2016.

References

Showing 1-10 of 69 references
Leveraging Linguistic Structure For Open Domain Information Extraction
This work replaces a large pattern set with a few patterns for canonically structured sentences, and shifts the focus to a classifier that learns to extract self-contained clauses from longer sentences in order to determine the maximally specific arguments for each candidate triple.
Distant supervision for relation extraction without labeled data
This work investigates an alternative paradigm that does not require labeled corpora, avoiding the domain dependence of ACE-style algorithms, and allowing the use of corpora of any size.
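The core idea of distant supervision is simple enough to sketch in a few lines: any sentence mentioning both entities of a known knowledge-base triple is heuristically labeled as a (noisy) training example for that relation. The toy KB, sentences, and relation names below are illustrative, not data from the paper:

```python
# Distant supervision: label any sentence that mentions both entities of a
# known relation triple as a (noisy) training example for that relation.

KB = {("Obama", "Hawaii"): "born_in",
      ("Google", "Mountain View"): "headquartered_in"}

sentences = [
    "Obama was born in Hawaii .",
    "Obama visited Hawaii last year .",   # matched, but the relation is wrong: noise
    "Google opened an office in Paris .",
]

def distant_label(sentences, kb):
    examples = []
    for sent in sentences:
        for (e1, e2), rel in kb.items():
            if e1 in sent and e2 in sent:
                examples.append((sent, e1, e2, rel))
    return examples

for ex in distant_label(sentences, KB):
    print(ex)
```

The second sentence shows why the resulting labels are noisy: both entities co-occur, but the sentence does not express the KB relation, which is exactly the label noise that distant-supervision methods must tolerate.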
Supervised Open Information Extraction
A novel formulation of Open IE as a sequence tagging problem, addressing challenges such as encoding multiple extractions for a predicate, and a supervised model that outperforms the existing state-of-the-art Open IE systems on benchmark datasets.
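As a rough illustration of the sequence-tagging formulation: each extraction is encoded as one tag per token, with BIO prefixes marking span boundaries. The label scheme (A0/P/A1) and helper below are a common convention chosen for illustration, not necessarily the paper's exact scheme:

```python
# Encode one Open IE extraction as a BIO tag sequence over the tokens.
# B- marks the first token of a span, I- a continuation, O tokens outside.

def bio_encode(tokens, spans):
    """spans: {role: (start, end)} with end exclusive, token indices."""
    tags = ["O"] * len(tokens)
    for role, (start, end) in spans.items():
        tags[start] = f"B-{role}"
        for i in range(start + 1, end):
            tags[i] = f"I-{role}"
    return tags

tokens = ["Obama", "was", "born", "in", "Hawaii"]
spans = {"A0": (0, 1), "P": (1, 3), "A1": (4, 5)}
print(bio_encode(tokens, spans))
# ['B-A0', 'B-P', 'I-P', 'O', 'B-A1']
```

Once extractions are in this form, any off-the-shelf sequence labeler can be trained on them, which is what makes the reformulation attractive.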
Relation Classification via Convolutional Deep Neural Network
This paper exploits a convolutional deep neural network (DNN) to extract lexical and sentence level features from the output of pre-existing natural language processing systems and significantly outperforms the state-of-the-art methods.
Chinese Open Relation Extraction and Knowledge Base Establishment
This article proposes a novel unsupervised linguistics-based Chinese ORE model based on Dependency Semantic Normal Forms (DSNFs), which can automatically discover arbitrary relations without any manually labeled datasets, and establishes a large-scale corpus of entities and relations.
Open Information Extraction Using Wikipedia
WOE is presented, an open IE system which improves dramatically on TextRunner's precision and recall and is a novel form of self-supervised learning for open extractors -- using heuristic matches between Wikipedia infobox attribute values and corresponding sentences to construct training data.
End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF
A novel neural network architecture is introduced that benefits from both word- and character-level representations automatically, by using a combination of bidirectional LSTM, CNN and CRF, making it applicable to a wide range of sequence labeling tasks.
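The CRF layer in this architecture is what turns per-token scores into a globally best tag sequence: it decodes with the Viterbi algorithm over emission scores (from the BiLSTM-CNN encoder) plus tag-to-tag transition scores. A minimal sketch of that decoding step, with toy hand-picked scores rather than trained weights:

```python
# Toy Viterbi decoding for a linear-chain CRF: find the tag sequence that
# maximizes the sum of per-token emission scores and transition scores.

def viterbi(emissions, transitions, tags):
    """emissions: list of {tag: score} per token; transitions: {(prev, cur): score}."""
    best = dict(emissions[0])  # best[t]: best path score ending in tag t
    backptr = []
    for emit in emissions[1:]:
        new_best, ptr = {}, {}
        for cur in tags:
            prev, score = max(
                ((p, best[p] + transitions[(p, cur)]) for p in tags),
                key=lambda x: x[1],
            )
            new_best[cur] = score + emit[cur]
            ptr[cur] = prev
        best, backptr = new_best, backptr + [ptr]
    # trace back from the best-scoring final tag
    last = max(best, key=best.get)
    path = [last]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path))

tags = ["B", "I", "O"]
trans = {(p, c): 0.0 for p in tags for c in tags}
trans[("O", "I")] = -10.0   # an I tag must not follow O
emissions = [{"B": 2.0, "I": 0.1, "O": 0.5},
             {"B": 0.1, "I": 1.0, "O": 0.9},
             {"B": 0.2, "I": 0.3, "O": 1.5}]
print(viterbi(emissions, trans, tags))  # ['B', 'I', 'O']
```

The learned transition matrix is what lets the CRF rule out invalid tag sequences (like I directly after O) that independent per-token classification would happily produce.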
MinIE: Minimizing Facts in Open Information Extraction
An experimental study with several real-world datasets found that MinIE achieves competitive or higher precision and recall than most prior systems, while at the same time producing shorter, semantically enriched extractions.
Named Entity Recognition with Bidirectional LSTM-CNNs
A novel neural network architecture is presented that automatically detects word- and character-level features using a hybrid bidirectional LSTM and CNN architecture, eliminating the need for most feature engineering.
Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths
This paper presents SDP-LSTM, a novel neural network to classify the relation between two entities in a sentence; it leverages the shortest dependency path (SDP) between the two entities, and multichannel recurrent neural networks with long short-term memory (LSTM) units pick up heterogeneous information along the SDP.
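The shortest dependency path itself is straightforward to compute once a sentence is parsed: treat the dependency tree as an undirected graph and run a breadth-first search between the two entity tokens. The toy parse below is hand-written for illustration, not the output of a real parser:

```python
from collections import deque

# Shortest dependency path between two tokens, treating the
# dependency tree as an undirected graph and searching by BFS.

def shortest_dep_path(heads, start, end):
    """heads[i] = index of token i's head, or -1 for the root."""
    graph = {i: set() for i in range(len(heads))}
    for i, h in enumerate(heads):
        if h >= 0:
            graph[i].add(h)
            graph[h].add(i)
    queue, parent = deque([start]), {start: None}
    while queue:
        node = queue.popleft()
        if node == end:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return list(reversed(path))
        for nbr in graph[node]:
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    return None

# "A hedge is a fence": head indices hand-assigned for illustration
tokens = ["A", "hedge", "is", "a", "fence"]
heads  = [1, 2, -1, 4, 2]          # "is" is the root
path = shortest_dep_path(heads, 1, 4)
print([tokens[i] for i in path])   # ['hedge', 'is', 'fence']
```

The appeal of the SDP as a feature is visible even in this toy: it strips the determiners and keeps only the tokens that link the two entities, which is the compact path the LSTM channels then encode.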