Modeling Relations and Their Mentions without Labeled Text

@inproceedings{Riedel2010ModelingRA,
  title={Modeling Relations and Their Mentions without Labeled Text},
  author={Sebastian Riedel and Limin Yao and Andrew McCallum},
  booktitle={ECML/PKDD},
  year={2010}
}
Abstract

Several recent works on relation extraction have been applying the distant supervision paradigm: instead of relying on annotated text to learn how to predict relations, they employ existing knowledge bases (KBs) as source of supervision. Crucially, these approaches are trained based on the assumption that each sentence which mentions the two related entities is an expression of the given relation. Here we argue that this leads to noisy patterns that hurt precision, in particular if the…
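
The distant supervision assumption criticized here can be made concrete with a short sketch. This is a minimal illustration with hypothetical KB triples, sentences, and variable names, not the authors' code or data: every sentence that mentions both entities of a KB fact is labeled as a positive example of that fact's relation, which is exactly how noisy patterns enter the training data.

```python
# Minimal sketch of the distant supervision assumption discussed above.
# The KB triples, sentences, and names below are hypothetical examples,
# not the authors' data or implementation.
from collections import defaultdict

# Toy knowledge base: (subject, relation, object) triples.
kb = {
    ("Barack Obama", "born_in", "Honolulu"),
    ("Steve Jobs", "founder_of", "Apple"),
}

# Toy corpus: each sentence paired with the entities it mentions.
corpus = [
    ("Barack Obama was born in Honolulu in 1961.", {"Barack Obama", "Honolulu"}),
    ("Barack Obama visited Honolulu last week.", {"Barack Obama", "Honolulu"}),
    ("Steve Jobs introduced the iPhone at an Apple event.", {"Steve Jobs", "Apple"}),
]

# Distant supervision assumption: EVERY sentence mentioning both entities of a
# KB triple is taken to express that relation.  The second Obama sentence shows
# how this yields a noisy positive example for born_in.
training_data = defaultdict(list)
for subj, rel, obj in kb:
    for sentence, mentions in corpus:
        if subj in mentions and obj in mentions:
            training_data[rel].append((subj, obj, sentence))

for rel, examples in training_data.items():
    for subj, obj, sentence in examples:
        print(f"{rel}({subj}, {obj}) <- {sentence}")
```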

Key Quantitative Results

  • Compared to the 87% precision of a model based on the distant supervision assumption, the proposed model achieves a 31% error reduction (see the back-of-envelope check below).
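
To make the headline number concrete, here is a small back-of-envelope check. It interprets "error reduction" as a relative reduction in 1 − precision, which is an assumption about the metric rather than a statement from the paper; the variable names are illustrative only.

```python
# Back-of-envelope check (assumption: "31% error reduction" means a relative
# drop in 1 - precision compared to the distant-supervision baseline).
baseline_precision = 0.87
baseline_error = 1.0 - baseline_precision        # 0.13
reduced_error = baseline_error * (1.0 - 0.31)    # ~0.09
implied_precision = 1.0 - reduced_error          # ~0.91

print(f"Implied precision of the proposed model: {implied_precision:.2f}")
```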

Citations

Publications citing this paper (480 in total; selected examples below):

  • Relation Classification Slot Filling
  • A Survey of Distant Supervision Methods using PGMs (arXiv, 2017)
  • Frame-Based Semantic Patterns for Relation Extraction
  • Distant Supervision for Relation Extraction with Ranking-Based Methods

Citation Statistics

  • 115 highly influenced citations
  • Averaged 82 citations per year from 2017 through 2019
