Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture

@article{Thillaisundaram2019BiomedicalRE,
  title={Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture},
  author={Ashok Thillaisundaram and Theodosia Togia},
  journal={ArXiv},
  year={2019},
  volume={abs/1909.12411}
}
This paper presents our participation in the AGAC Track from the 2019 BioNLP Open Shared Tasks. We provide a solution for Task 3, which aims to extract "gene - function change - disease" triples, where "gene" and "disease" are mentions of particular genes and diseases respectively and "function change" is one of four pre-defined relationship types. Our system extends BERT (Devlin et al., 2018), a state-of-the-art language model, which learns contextual language representations from a large…
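The abstract describes a BERT encoder with only a minimal task-specific layer on top for classifying the relation between a gene mention and a disease mention. The sketch below illustrates one common way such a setup is wired, using Hugging Face Transformers: the [CLS] representation is fed to a single linear layer over the four relation types. The model name, the entity-marker scheme, and the label names are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch (not the authors' exact code): a BERT encoder plus a single
# linear classification head over four "function change" relation types.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

# Hypothetical label names for the four pre-defined relation types.
RELATION_LABELS = ["LOF", "GOF", "REG", "COM"]

class BertRelationClassifier(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-cased", num_labels: int = 4):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # "Minimal task-specific architecture": one linear layer on top of BERT.
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_repr = outputs.last_hidden_state[:, 0]  # [CLS] token representation
        return self.classifier(cls_repr)            # logits over relation types

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = BertRelationClassifier()

# Gene and disease mentions are wrapped in simple marker strings so the encoder
# can locate them; this marker scheme is an assumption for illustration only.
sentence = ("Mutations in [GENE] BRCA1 [/GENE] increase the risk of "
            "[DISEASE] breast cancer [/DISEASE].")
batch = tokenizer(sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(batch["input_ids"], batch["attention_mask"])
predicted = RELATION_LABELS[logits.argmax(dim=-1).item()]
print(predicted)
```

In practice the linear head would be trained jointly with (or on top of) the pre-trained encoder on the task's labelled triples; the untrained head here simply shows the data flow.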
2 Citations
  • Deep Neural Approaches to Relation Triplets Extraction: A Comprehensive Survey
  • FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction
