Corpus ID: 236087310

Clinical Relation Extraction Using Transformer-based Models

  • Xi Yang, Zehao Yu, Yi Guo, Jiang Bian, Yonghui Wu
  • Published 2021
  • Computer Science
  • ArXiv
The newly emerged transformer technology has had a tremendous impact on NLP research. In the general English domain, transformer-based models have achieved state-of-the-art performance on various NLP benchmarks. In the clinical domain, researchers have also investigated transformer models for clinical applications. The goal of this study is to systematically explore three widely used transformer-based models (i.e., BERT, RoBERTa, and XLNet) for clinical relation extraction and develop an open…
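The abstract describes applying transformer encoders (BERT, RoBERTa, XLNet) to clinical relation extraction. A common preprocessing step in such pipelines, though not necessarily this paper's exact formulation, is to wrap the two candidate entities in marker tokens before feeding the sentence to the encoder with a relation-classification head. A minimal sketch (the function name and marker tokens are illustrative assumptions):

```python
# Illustrative sketch of entity-marker preprocessing, a common step in
# transformer-based relation extraction pipelines. The function name and
# marker tokens are assumptions for illustration, not this paper's
# exact implementation.

def mark_entities(text, e1, e2):
    """Wrap two entity spans (character offsets) in marker tokens.

    The marked sentence would then be tokenized and passed to a transformer
    encoder (e.g. BERT) with a classification head over relation labels.
    Assumes span e1 ends before span e2 starts.
    """
    (s1, t1), (s2, t2) = e1, e2
    assert t1 <= s2, "e1 must precede e2"
    return (
        text[:s1] + "[E1] " + text[s1:t1] + " [/E1]"
        + text[t1:s2] + "[E2] " + text[s2:t2] + " [/E2]"
        + text[t2:]
    )

sentence = "The patient developed a rash after taking penicillin."
# "rash" occupies characters 24-28; "penicillin" occupies 42-52.
print(mark_entities(sentence, (24, 28), (42, 52)))
# The patient developed a [E1] rash [/E1] after taking [E2] penicillin [/E2].
```

Marker tokens like `[E1]`/`[E2]` are typically registered as special tokens so the tokenizer does not split them.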
1 Citation


Deep learning models in detection of dietary supplement adverse event signals from Twitter
The feasibility of detecting DS AE signals from Twitter with a BioBERT-based deep learning pipeline is demonstrated; the BERT-based models outperformed traditional word embeddings.


References

Clinical concept extraction using transformers
The efficiency of transformer-based models for clinical concept extraction is demonstrated, and an open-source package with pretrained clinical models is developed to facilitate concept extraction and other downstream natural language processing (NLP) tasks in the medical domain.
Multiple features for clinical relation extraction: A machine learning approach
This work proposes a machine learning model with a novel set of knowledge-based and BioSentVec embedding features that obtains state-of-the-art performance in clinical relation extraction, and demonstrates that distance and word features provide significant benefits to the classifier.
2018 n2c2 Shared Task on Adverse Drug Events and Medication Extraction in Electronic Health Records
This challenge shows that clinical concept extraction and relation classification systems achieve high performance for many concept types, but significant improvement is still required for ADEs and Reasons.
Relation Extraction from Clinical Narratives Using Pre-trained Language Models
Two different implementations of the BERT model for clinical RE tasks are developed; the tuned LMs outperformed previous state-of-the-art RE systems in two shared tasks, demonstrating the potential of LM-based methods for the RE task.
Leveraging Contextual Information in Extracting Long Distance Relations from Clinical Notes
Two improved methods for relation extraction are studied: the state-of-the-art contextual representation model BERT (Bidirectional Encoder Representations from Transformers), and selection of negative training samples based on the "near-miss" hypothesis (edge sampling).
Enhancing Clinical Concept Extraction with Contextual Embedding
The potential of contextual embeddings is demonstrated through the state-of-the-art performance these methods achieve on clinical concept extraction, and the impact of the pretraining time of a large language model such as ELMo or BERT is analyzed.
Clinical Relation Extraction with Deep Learning
Relation extraction builds on the task of concept recognition and is implemented as relation classification by adopting a CRF model; feature optimization combined with the deep learning model shows great potential.
Extracting Family History of Patients From Clinical Narratives: Exploring an End-to-End Solution With Deep Learning Models
This study demonstrated the feasibility of using deep learning methods to extract FH information from clinical narratives with machine learning–based systems, without hand-crafted, task-specific rules.
Transfer Learning in Biomedical Natural Language Processing: An Evaluation of BERT and ELMo on Ten Benchmarking Datasets
The Biomedical Language Understanding Evaluation (BLUE) benchmark is introduced to facilitate research on pre-trained language representations in the biomedical domain; the BERT model pre-trained on PubMed abstracts and MIMIC-III clinical notes achieves the best results.
Deep learning in clinical natural language processing: a methodical review
Deep learning has not yet fully penetrated clinical NLP but is growing rapidly, with increasing acceptance of deep learning as a baseline for NLP research and of DL-based NLP in the medical community.