Neural Medication Extraction: A Comparison of Recent Models in Supervised and Semi-supervised Learning Settings

@article{Kocabiyikoglu2021NeuralME,
  title={Neural Medication Extraction: A Comparison of Recent Models in Supervised and Semi-supervised Learning Settings},
  author={Ali Can Kocabiyikoglu and Jean-Marc Babouchkine and François Portet and Raheel Qader},
  journal={2021 IEEE 9th International Conference on Healthcare Informatics (ICHI)},
  year={2021},
  pages={148-152}
}
Drug prescriptions are essential information that must be encoded in electronic medical records. However, much of this information is hidden within free-text reports, which is why the medication extraction task has emerged. To date, most of the research effort has focused on small amounts of data and has only recently considered deep learning methods. In this paper, we present an independent and comprehensive evaluation of state-of-the-art neural architectures on the I2B2 medical prescription…
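To make the task concrete, below is a minimal, hypothetical sketch (in Python, using the Hugging Face transformers library) of framing medication extraction as BIO token classification. The label set, model name, and example sentence are illustrative assumptions and do not reflect the configurations evaluated in the paper; the model would still need fine-tuning on annotated i2b2 prescriptions before its predictions are meaningful.

# Minimal sketch (not the authors' implementation) of medication extraction
# as BIO token classification with a pretrained transformer encoder.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Hypothetical BIO labels for i2b2-style prescription fields (assumption).
LABELS = ["O",
          "B-DRUG", "I-DRUG",
          "B-DOSAGE", "I-DOSAGE",
          "B-FREQUENCY", "I-FREQUENCY",
          "B-DURATION", "I-DURATION"]
id2label = dict(enumerate(LABELS))
label2id = {label: idx for idx, label in id2label.items()}

# Illustrative backbone; the classification head is randomly initialized,
# so outputs are arbitrary until the model is fine-tuned on labeled data.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(LABELS),
    id2label=id2label, label2id=label2id)

text = "Start aspirin 81 mg once daily for 30 days ."
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**enc).logits            # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0].tolist()

tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
for tok, pid in zip(tokens, pred_ids):
    if tok not in tokenizer.all_special_tokens:
        print(f"{tok}\t{id2label[pid]}")

In a semi-supervised setting, a tagger of this kind could also be used to pseudo-label unannotated reports, which are then added to the supervised training set.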


References

Showing 1-10 of 33 references
2018 N2c2 Shared Task on Adverse Drug Events and Medication Extraction in Electronic Health Records
TLDR
This challenge shows that clinical concept extraction and relation classification systems achieve high performance for many concept types, but significant improvement is still required for ADEs and Reasons.
Adverse drug event detection and extraction from open data: A deep learning approach
TLDR
The results show that a BERT-based model achieves new state-of-the-art results on both the ADE detection and extraction tasks and can be applied to multiple other healthcare and information extraction tasks, including medical entity extraction and entity recognition.
Exploring Semi-supervised Variational Autoencoders for Biomedical Relation Extraction
TLDR
The results suggest that exploiting such unlabeled data can greatly improve performance on various biomedical relation extraction tasks, especially when only limited labeled data is available.
Prescription extraction using CRFs and word embeddings
TLDR
This work presents a machine learning approach to extract and organize medication names and prescription information into individual entries, achieving a horizontal phrase-level F1-measure of 0.864, which, to the best of the authors' knowledge, represents an improvement over the current state of the art.
Evaluation of Transfer Learning for Adverse Drug Event (ADE) and Medication Entity Extraction
TLDR
A sentence-augmentation method for enhanced ADE identification is developed, benefiting BERT-based and ELMo-based models by up to 3.13% in F1, and a simple ensemble of these models is shown to outpace most current methods in ADE extraction.
FABLE: A Semi-Supervised Prescription Information Extraction System
TLDR
FABLE is a system for automatically extracting prescription information from discharge summaries. It utilizes unannotated data to enhance the annotated training data and performs semi-supervised extraction of medication information using pseudo-labels with Conditional Random Fields (CRFs) to improve its handling of incomplete, sparse, and diverse medication entities.
Assessment of Amazon Comprehend Medical: Medication Information Extraction
TLDR
This work focuses on the medication extraction task; in particular, Amazon Comprehend Medical (ACM), a deep-learning-based system that automatically extracts clinical concepts from clinical text notes, is evaluated using the official test sets from the 2009 i2b2 Medication Extraction Challenge and the 2018 n2c2 Track 2: Adverse Drug Events and Medication Extraction in EHRs.
Semi-supervised method for biomedical event extraction
TLDR
Limited labeled data can be combined with unlabeled data to tackle the data sparseness problem by means of the introduced EFCG approach, and the classification capability of the model is enhanced by establishing a rich feature set from both labeled and unlabeled datasets.
MT-Clinical BERT: Scaling Clinical Information Extraction with Multitask Learning
TLDR
Multitask-Clinical BERT is developed, a single deep learning model that simultaneously performs eight clinical tasks spanning entity extraction, personal health information identification, language entailment, and similarity by sharing representations among tasks; it performs competitively with state-of-the-art task-specific systems.
Enhancing Clinical Concept Extraction with Contextual Embeddings
TLDR
The potential of contextual embeddings is demonstrated through the state-of-the-art performance these methods achieve on clinical concept extraction, and the impact of the pretraining time of a large language model such as ELMo or BERT is analyzed.