Pretrained Language Models for Sequential Sentence Classification
@article{Cohan2019PretrainedLM,
  title   = {Pretrained Language Models for Sequential Sentence Classification},
  author  = {Arman Cohan and Iz Beltagy and Daniel King and Bhavana Dalvi and Daniel S. Weld},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1909.04054}
}
As a step toward better document-level understanding, we explore classification of a sequence of sentences into their corresponding categories, a task that requires understanding sentences in the context of the document. Recent successful models for this task have used hierarchical models to contextualize sentence representations, and Conditional Random Fields (CRFs) to incorporate dependencies between subsequent labels. In this work, we show that pretrained language models, BERT (Devlin et al., 2019) in particular, can be used for this task to capture contextual dependencies without the need for hierarchical encoding nor a CRF. Specifically, we construct a joint sentence representation that allows BERT Transformer layers to directly utilize contextual information from all words in all sentences. Our approach achieves state-of-the-art results on four datasets, including a new dataset of structured scientific abstracts.
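The joint-encoding idea described in the abstract can be illustrated with a short sketch: all sentences of a document are packed into one BERT input, separated by [SEP] tokens, and each sentence is classified from the hidden state at its [SEP] position. This is a minimal sketch, not the authors' released code; the Hugging Face transformers API, the bert-base-uncased checkpoint, the three-way label set, and the untrained linear head are all illustrative assumptions.

```python
# Minimal sketch of sequential sentence classification with one BERT pass.
# Assumptions (not from the paper's code release): Hugging Face transformers,
# the bert-base-uncased checkpoint, a hypothetical 3-label scheme, and an
# untrained linear head used purely for illustration.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

labels = ["background", "method", "result"]      # hypothetical categories
classifier = nn.Linear(bert.config.hidden_size, len(labels))

sentences = [
    "We study classification of sentences in scientific abstracts.",
    "All sentences are encoded jointly in one transformer pass.",
    "Each sentence is predicted from its separator representation.",
]

# Build one input sequence: [CLS] s1 [SEP] s2 [SEP] s3 [SEP]
# (the tokenizer prepends [CLS] and appends the final [SEP] itself).
text = f" {tokenizer.sep_token} ".join(sentences)
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    hidden = bert(**inputs).last_hidden_state    # (1, seq_len, hidden_dim)

# One [SEP] token per sentence; its hidden state is the sentence vector.
sep_mask = inputs["input_ids"][0] == tokenizer.sep_token_id
sentence_vectors = hidden[0, sep_mask]           # (num_sentences, hidden_dim)

logits = classifier(sentence_vectors)            # (num_sentences, num_labels)
for sentence, pred in zip(sentences, logits.argmax(dim=-1)):
    print(f"{labels[pred]:10s} <- {sentence}")
```

In practice the linear head would be trained jointly with BERT on labeled documents; the point of the sketch is that a single forward pass yields a contextual representation per sentence, which is what lets the approach dispense with a hierarchical encoder and a CRF.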
25 Citations
- Sequential Span Classification with Neural Semi-Markov CRFs for Biomedical Abstracts (EMNLP 2020)
- UPSTAGE: Unsupervised Context Augmentation for Utterance Classification in Patient-Provider Communication (MLHC 2020)
- Sequential Sentence Classification in Research Papers using Cross-Domain Multi-Task Learning (ArXiv 2021)
- Enhancing Automated Essay Scoring Performance via Fine-tuning Pre-trained Language Models with Combination of Regression and Ranking (EMNLP 2020)
- Enhancing Automated Essay Scoring Performance via Cohesion Measurement and Combination of Regression and Ranking (EMNLP 2020)
- An Empirical Study on Explainable Prediction of Text Complexity: Preliminaries for Text Simplification (ArXiv 2020)
- Improving Document-Level Sentiment Classification Using Importance of Sentences (Entropy 2020)
References
- Unified Language Model Pre-training for Natural Language Understanding and Generation (NeurIPS 2019)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (NAACL-HLT 2019)
- A Supervised Approach to Extractive Summarisation of Scientific Papers (CoNLL 2017)