Pretrained Language Models for Sequential Sentence Classification

@article{Cohan2019PretrainedLM,
  title={Pretrained Language Models for Sequential Sentence Classification},
  author={Arman Cohan and Iz Beltagy and Daniel King and Bhavana Dalvi and Daniel S. Weld},
  journal={arXiv preprint arXiv:1909.04054},
  year={2019}
}
As a step toward better document-level understanding, we explore classification of a sequence of sentences into their corresponding categories, a task that requires understanding sentences in the context of the document. Recent successful models for this task have used hierarchical models to contextualize sentence representations, and Conditional Random Fields (CRFs) to incorporate dependencies between subsequent labels. In this work, we show that pretrained language models, BERT (Devlin et al., 2018) in particular, can be used for this task to capture contextual dependencies without the need for hierarchical encoding or a CRF. Specifically, we construct a joint sentence representation that allows BERT Transformer layers to directly utilize contextual information from all words in all sentences.
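The joint encoding the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration under stated assumptions, not the authors' released code: it uses the HuggingFace transformers library, packs all sentences of a document into a single BERT input separated by [SEP] tokens, and classifies each sentence from the hidden state of the [SEP] token that follows it. The label set and linear head are hypothetical.

```python
# Minimal sketch of joint sentence encoding for sequential sentence
# classification (assumes the HuggingFace `transformers` library; the
# label set and linear head below are illustrative, not the paper's code).
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

# A toy "document": one gold category per sentence.
sentences = [
    "We study classification of each sentence in a document.",
    "Prior work used hierarchical encoders and a CRF.",
    "We instead encode all sentences jointly with BERT.",
]
LABELS = ["background", "method", "result"]  # hypothetical categories

# Pack the document as: [CLS] sent1 [SEP] sent2 [SEP] sent3 [SEP]
# The tokenizer adds [CLS] and the final [SEP] itself; the inline
# "[SEP]" strings are recognized as special tokens.
text = f" {tokenizer.sep_token} ".join(sentences)
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    hidden = bert(**enc).last_hidden_state  # (1, seq_len, hidden_size)

# Each sentence is represented by the hidden state of the [SEP]
# token that follows it: one [SEP] position per sentence.
sep_positions = (enc.input_ids[0] == tokenizer.sep_token_id).nonzero(as_tuple=True)[0]

# A single linear head scores all sentences in one pass.
classifier = torch.nn.Linear(bert.config.hidden_size, len(LABELS))
logits = classifier(hidden[0, sep_positions])  # (num_sentences, num_labels)
print(logits.argmax(dim=-1))  # one predicted label index per sentence
```

Because every sentence sits in the same input sequence, BERT's self-attention contextualizes each sentence against the whole document, which is what lets this setup drop the hierarchical encoder and CRF; training would simply add a cross-entropy loss over the per-[SEP] logits.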

