Language models and Automated Essay Scoring
@article{Rodriguez2019LanguageMA,
  title   = {Language models and Automated Essay Scoring},
  author  = {P. Rodriguez and A. Jafari and Christopher M. Ormerod},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1909.09482}
}
In this paper, we present a new comparative study on automatic essay scoring (AES). [...] We elucidate the network architectures of BERT and XLNet using clear notation and diagrams and explain the advantages of transformer architectures over traditional recurrent neural network architectures. Linear algebra notation is used to clarify the functions of transformers and attention mechanisms. We compare the results with more traditional methods, such as bag of words (BOW) and long short-term memory …
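The abstract mentions that linear algebra notation is used to clarify attention mechanisms. As a minimal sketch (not the paper's own notation), the core scaled dot-product attention used by transformers such as BERT and XLNet can be written as softmax(QKᵀ/√d_k)V:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted sum of value rows

# toy example: a sequence of 3 tokens with hidden dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

Unlike a recurrent network, every token attends to every other token in a single matrix operation, which is the parallelism advantage the abstract alludes to.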