Language Models Learn POS First

@inproceedings{Saphra2018LanguageML,
  title={Language Models Learn POS First},
  author={Naomi Saphra and Adam Lopez},
  booktitle={BlackboxNLP@EMNLP},
  year={2018}
}
A glut of recent research shows that language models capture linguistic structure. Linzen et al. (2016) found that LSTM-based language models may encode syntactic information sufficient to favor verbs which match the number of their subject nouns. Liu et al. (2018) suggested that the high performance of LSTMs may depend on the linguistic structure of the input data, as performance on several artificial tasks was higher with natural language data than with artificial sequential data. Such work…