Published in NAACL-HLT 2019

Studying the Inductive Biases of RNNs with Synthetic Variations of Natural Languages

@inproceedings{Ravfogel2019StudyingTI,
  title={Studying the Inductive Biases of RNNs with Synthetic Variations of Natural Languages},
  author={Shauli Ravfogel and Yoav Goldberg and Tal Linzen},
  booktitle={NAACL-HLT},
  year={2019}
}
How do typological properties such as word order and morphological case marking affect the ability of neural sequence models to acquire the syntax of a language? Cross-linguistic comparisons of RNNs' syntactic performance (e.g., on subject-verb agreement prediction) are complicated by the fact that any two languages differ in multiple typological properties, as well as by differences in training corpus. We propose a paradigm that addresses these issues: we create synthetic versions of English…
