Publications
Large-scale evidence of dependency length minimization in 37 languages
We provide the first large-scale, quantitative, cross-linguistic evidence for a universal syntactic property of languages: that dependency lengths are shorter than chance.
A meta-analysis of syntactic priming in language production
We performed an exhaustive meta-analysis of 73 peer-reviewed journal articles on syntactic priming from the seminal Bock (1986) paper through 2013. Extracting the effect size for each …
What do RNN Language Models Learn about Filler-Gap Dependencies?
We investigate whether state-of-the-art RNN language models represent long-distance filler-gap dependencies and constraints on them.
Quantifying Word Order Freedom in Dependency Corpora
We present novel measures of a key quantitative property of language, word order freedom: the extent to which word order in a sentence is free to vary while conveying the same meaning.
Don’t Underestimate the Benefits of Being Misunderstood
Being a nonnative speaker of a language poses challenges. Individuals often feel embarrassed by the errors they make when talking in their second language. However, here we report an advantage of …
Color naming across languages reflects color use
The number of color terms varies drastically across languages. Yet despite these differences, certain terms (e.g., red) are prevalent, which has been attributed to perceptual salience. …
What Syntactic Structures block Dependencies in RNN Language Models?
We demonstrate that RNN language models are sensitive to hierarchical syntactic structure by investigating the filler-gap dependency and constraints on it, known as syntactic islands.
How Efficiency Shapes Human Language
Cognitive science applies diverse tools and perspectives to study human language. Recently, an exciting body of work has examined linguistic phenomena through the lens of efficiency in usage: what …
Neural Language Models as Psycholinguistic Subjects: Representations of Syntactic State
We investigate the extent to which the behavior of neural network language models reflects incremental representations of syntactic state.
The Natural Stories Corpus
We introduce and release a new corpus consisting of English texts edited to contain many low-frequency and psycholinguistically interesting syntactic constructions while still sounding fluent to native speakers.