Rethinking language: How probabilities shape the words we use

@article{Griffiths2011RethinkingLH,
  title={Rethinking language: How probabilities shape the words we use},
  author={T. Griffiths},
  journal={Proceedings of the National Academy of Sciences},
  year={2011},
  volume={108},
  pages={3825--3826}
}
If you think about the classes you would expect to take when studying linguistics in graduate school, probability theory is unlikely to be on the list. However, recent work in linguistics and cognitive science has begun to show that probability theory, combined with the methods of computer science and statistics, is surprisingly effective in explaining aspects of how people produce and interpret sentences (1–3), how language might be learned (4–6), and how words change over time (7, 8). The paper by…
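To make the idea concrete, here is a minimal sketch (not from the paper itself) of the simplest kind of probabilistic language model discussed in this literature: a bigram model that estimates the probability of the next word from counts in a corpus. The toy corpus and the maximum-likelihood estimator are assumptions for illustration only.

```python
# Illustrative sketch: a maximum-likelihood bigram language model.
# The toy corpus below is an assumption made for this example.
from collections import defaultdict

corpus = "the cat sat on the mat the cat ran".split()

# Count how often each bigram (w1, w2) and each context word w1 occur.
bigram_counts = defaultdict(int)
context_counts = defaultdict(int)
for w1, w2 in zip(corpus, corpus[1:]):
    bigram_counts[(w1, w2)] += 1
    context_counts[w1] += 1

def bigram_prob(w1, w2):
    """Maximum-likelihood estimate of P(w2 | w1)."""
    if context_counts[w1] == 0:
        return 0.0
    return bigram_counts[(w1, w2)] / context_counts[w1]

print(bigram_prob("the", "cat"))  # "the" is followed by "cat" in 2 of its 3 occurrences
```

Real models in this line of work use smoothing and far richer structure (syntax, Bayesian learning), but the same principle of assigning probabilities to linguistic events underlies them.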

References

Showing 1–10 of 32 references:
  • Formal grammar and information theory: together again? — Fernando C. Pereira, Philosophical Transactions of the Royal Society of London, Series A: Mathematical, Physical and Engineering Sciences, 2000
  • Word learning as Bayesian inference
  • Word lengths are optimized for efficient communication
  • Improved Reconstruction of Protolanguage Word Forms