The Now-or-Never bottleneck: A fundamental constraint on language

Morten H. Christiansen and Nick Chater, Behavioral and Brain Sciences
Abstract: Memory is fleeting. New material rapidly obliterates previous material. How, then, can the brain deal successfully with the continual deluge of linguistic input? We argue that, to deal with this “Now-or-Never” bottleneck, the brain must compress and recode linguistic input as rapidly as possible. This observation has strong implications for the nature of language processing: (1) the language system must “eagerly” recode and compress linguistic input; (2) as the bottleneck recurs at…

Chunk-Based Memory Constraints on the Cultural Evolution of Language

This work highlights the contribution of a fundamental constraint on processing, the Now-or-Never bottleneck, and suggests that basic chunking mechanisms, which rapidly compress and recode incoming linguistic input into increasingly abstract levels of representation, influence linguistic structure across multiple time scales.

Language as skill: Intertwining comprehension and production

When language comprehension goes wrong for the right reasons: Good-enough, underspecified, or shallow language processing

This paper contains an overview of language processing that can be described as “good enough”, “underspecified”, or “shallow”; connections are made between this relatively recent facet of psycholinguistic study, other recent language processing models, and related concepts in other areas of cognitive science.

Natural chunk-and-pass language processing: Just another joint source-channel coding model?

K. Clark, Communicative & Integrative Biology, 2018
The authors' “Chunk-and-Pass” processing putatively mitigates the severe multilevel Now-or-Never bottleneck via fast linguistic coding and compression, hierarchical language representation and pattern duality, and incrementally learned item-based predictions useful for grammaticalization over wide spacetime scales.

Composition is the Core Driver of the Language-selective Network

This finding demonstrates that composition is robust to word order violations, and that the language regions respond to such input as strongly as they do to naturalistic linguistic input, provided that composition can take place.

Against stored abstractions: A radical exemplar model of language acquisition

The goal of this article is to make the case for a radical exemplar account of child language acquisition, under which unwitnessed forms are produced and comprehended by on-the-fly analogy across…

More Than Words: The Role of Multiword Sequences in Language Learning and Use

This special topic brings together cutting-edge work on multiword sequences in theoretical linguistics, first-language acquisition, psycholinguistics, computational modeling, and second-language learning to present a comprehensive overview of the prominence and importance of such units in language.

Temporality in speech – Linear Unit Grammar

Language is usually modelled through a predominantly synoptic perspective; even if the object of analysis is spoken language, we tend to look at extracts where the analysis of parts makes use of the…

Limits on prediction in language comprehension: A multi-lab failure to replicate evidence for probabilistic pre-activation of phonology

These findings do not support a strong prediction view in which people routinely pre-activate the phonological form of upcoming words, and suggest a more limited role for prediction during language comprehension.

Linguistic complexity: locality of syntactic dependencies

Incrementality and Prediction in Human Sentence Processing

A number of principles with respect to prediction that underpin adult language comprehension are identified and the relationship between prediction, event structure, thematic role assignment, and incrementality is discussed.

The emergence of grammaticality in connectionist networks.

Linguistic theory in the generative tradition is based on a small number of simple but important observations about human languages and how they are acquired. First, the structure of language is…

A Usage-Based Approach to Recursion in Sentence Processing

A connectionist model embodying this alternative theory is outlined, along with simulation results showing that the model is capable of constituent-like generalizations and that it can fit human data regarding the differential processing difficulty associated with center-embeddings in German and cross-dependencies in Dutch.

The neurobiology of syntax: beyond string sets

K. Petersson and P. Hagoort, Philosophical Transactions of the Royal Society B: Biological Sciences, 2012
The brain represents grammars in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing, and the acquisition of this ability is accounted for in an adaptive dynamical systems framework.

Bigrams and the Richness of the Stimulus

This article reports experiments based on those of Reali and Christiansen (2005), who demonstrated that a simple bigram language model can induce the correct form of auxiliary inversion in certain complex questions, and investigates the nature of the indirect evidence that supports this learning and how reliably it is available.

Functional parallelism in spoken word-recognition

Toward a connectionist model of recursion in human linguistic performance