Integration of visual and linguistic information in spoken language comprehension.

Michael K. Tanenhaus, Michael J. Spivey-Knowlton, Kathleen M. Eberhard & Julie C. Sedivy. Science, 268(5217).

Psycholinguists have commonly assumed that as a spoken linguistic message unfolds over time, it is initially structured by a syntactic processing module that is encapsulated from information provided by other perceptual and cognitive systems. To test the effects of relevant visual context on the rapid mental processes that accompany spoken language comprehension, eye movements were recorded with a head-mounted eye-tracking system while subjects followed instructions to manipulate real objects…

Eye Movements and Lexical Access in Spoken-Language Comprehension: Evaluating a Linking Hypothesis between Fixations and Linguistic Processing

The results provide evidence about the time course of lexical activation that resolves some important theoretical issues in spoken-word recognition and demonstrate that fixations are sensitive to properties of the normal language-processing system that cannot be attributed to task-specific strategies.

An information-seeking account of eye movements during spoken and signed language comprehension

The authors hypothesize that eye movements during language comprehension represent an adaptive response; the data suggest that people adapt to the value of seeking different information in order to increase the chance of rapid and accurate language understanding.

Eye movements as a window into real-time spoken language comprehension in natural contexts

It is argued that context affected the earliest moments of language processing because it was highly accessible and relevant to the behavioral goals of the listener.

A visual context-aware multimodal system for spoken language processing

This work presents a real-time multimodal system, motivated by these findings, that performs early integration of visual contextual information to recognize the most likely word sequences in spoken language utterances.

Conflicting Constraints in Resource-Adaptive Language Comprehension

Experimental findings conspire to paint a picture in which purely linguistic constraints can in fact be overridden by highly contextual aspects of the situation, such as the intonation contour of a particular utterance, semantic expectations supported by the visual scene, and indeed events going on in the scene itself.

Spontaneous eye movements during passive spoken language comprehension reflect grammatical processing

Language is tightly connected to sensory and motor systems. Recent eye-tracking research, however, typically relies on constrained visual contexts in which participants view a small array of objects on a computer screen.

Flexible Use of Phonological and Visual Memory in Language-mediated Visual Search

This work investigates when and how listeners use object names in visual-search strategies across three visual world experiments, varying the presence and location of an added visual memory demand.

Linguistically Mediated Visual Search

It is found that when a conjunction target was identified by a spoken instruction presented concurrently with the visual display, the incremental processing of spoken language allowed the search process to proceed in a manner considerably less affected by the number of distractors.

Rules of language.

Intensive study of one phenomenon of English grammar, and of how it is processed and acquired, suggests that theories of both language and cognition are partly right.

Linguistic Structure and Speech Shadowing at Very Short Latencies

This paper presents an experimental task in which the subject is required to repeat (shadow) speech as he hears it, and the response latency to each word of a sentence is measured.

Sentence Perception as an Interactive Parallel Process

The restoration of disrupted words to their original form in a sentence-shadowing task is dependent upon semantic and syntactic context variables, thus demonstrating an on-line interaction between…

The lexical nature of syntactic ambiguity resolution

Reinterpreting syntactic ambiguity resolution as a form of lexical ambiguity resolution obviates the need for special parsing principles to account for syntactic interpretation preferences, and provides a more unified account of language comprehension than was previously available.

The Interaction of Referential Ambiguity and Argument Structure in the Parsing of Prepositional Phrases

This research addresses the question of whether the initial interpretation of structurally ambiguous sentences (such as He dropped the book on the chair) is made purely on the basis of…

Modularity of mind

Languages of the Mind