Intermodal event files: integrating features across vision, audition, taction, and action

Sharon Zmigrod, Michiel M. A. Spapé & Bernhard Hommel. Intermodal event files: integrating features across vision, audition, taction, and action. Psychological Research, pp. 674–684.
Understanding how the human brain integrates features of perceived events calls for the examination of binding processes within and across different modalities and domains. Recent studies of feature-repetition effects have demonstrated interactions between shape, color, and location in the visual modality and between pitch, loudness, and location in the auditory modality: repeating one feature is beneficial if other features are also repeated, but detrimental if not. These partial-repetition…
