Simultaneous and independent acquisition of multisensory and unisensory associations.

Abstract

Although humans are almost constantly exposed to stimuli from multiple sensory modalities during daily life, the processes by which we learn to integrate information from multiple senses to acquire knowledge of multisensory objects are not well understood. Here, we present results of a novel audio-visual statistical learning procedure in which participants were passively exposed to a rapid serial presentation of arbitrary audio-visual pairings (comprising artificial/synthetic audio and visual stimuli). Following this exposure, participants were tested with a two-interval forced-choice procedure in which their degree of familiarity with the experienced audio-visual pairings was evaluated against novel audio-visual combinations drawn from the same stimulus set. Our results show that participants acquire knowledge of visual-visual, audio-audio, and audio-visual stimulus associations, and that the learning of these types of associations occurs in an independent manner.
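The exposure-then-test structure described in the abstract can be made concrete with a small sketch. The Python snippet below is purely illustrative: the stimulus counts, the pairing scheme, the stream length, and the trial construction are assumptions chosen for demonstration, not parameters taken from the study.

# Illustrative sketch of the paradigm in the abstract: passive exposure to
# arbitrary audio-visual pairings, then two-interval forced-choice (2IFC)
# trials pitting an exposed pairing against a novel recombination.
# All specifics (8 stimuli per modality, stream length, pairing scheme)
# are assumptions, not the paper's actual parameters.
import random

random.seed(0)

# Arbitrary labels standing in for the artificial/synthetic stimuli.
visual_stimuli = [f"V{i}" for i in range(8)]
audio_stimuli = [f"A{i}" for i in range(8)]

# Fixed, arbitrary audio-visual pairings presented during exposure.
learned_pairs = list(zip(audio_stimuli, visual_stimuli))

def exposure_stream(n_presentations=400):
    """Rapid serial presentation: shuffled blocks of the learned pairs."""
    stream = []
    while len(stream) < n_presentations:
        block = learned_pairs[:]
        random.shuffle(block)
        stream.extend(block)
    return stream[:n_presentations]

def make_2ifc_trial():
    """One 2IFC trial: a familiar (exposed) pairing versus a novel
    recombination drawn from the same stimulus set."""
    familiar = random.choice(learned_pairs)
    # Build a novel pair from an audio and a visual stimulus that were
    # never presented together during exposure.
    while True:
        novel = (random.choice(audio_stimuli), random.choice(visual_stimuli))
        if novel not in learned_pairs:
            break
    intervals = [("familiar", familiar), ("novel", novel)]
    random.shuffle(intervals)  # randomize which interval comes first
    return intervals

if __name__ == "__main__":
    print(exposure_stream(8))
    print(make_2ifc_trial())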

Statistics

53 citations (Semantic Scholar estimate based on the available data).

Cite this paper

@article{Seitz2007SimultaneousAI,
  title={Simultaneous and independent acquisition of multisensory and unisensory associations.},
  author={Aaron R. Seitz and Robyn S. Kim and Virginie van Wassenhove and Ladan Shams},
  journal={Perception},
  year={2007},
  volume={36},
  number={10},
  pages={1445-53}
}