The cerebral bases of the bouba-kiki effect

Nathan Peiffer-Smadja and Laurent D. Cohen

Neural Basis of the Sound-Symbolic Crossmodal Correspondence Between Auditory Pseudowords and Visual Shapes

It is suggested that the observed incongruency effects are likely to reflect phonological processing and/or multisensory attention, and advance the understanding of sound-to-meaning mapping in the brain.

Steady state visual evoked potentials reveal a signature of the pitch-size crossmodal association in visual cortex

A pitch-size congruency effect appears both in the SSVEP over occipital electrodes and in the behavioural data, supporting a low-level account of this crossmodal association and suggesting that this signature in early visual cortices reflects the successful pairing of congruent visual and acoustic object properties.

Seeing Sounds: The Role of Vowels and Consonants in Crossmodal Correspondences

Crossmodal correspondences refer to the fact that certain domains of features in different sensory modalities are associated with each other. Here, we investigated the crossmodal correspondences…

Brain networks underlying the processing of sound symbolism related to softness perception

Results indicate that the insula and medial superior frontal gyrus play a role in processing sound symbolic information and relating it to the tactile softness information.

The Bouba-Kiki Effect in Children With Childhood Apraxia of Speech.

The reduced BK effect in children with CAS supports the notion that cross-modal sensory integration may be altered in these children, which may contribute to speech difficulties in CAS.

Global shape perception contributes to crossmodal correspondences.

The Bouba/Kiki effect was susceptible to the factors modulating the perceptual grouping process from segments to the global contour, suggesting that the Bouba/Kiki effect may occur at the global level of shape perception.

Cross-Modal Correspondence Between Speech Sound and Visual Shape Influencing Perceptual Representation of Shape: the Role of Articulation and Pitch.

It is found that sound-shape correspondence exerts an effect on shape representation by modulating audiovisual interaction, but only in the case of pitch-varying sounds.

The Role and Priming Effect of Pre-Acquired Memories in Abstract Decision-Making

From a neuropsychological perspective, the brain is confronted daily by decision-making processes. Decision-making is influenced by many factors, from biological stimuli to reward assessments. In…

Stimulus Parameters Underlying Sound-Symbolic Mapping of Auditory Pseudowords to Visual Shapes

This work establishes the utility of RSA for analysis of large stimulus sets and offers novel insights into the stimulus parameters underlying sound symbolism, showing that sound-to-shape mapping is driven by acoustic properties of pseudowords and suggesting audiovisual cross-modal correspondence as a basis for language users' sensitivity to this type of sound symbolism.

The effect of prior visual information on recognition of speech and sounds.

Effective connectivity analyses (dynamic causal modeling) suggest that these incongruency effects may emerge via greater bottom-up effects from early auditory regions to intermediate multisensory integration areas (i.e., STS and AG).

Phonological and orthographic influences in the bouba–kiki effect

The results of two studies suggest that the dominant mechanism underlying the bouba–kiki effect for literate subjects is matching based on aligning letter curvature and shape roundedness, which is strong enough to significantly influence word–shape associations even in auditory tasks, where written word forms are never presented to participants.

Cross-Modality Correspondence between Pitch and Spatial Location Modulates Attentional Orienting

The flexible contextual mapping between pitch and location, as well as its susceptibility to top–down control, suggests the pitch-induced cuing effect is primarily mediated by cognitive processes after initial sensory encoding and occurs at a relatively late stage of voluntary attention orienting.

Multisensory Integration in Speech Processing: Neural Mechanisms of Cross-Modal Aftereffects

Traditionally, perceptual neuroscience has focused on unimodal information processing. This is true also for investigations of speech processing, where the auditory modality was the natural focus of…

A Comparison of Primate Prefrontal and Inferior Temporal Cortices during Visual Categorization

The ITC seems more involved in the analysis of currently viewed shapes, whereas the PFC showed stronger category signals, memory effects, and a greater tendency to encode information in terms of its behavioral meaning.

Natural cross-modal mappings between visual and auditory features.

In a series of speeded classification tasks, spontaneous mappings between the auditory feature of pitch and the visual features of vertical location, size, and spatial frequency are found but not contrast.

The effect of temporal asynchrony on the multisensory integration of letters and speech sounds.

The results reveal significant interactions between temporal proximity and content congruency in anterior and posterior auditory association cortex, indicating that temporal synchrony is critical for the integration of letters and speech sounds.

Crossmodal correspondences: A tutorial review

C. Spence, Attention, Perception & Psychophysics, 2011

The literature reviewed here supports the view that crossmodal correspondences need to be considered, alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.