A sudden touch on one hand can improve vision near that hand, revealing crossmodal links in spatial attention. It is often assumed that such links involve only multimodal neural structures, but unimodal brain areas may also be affected. We tested the effect of simultaneous visuo-tactile stimulation on the activity of the human visual cortex. Tactile…
Speech perception can use not only auditory signals but also visual information from seeing the speaker's mouth. The relative timing and relative location of auditory and visual inputs are both known to influence crossmodal integration psychologically, but previous imaging studies of audiovisual speech focused primarily on temporal aspects. Here we…
How do we perceive the visual motion of objects that are accelerated by gravity? We propose that, because vision is only weakly sensitive to accelerations, an internal model that calculates the effects of gravity is derived from graviceptive information, is stored in the vestibular cortex, and is activated by visual motion that appears to be coherent with…
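The internal-model hypothesis above turns on whether observed motion follows free-fall kinematics, h(t) = h0 + v0·t + ½·a·t². As a rough illustration only (not the method used in the study), the sketch below fits a constant-acceleration trajectory to a sampled vertical path and checks whether the fitted acceleration is close to -9.81 m/s²; the function name and sample values are purely hypothetical.

```python
import numpy as np

G = 9.81  # m/s^2, natural gravitational acceleration

def gravity_consistency(times, heights):
    """Fit h(t) = h0 + v0*t + 0.5*a*t^2 to an observed vertical path and
    return the fitted acceleration a. Values near -G suggest motion coherent
    with natural gravity; values near 0 suggest constant-velocity motion."""
    coeffs = np.polyfit(times, heights, deg=2)  # [0.5*a, v0, h0]
    return 2.0 * coeffs[0]

# Toy example: a ball dropped from 10 m, sampled for half a second.
t = np.linspace(0.0, 0.5, 30)
h = 10.0 - 0.5 * G * t**2
print(gravity_consistency(t, h))  # approximately -9.81
```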
During covert attention to peripheral visual targets, presenting a concurrent tactile stimulus at the same location as a visual target can boost neural responses to it, even in sensory-specific occipital areas. Here, we examined such crossmodal spatial-congruence effects in the context of overt spatial orienting, when saccadic eye movements were…
In everyday life, people untrained in formal logic draw simple deductive inferences from linguistic material (i.e., elementary propositional deductions). Little is currently known about the brain areas implicated when such conclusions are drawn. We used event-related fMRI to identify these areas. A set of multiple, independent criteria…
Perception of movement in acoustic space depends on comparison of the sound waveforms reaching the two ears (binaural cues) as well as on spectrotemporal analysis of the waveform at each ear (monaural cues). The relative importance of these two cues differs for perception of vertical versus horizontal motion, with spectrotemporal analysis likely to be more…
Our brain continuously receives complex combinations of sounds originating from different sources and relating to different events in the external world. Timing differences between the two ears can be used to localize sounds in space, but only when the inputs to the two ears have similar spectrotemporal profiles (high binaural coherence). We used fMRI to…
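To make the binaural-cue idea concrete, here is a minimal sketch (not the analysis used in the study) that estimates an interaural time difference by cross-correlating the two ear signals and uses the normalized peak correlation as a crude stand-in for binaural coherence; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def itd_and_coherence(left, right, fs):
    """Estimate the interaural time difference (ITD) by cross-correlating the
    two ear signals, and return the normalized peak correlation as a crude
    'binaural coherence' index (near 1 for matching waveforms, near 0 for
    unrelated ones). Positive ITD means the sound reached the left ear first."""
    left = left - left.mean()
    right = right - right.mean()
    xcorr = np.correlate(right, left, mode="full")
    lags = np.arange(-len(left) + 1, len(right))
    peak = np.argmax(np.abs(xcorr))
    itd = lags[peak] / fs
    coherence = np.abs(xcorr[peak]) / (np.linalg.norm(left) * np.linalg.norm(right))
    return itd, coherence

# Toy check: the right-ear signal is the left-ear signal delayed by ~0.5 ms
# (a large ITD, as for a source far to the left), plus independent noise.
fs = 44100
rng = np.random.default_rng(0)
left = rng.standard_normal(2048)
delay = int(round(0.0005 * fs))  # about 22 samples
right = np.concatenate([np.zeros(delay), left[:-delay]]) + 0.1 * rng.standard_normal(2048)
itd, coh = itd_and_coherence(left, right, fs)
print(f"ITD ~ {itd * 1e3:.2f} ms, coherence ~ {coh:.2f}")
```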
Two identical stimuli, such as a pair of electrical shocks to the skin, are readily perceived as two separate events in time provided the interval between them is sufficiently long. However, as they are presented progressively closer together, there comes a point when the two separate stimuli are perceived as a single stimulus. Damage to posterior parietal…
Deduction allows us to draw consequences from prior knowledge. Deductive reasoning can be applied to several types of problem, for example conditional, syllogistic, and relational. It has been assumed that the same cognitive operations underlie solutions to them all; however, this hypothesis remains to be tested empirically. We used event-related fMRI,…
Correlated fluctuations of low-frequency fMRI signal have been suggested to reflect functional connectivity among the involved regions. However, large-scale correlations are especially prone to spurious global modulations induced by coherent physiological noise. Cardiac and respiratory rhythms are the most problematic components, and a tailored preprocessing…
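Physiological-noise correction of the kind alluded to above is often implemented by regressing sine/cosine functions of the recorded cardiac and respiratory phases out of each voxel time series (RETROICOR-style). The sketch below illustrates such a regression step under that assumption; it is not the tailored preprocessing described in the abstract, and all function names and simulated values are hypothetical.

```python
import numpy as np

def physio_regressors(phase, order=2):
    """Build sine/cosine regressors from a physiological phase trace
    (cardiac or respiratory phase at each fMRI volume), RETROICOR-style."""
    return np.column_stack(
        [f(k * phase) for k in range(1, order + 1) for f in (np.sin, np.cos)]
    )

def remove_physio_noise(voxel_ts, cardiac_phase, resp_phase):
    """Regress cardiac/respiratory regressors (plus an intercept) out of a
    voxel time series and return the residuals."""
    X = np.column_stack(
        [np.ones(len(voxel_ts)),
         physio_regressors(cardiac_phase),
         physio_regressors(resp_phase)]
    )
    beta, *_ = np.linalg.lstsq(X, voxel_ts, rcond=None)
    return voxel_ts - X @ beta

# Toy example with simulated phase traces and a noisy voxel time series.
n_vols = 200
rng = np.random.default_rng(0)
cardiac = np.mod(np.cumsum(rng.uniform(0.3, 0.5, n_vols)), 2 * np.pi)
resp = np.mod(np.cumsum(rng.uniform(0.1, 0.2, n_vols)), 2 * np.pi)
signal = rng.standard_normal(n_vols) + 0.8 * np.sin(cardiac) + 0.5 * np.cos(resp)
cleaned = remove_physio_noise(signal, cardiac, resp)
```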