Self-motion through an environment relies on multiple signals, including visual and vestibular cues. Building upon previous results showing that visual and vestibular signals combine in a statistically optimal fashion, we investigated the relative weights of visual and vestibular cues during self-motion. This experiment comprised three experimental …
The present study investigated the feasibility of acquiring high-density event-related brain potential (ERP) recordings during treadmill walking in human subjects. The work builds upon recent studies testing the applicability of real-world tasks while obtaining electroencephalographic (EEG) recordings. Participants performed a response inhibition GO/NOGO …
The simultaneous presentation of a stimulus in one sensory modality often enhances target detection in another sensory modality, but the neural mechanisms that govern these effects are still under investigation. Here, we test a hypothesis proposed in the neurophysiological literature: that auditory facilitation of visual-target detection operates through …
When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously …
Functional networks are composed of neuronal ensembles bound through synchronization across multiple intrinsic oscillatory frequencies. Various coupled interactions between brain oscillators have been described (e.g., phase-amplitude coupling), but with little evidence that these interactions actually influence perceptual sensitivity. Here, …
Certain features of objects or events can be represented by more than a single sensory system, such as the roughness of a surface (sight, sound, and touch), the location of a speaker (audition and sight), and the rhythm or duration of an event (by all three major sensory systems). Thus, these properties can be said to be sensory-independent or amodal. A key …
Findings in animal models demonstrate that activity within hierarchically early sensory cortical regions can be modulated by cross-sensory inputs through resetting of the phase of ongoing intrinsic neural oscillations. Here, subdural recordings evaluated whether phase resetting by auditory inputs would impact multisensory integration processes in human …
It is well established that sounds can enhance visual-target detection, but the mechanisms that govern these cross-sensory effects, as well as the neural pathways involved, are largely unknown. Here, we tested behavioral predictions stemming from the neurophysiologic and neuroanatomic literature. Participants detected near-threshold visual targets presented …
The present study investigated the feasibility of acquiring electroencephalography (EEG) data during self-motion in human subjects. Subjects performed a visual oddball task, designed to evoke a P3 event-related potential, while being passively moved in the fore-aft direction on a Stewart platform. The results of this study indicate that reliable EEG …
The frequency of environmental vibrations is sampled by two of the major sensory systems, audition and touch, notwithstanding that these signals are transduced through very different physical media and entirely separate sensory epithelia. Psychophysical studies have shown that manipulating frequency in audition or touch can have a significant cross-sensory …