• Publications
Anticipatory Biasing of Visuospatial Attention Indexed by Retinotopically Specific α-Band Electroencephalography Increases over Occipital Cortex
TLDR
Alpha-band (8-14 Hz) oscillatory EEG activity was examined with high-density scalp electrical recording during the cue-stimulus interval of an endogenous spatial cueing paradigm and results are consistent with active gating of uncued spatial locations.
Increases in alpha oscillatory power reflect an active retinotopic mechanism for distracter suppression during sustained visuospatial attention.
TLDR
Bilateral flickering stimuli were presented simultaneously and continuously over entire trial blocks, so that externally evoked alpha desynchronization was equated across precue baseline and postcue intervals; the results suggest that alpha synchronization reflects an active attentional suppression mechanism rather than a passive one reflecting "idling" circuits.
Attentional Selection in a Cocktail Party Environment Can Be Decoded from Single-Trial EEG.
TLDR
It is shown that single-trial, unaveraged EEG data can be decoded to determine attentional selection in a naturalistic multispeaker environment, and that this EEG-based measure of attention correlates significantly with performance on a high-level attention task.
The role of cingulate cortex in the detection of errors with and without awareness: a high‐density electrical mapping study
TLDR
The data suggest that the ACC might participate in both preconscious and conscious error detection and that cortical arousal provides a necessary setting condition for error awareness.
The Role of Alpha-Band Brain Oscillations as a Sensory Suppression Mechanism during Selective Attention
TLDR
Findings in the context of intersensory selective attention as well as intrasensory spatial and feature-based attention in the visual, auditory, and tactile domains are discussed.
Flow of activation from V1 to frontal cortex in humans
TLDR
Data suggest that activity represented in the "early" ERP components such as P1 and N1 (and possibly even C1) is likely to reflect relatively late processing, after the initial volley of sensory afference through the visual system and involving top-down influences from parietal and frontal regions.
Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments.
TLDR
It is contended that the multisensory speech system is maximally tuned for SNRs between extremes, where the system relies on either the visual (speech-reading) or the auditory modality alone, forming a window of maximal integration at intermediate SNR levels.
Grabbing your ear: rapid auditory-somatosensory multisensory interactions in low-level sensory cortices are not constrained by stimulus alignment.
TLDR
Psychophysical and electrophysiological indices are used to show that auditory-somatosensory interactions in humans occur via the same early sensory mechanism whether stimuli are in or out of spatial register, and that these interactions arise across wide peripersonal spatial separations, remarkably early in sensory processing and in cortical regions traditionally considered unisensory.
The Spatiotemporal Dynamics of Illusory Contour Processing: Combined High-Density Electrical Mapping, Source Analysis, and Functional Magnetic Resonance Imaging
TLDR
It is proposed that IC sensitivity described in V2 and V1 may reflect predominantly feedback modulation from higher-tier LOC areas, where IC sensitivity first occurs, and two additional observations further support this proposal.
...