Sudden changes in the acoustic environment enhance perceptual processing of subsequent visual stimuli that appear in close spatial proximity. Little is known, however, about the neural mechanisms by which salient sounds affect visual processing. In particular, it is unclear whether such sounds automatically activate visual cortex. To shed light on this …
The current study employed a modified gambling task, in which probabilistic cues were provided to elicit positive or negative expectations. Event-related potentials (ERPs) to "final outcome" and "probabilistic cues" were analyzed. Difference waves between the negative condition and the corresponding positive condition were examined. The results confirm that …
It is widely reported that inverting a face dramatically affects its recognition. Previous studies have shown that face inversion increases the amplitude and delays the latency of the face-specific N170 component of the event-related potential (ERP) and also enhances the amplitude of the occipital P1 component (latency 100-132 ms). The present study …
A recent study in humans (McDonald et al., 2013) found that peripheral, task-irrelevant sounds activated contralateral visual cortex automatically, as revealed by an auditory-evoked contralateral occipital positivity (ACOP) recorded from the scalp. The present study investigated the functional significance of this cross-modal activation of visual cortex, in …
Although the hypothesis that nestedness determines mutualistic ecosystem dynamics is generally accepted, the results of some recent data analyses and theoretical studies have begun to cast doubt on the impact of nestedness on ecosystem stability. However, definite conclusions have not yet been reached, because previous studies are mainly based on numerical …
Using measurements of event-related potentials (ERPs) during a facial recognition task, we aimed to investigate the face inversion effect and the role of time-based attention in processing upright and inverted faces. We presented upright and inverted faces at the T2 (target 2) position using a rapid serial visual presentation paradigm. Our results …
Previous studies have shown differential event-related potentials (ERPs) to different intensities of fearful facial expressions. There are indications that the eyes may be particularly relevant for the recognition of fearful expressions; even the amount of white sclera exposed above and to the sides of the dark pupil can activate an amygdala response. To investigate …
Music conveys emotion through the manipulation of musical structures, particularly musical mode and tempo. The neural correlates of musical mode and tempo perception as revealed by electroencephalography (EEG) have not been adequately addressed in the literature. This study used independent component analysis (ICA) to systematically assess spatio-spectral EEG …
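As a minimal illustration of the ICA technique mentioned above (not the study's actual pipeline — the signals, mixing matrix, and library choice here are assumptions for the sketch), FastICA can unmix synthetic "EEG-like" channel data back into its underlying sources:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic sources: a 10 Hz sinusoid (alpha-band-like) and a
# slower square wave. Real EEG sources are, of course, far messier.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * 10 * t)
s2 = np.sign(np.sin(2 * np.pi * 3 * t))
S = np.c_[s1, s2]                          # true sources, shape (2000, 2)

# Hypothetical mixing matrix: each "channel" records a blend of sources.
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
X = S @ A.T                                # observed channel data

# FastICA estimates statistically independent components from X alone.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)               # estimated components, (2000, 2)
```

Note that ICA recovers components only up to permutation, sign, and scale, which is why matching recovered components to sources (or to scalp topographies, in the EEG setting) is a separate post-hoc step.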
This article reviews a series of experiments that combined behavioral and electrophysiological recording techniques to explore the hypothesis that salient sounds attract attention automatically and facilitate the processing of visual stimuli at the sound's location. This cross-modal capture of visual attention was found to occur even when the attracting …
Bodily state plays a critical role in our perception. In the present study, we asked whether and how the bodily experience of weight influences time perception. Participants judged the duration of a picture (a backpack or a trolley bag) presented on the screen while wearing backpacks of different weights or no backpack at all. The results showed that the …