Dirk Wildgruber

During acoustic communication among human beings, emotional information can be expressed both by the propositional content of verbal utterances and by the modulation of speech melody (affective prosody). It is well established that linguistic processing is bound predominantly to the left hemisphere of the brain. By contrast, the encoding of emotional …
Functional magnetic resonance imaging (fMRI) was employed to determine areas of activation in the cerebellar cortex in 46 human subjects during a series of motor tasks. To reduce the variance due to differences in individual anatomy, a specific transformational procedure for the cerebellum was introduced. The activation areas for movements of lips, tongue, …
Aside from spoken language, singing represents a second mode of acoustic (auditory-vocal) communication in humans. As a new aspect of brain lateralization, functional magnetic resonance imaging (fMRI) revealed two complementary cerebral networks subserving singing and speaking. Reproduction of a non-lyrical tune elicited activation predominantly in the …
Damage to the anterior peri-intrasylvian cortex of the dominant hemisphere may give rise to a fairly consistent syndrome of articulatory deficits in the absence of relevant paresis of orofacial or laryngeal muscles (apraxia of speech, aphemia, or phonetic disintegration). The available clinical data are ambiguous with respect to the relevant lesion site, …
During acoustic communication in humans, information about a speaker's emotional state is predominantly conveyed by modulation of the tone of voice (emotional or affective prosody). Based on lesion data, a right hemisphere superiority for cerebral processing of emotional prosody has been assumed. However, the available clinical studies do not yet provide a …
BACKGROUND: There are few data on the cerebral organization of motor aspects of speech production and the pathomechanisms of dysarthric deficits subsequent to brain lesions and diseases. The authors used fMRI to further examine the neural basis of speech motor control. METHODS AND RESULTS: In eight healthy volunteers, fMRI was performed during syllable …
In a natural environment, non-verbal emotional communication is multimodal (e.g. speech melody, facial expression) and multifaceted concerning the variety of expressed emotions. Understanding these communicative signals and integrating them into a common percept is paramount to successful social behaviour. While many previous studies have focused on the …
This study was conducted to investigate the connectivity architecture of neural structures involved in the processing of emotional speech melody (prosody). Twenty-four subjects underwent event-related functional magnetic resonance imaging (fMRI) while rating the emotional valence of either the prosody or the semantics of binaurally presented adjectives. Conventional analysis of …
In addition to the propositional content of verbal utterances, significant linguistic and emotional information is conveyed by the tone of speech. To differentiate brain regions subserving processing of linguistic and affective aspects of intonation, discrimination of sentences differing in linguistic accentuation and emotional expressiveness was evaluated …
We determined the location, functional response profile, and structural fiber connections of auditory areas with voice- and emotion-sensitive activity using functional magnetic resonance imaging (fMRI) and diffusion tensor imaging. Bilateral regions responding to emotional voices were consistently found in the superior temporal gyrus, posterolateral to the …