Ruth Campbell

BACKGROUND Integrating information from the different senses markedly enhances the detection and identification of external stimuli. Compared with unimodal inputs, semantically and/or spatially congruent multisensory cues speed discrimination and improve reaction times. Discordant inputs have the opposite effect, reducing performance and slowing responses.
Watching a speaker's lips during face-to-face conversation (lipreading) markedly improves speech perception, particularly in noisy conditions. With functional magnetic resonance imaging it was found that these linguistic visual cues are sufficient to activate auditory cortex in normal hearing individuals in the absence of auditory speech sounds. Two further …
Can the cortical substrates for the perception of face actions be distinguished when the superficial visual qualities of these actions are very similar? Two fMRI experiments are reported. Compared with watching the face at rest, observing silent speech was associated with bilateral activation in a number of temporal cortical regions, including the superior …
Batrachochytrium dendrobatidis is a fungus belonging to the Phylum Chytridiomycota, Class Chytridiomycetes, Order Chytridiales, and is the highly infectious aetiological agent responsible for a potentially fatal disease, chytridiomycosis, which is currently decimating many of the world's amphibian populations. The fungus infects 2 amphibian orders (Anura …
BACKGROUND We assessed motion processing in a group of high functioning children with autism and a group of typically developing children, using a coherent motion detection task. METHOD Twenty-five children with autism (mean age 11 years, 8 months) and 22 typically developing children matched for non-verbal mental ability and chronological age were …
Speech is perceived both by ear and by eye. Unlike heard speech, some seen speech gestures can be captured in stilled image sequences. Previous studies have shown that in hearing people, natural time-varying silent seen speech can access the auditory cortex (left superior temporal regions). Using functional magnetic resonance imaging (fMRI), the present …
Integrating information across the senses can enhance our ability to detect and classify stimuli in the environment. For example, auditory speech perception is substantially improved when the speaker's face is visible. In an fMRI study designed to investigate the neural mechanisms underlying these crossmodal behavioural gains, bimodal (audio-visual) speech …
Children with autistic spectrum disorder and controls performed tasks of coherent motion and form detection, and motor control. Additionally, the ratio of the 2nd and 4th digits of these children, which is thought to be an indicator of foetal testosterone, was measured. Children in the experimental group were impaired at tasks of motor control, and had …
One of the most commonly cited examples of human multisensory integration occurs during exposure to natural speech, when the vocal and the visual aspects of the signal are integrated in a unitary percept. Audiovisual association of facial gestures and vocal sounds has been demonstrated in nonhuman primates and in prelinguistic children, arguing for a …
Recent findings suggest that children with autism may be impaired in the perception of biological motion from moving point-light displays. Some children with autism also have abnormally high motion coherence thresholds. In the current study we tested a group of children with autism and a group of typically developing children aged 5 to 12 years on …