Can linguistic semantics affect neural processing in feature-specific visual regions? Specifically, when we hear a sentence describing a situation that involves motion, do we engage neural processes that are part of the visual perception of motion? And what if a motion verb is used figuratively rather than literally? We used fMRI to investigate whether semantic …
Positron emission tomography was used to investigate whether signed languages exhibit the same neural organization for lexical retrieval within classical and non-classical language areas as has been described for spoken English. Ten deaf native American Sign Language (ASL) signers were shown pictures of unique entities (famous persons) and non-unique …
Recognition of emotional facial expressions is universal among humans, but signed language users must also recognize certain non-affective facial expressions as linguistic markers. fMRI was used to investigate the neural systems underlying recognition of these functionally distinct expressions, comparing deaf ASL signers and hearing nonsigners. Within the …
Positron emission tomography was used to investigate whether the motor-iconic basis of certain forms in American Sign Language (ASL) partially alters the neural systems engaged during lexical retrieval. Most ASL nouns denoting tools and ASL verbs referring to tool-based actions are produced with a handshape representing the human hand holding a tool and …
We examined word-level reading circuits in skilled deaf readers whose primary language is American Sign Language, and hearing readers matched for reading ability (college level). During fMRI scanning, participants performed a semantic decision (concrete concept?), a phonological decision (two syllables?), and a false-font control task (string underlined?). …
A [(15)O]water PET experiment was conducted to investigate the neural regions engaged in processing constructions unique to signed languages: classifier predicates in which the position of the hands in signing space schematically represents spatial relations among objects. Ten deaf native signers viewed line drawings depicting a spatial relation between two …
Rather than specifying spatial relations with a closed-class set of prepositions, American Sign Language (ASL) encodes spatial relations using space itself via classifier constructions. In these constructions, handshape morphemes specify object type, and the position of the hands in signing space schematically represents the spatial relation between …
Bimodal bilinguals are hearing individuals who know both a signed and a spoken language. Effects of bimodal bilingualism on behavior and brain organization are reviewed, and an fMRI investigation of the recognition of facial expressions by ASL-English bilinguals is reported. The fMRI results reveal separate effects of sign language and spoken language …
Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX …
To investigate the impact of sensory-motor systems on the neural organization for language, we conducted an H2(15)O PET study of sign and spoken word production (picture-naming) and an fMRI study of sign and audio-visual spoken language comprehension (detection of a semantically anomalous sentence) with hearing bilinguals who are native users of American …