Jared M J Berman

An eye-tracking methodology was used to evaluate 3- and 4-year-old children's sensitivity to speaker affect when resolving referential ambiguity. Children were presented with pictures of three objects on a screen (including two referents of the same kind, e.g., an intact doll and a broken doll, and one distracter item), paired with a prerecorded …
We examined whether preschoolers' ontological knowledge would influence lexical extension. In Experiment 1, four-year-olds were presented with a novel label for either an object with eyes described as an animal, or the same object without eyes described as a tool. In the animal condition, children extended the label to similar-shaped objects, whereas in the …
An eye-tracking methodology was used to examine the time course of 3- and 5-year-olds' ability to link speech bearing different acoustic cues to emotion (i.e., happy-sounding, neutral, and sad-sounding intonation) to photographs of faces reflecting different emotional expressions. Analyses of saccadic eye movement patterns indicated that, for both 3- and …
In three experiments, we investigated 5-year-olds' sensitivity to speaker vocal affect during referential interpretation in cases where the indeterminacy is or is not resolved by speech information. In Experiment 1, analyses of eye gaze patterns and pointing behaviours indicated that 5-year-olds used vocal affect cues at the point where an ambiguous …
Successful communication requires the recognition of the intentions that underlie language use. One relevant cue is the vocal affect that often accompanies speech. For example, "What a day" means something very different if spoken with negative, frustrated-sounding vs. happy-sounding vocal affect. Previous research has suggested that young children may not …
We have developed a new software application, Eye-gaze Language Integration Analysis (ELIA), which allows for the rapid integration of gaze data with spoken language input (either live or prerecorded). Specifically, ELIA integrates E-Prime output and/or .csv files that include eye-gaze and real-time language information. The process of combining eye …
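For illustration only, the sketch below shows one way gaze samples could be aligned with word-level speech timing from .csv files using pandas; it is not ELIA's actual implementation, and all column names (time_ms, aoi, word, onset_ms) are hypothetical examples.

```python
# Minimal sketch of gaze/language integration, assuming hypothetical CSV columns.
# This is NOT ELIA's API; it only illustrates the kind of time alignment involved.
import pandas as pd

# Hypothetical gaze samples: one row per sample, with a timestamp and the
# area of interest (AOI) the child was fixating at that moment.
gaze = pd.DataFrame({
    "time_ms": [0, 50, 100, 150, 200, 250],
    "aoi": ["distracter", "distracter", "target", "target", "target", "target"],
})

# Hypothetical word-level onsets for the spoken instruction.
speech = pd.DataFrame({
    "onset_ms": [0, 120, 220],
    "word": ["find", "the", "doll"],
})

# Align each gaze sample with the most recent word onset, so fixations can be
# analysed relative to the unfolding utterance.
merged = pd.merge_asof(
    gaze.sort_values("time_ms"),
    speech.sort_values("onset_ms"),
    left_on="time_ms",
    right_on="onset_ms",
    direction="backward",
)
print(merged[["time_ms", "word", "aoi"]])
```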
Two experiments examined whether 5-year-olds draw inferences about desire outcomes that constrain their online interpretation of an utterance. Children were informed of a speaker's positive (Experiment 1) or negative (Experiment 2) desire to receive a specific toy as a gift before hearing a referentially ambiguous statement ("That's my present") spoken with …
Two experiments examined 4- and 5-year-olds' use of vocal affect to learn new words. In Experiment 1 (n = 48), children were presented with two unfamiliar objects, first in their original state and then in an altered state (broken or enhanced). An instruction produced with negative, neutral, or positive affect directed children to find the referent of a …