We introduce a new problem domain for activity recognition: the analysis of children's social and communicative behaviors based on video and audio data. We specifically target interactions between children aged 1-2 years and an adult. Such interactions arise naturally in the diagnosis and treatment of developmental disorders such as autism. We introduce a…
Recent work on comfortable wearable sensors has focused almost entirely on monitoring physical activity, ignoring opportunities to monitor more subtle phenomena such as the quality of social interactions. We argue that it is compelling to ask whether physiological sensors can shed light on the quality of social interactive behavior. This work…
Behavioral imaging encompasses the use of computational sensing and modeling techniques to measure and analyze human behavior. This article discusses a research program focused on the study of dyadic social interactions between children and their caregivers and peers. The study has resulted in a dataset containing semi-structured play interactions between…
We describe a system for detecting moments of eye contact between an adult and a child, based on a single pair of gaze-tracking glasses which are worn by the adult. Our method utilizes commercial gaze tracking technology to determine the adult's point of gaze, and combines this with computer vision analysis of video of the child's face to determine their…
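As a rough sketch of how such a fusion step could look, the snippet below assumes two hypothetical per-frame inputs that a pipeline like this would supply: the adult's point of gaze from the glasses, and a child face box together with a looking-toward-adult score from the face/gaze analysis. It illustrates the decision logic only and is not the authors' implementation.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class FrameObservation:
    adult_pog: Tuple[float, float]  # adult's point of gaze in image coordinates (from the glasses)
    child_face_box: Optional[Tuple[float, float, float, float]]  # (x, y, w, h), or None if no face found
    child_gaze_score: float  # estimated probability that the child is looking toward the adult

def is_eye_contact(obs: FrameObservation, gaze_thresh: float = 0.5) -> bool:
    """Flag a frame as eye contact when the adult's point of gaze falls inside
    the child's face box and the child appears to be looking back."""
    if obs.child_face_box is None:
        return False
    gx, gy = obs.adult_pog
    x, y, w, h = obs.child_face_box
    adult_looks_at_child = (x <= gx <= x + w) and (y <= gy <= y + h)
    child_looks_back = obs.child_gaze_score >= gaze_thresh
    return adult_looks_at_child and child_looks_back

def eye_contact_frames(observations: List[FrameObservation]) -> List[int]:
    """Return indices of frames flagged as mutual eye contact."""
    return [i for i, obs in enumerate(observations) if is_eye_contact(obs)]

In practice the per-frame decisions would also be smoothed over time so that brief detector glitches do not fragment a single moment of eye contact.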
Severe behavior problems of children with developmental disabilities often require intervention by specialists. These specialists rely on direct observation of the behavior, usually in a controlled clinical environment. In this paper, we present a technique for using on-body accelerometers to assist in automated classification of problem behavior during…
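One common recipe for accelerometer pipelines of this kind is to segment the signal into fixed-length windows, compute simple statistical features per window, and train an off-the-shelf classifier against labels obtained from direct observation. The sketch below follows that generic recipe with assumed window length, features, classifier, and placeholder data; it is not the paper's specification.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc: np.ndarray, win: int = 100, step: int = 50) -> np.ndarray:
    """Slice a (T, 3) accelerometer stream into windows and compute
    per-axis mean/std plus magnitude statistics for each window."""
    feats = []
    for start in range(0, len(acc) - win + 1, step):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0), [mag.mean(), mag.std()]]))
    return np.asarray(feats)

# Placeholder data: in a real study, labels would come from specialists' direct observation.
rng = np.random.default_rng(0)
acc_stream = rng.normal(size=(2000, 3))          # stand-in for a recorded accelerometer session
X = window_features(acc_stream)
y = rng.integers(0, 2, size=len(X))              # stand-in labels: problem behavior vs. not
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(window_features(acc_stream[:300])))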
We report on a nine-month-long observational study with teachers and students with autism in a classroom setting. We explore the impact of motion-based activities on students' behavior. In particular, we examine how the playful gaming activity impacted students' engagement, peer-directed social behaviors, and motor skills. We document the effectiveness of a(More)
We examined facial electromyography (fEMG) activity in response to dynamic, audio-visual emotional displays in individuals with autism spectrum disorders (ASD) and typically developing (TD) individuals. Participants viewed clips of happy, angry, and fearful displays that contained both facial expression and affective prosody while surface electrodes measured corrugator…
The Variable Time-Shift Hidden Markov Model (VTS-HMM) is proposed for learning and modeling pairs of correlated streams. Unlike previous coupled models for time series, the VTS-HMM accounts for varying time shifts between correlated events in pairs of streams having different properties. The VTS-HMM is learned on a set of pairs of unaligned streams and…
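The full VTS-HMM is not reproduced here, but the snippet below illustrates, under simple assumptions, the phenomenon it models: events in one stream echo events in the other with an event-specific delay, so alignment has to estimate a time shift locally rather than once for the whole pair. The local cross-correlation estimator shown is only an intuition aid, not the probabilistic model itself.

import numpy as np

def local_time_shift(a: np.ndarray, b: np.ndarray, center: int,
                     half_window: int = 20, max_shift: int = 10) -> int:
    """Estimate how far stream b lags (positive) or leads (negative) stream a
    around index `center`, by maximizing local correlation over candidate shifts."""
    lo, hi = center - half_window, center + half_window
    if lo < 0 or hi > len(a):
        return 0
    seg_a = a[lo:hi]
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        if lo + s < 0 or hi + s > len(b):
            continue
        score = float(np.dot(seg_a, b[lo + s:hi + s]))
        if score > best_score:
            best_score, best_shift = score, s
    return best_shift

# Toy example: stream b repeats stream a's events after a varying delay.
rng = np.random.default_rng(1)
a = (rng.random(500) < 0.05).astype(float)
b = np.zeros_like(a)
for i in np.flatnonzero(a):
    j = i + rng.integers(2, 8)       # event-specific delay
    if j < len(b):
        b[j] = 1.0
print([local_time_shift(a, b, int(i)) for i in np.flatnonzero(a)[:5]])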
We propose a system for detecting bids for eye contact directed from a child to an adult who is wearing a point-of-view camera. The camera captures an egocentric view of the child-adult interaction from the adult's perspective. We detect and analyze the child's face in the egocentric video in order to automatically identify moments in which the child is…
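To make the first stage concrete, the sketch below runs a stock OpenCV face detector over an egocentric video and reports the frames in which a face is found; the paper's actual detector, and the model that decides whether a detected face constitutes a bid for eye contact, are not shown, and the video path is a placeholder.

import cv2

# Stock Haar-cascade face detector shipped with OpenCV (not the paper's detector).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_frames(video_path: str):
    """Yield (frame_index, face_boxes) for frames in which a face is detected."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(boxes) > 0:
            yield idx, boxes
        idx += 1
    cap.release()

# for i, boxes in face_frames("child_adult_session.mp4"):   # placeholder path
#     print(i, boxes)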
In this work we analyze how a child's engagement is expressively manifested in the child's speech as well as in the speech of the psychologist interacting with the child. Visual cues such as facial gestures and gaze are known to be informative of engagement, but here we examine the less-studied speech cues of children's non-verbal vocalizations. We study…
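A minimal sketch, assuming librosa and a placeholder audio file, of extracting two simple prosodic descriptors (fundamental frequency and short-time energy) of the kind such speech analyses typically start from; the paper's actual acoustic features and models are not specified here.

import numpy as np
import librosa

def prosodic_summary(wav_path: str, sr: int = 16000) -> dict:
    """Summarize pitch and energy of an utterance as a small feature vector."""
    y, sr = librosa.load(wav_path, sr=sr)
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr)
    rms = librosa.feature.rms(y=y)[0]
    voiced_f0 = f0[~np.isnan(f0)]
    return {
        "f0_mean": float(voiced_f0.mean()) if voiced_f0.size else 0.0,
        "f0_std": float(voiced_f0.std()) if voiced_f0.size else 0.0,
        "energy_mean": float(rms.mean()),
        "energy_std": float(rms.std()),
    }

# features = prosodic_summary("child_vocalization.wav")   # placeholder path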