Access to well-labeled recordings of facial expression is critical to progress in automated facial expression recognition. With few exceptions, publicly available databases are limited to posed facial behavior that can differ markedly in conformation, intensity, and timing from what occurs spontaneously. To meet the need for publicly available corpora of …
This paper presents a framework to automatically measure the intensity of naturally occurring facial actions. Naturalistic expressions are non-posed, spontaneous actions. The Facial Action Coding System (FACS) is the gold-standard technique for describing facial expressions, which are parsed into comprehensive, nonoverlapping action units (AUs). AUs have …
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two six-month-old infant/mother dyads, each of which engaged in a face-to-face interaction. …
In this paper, we present an approach for 3D face recognition from frontal range data based on the ridge lines on the surface of the face. We use the maximum principal curvature, k_max, to represent the face image as a 3D binary image called the ridge image. The ridge image shows the locations of the ridge points around the important facial regions of the face (i.e., …
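As a rough illustration of the ridge-image idea, the sketch below estimates the principal curvatures of a frontal depth map from the first and second fundamental forms and thresholds k_max to produce a binary ridge image. This is a minimal sketch, not the authors' implementation; the finite-difference derivatives, the `ridge_image` helper, and the threshold value are assumptions.

```python
import numpy as np

def ridge_image(depth, threshold=0.05):
    """Binary ridge map from a frontal range (depth) image.

    Estimates the principal curvatures of the surface z = depth(y, x)
    and marks pixels where the maximum principal curvature k_max is
    large. The threshold is an illustrative placeholder.
    """
    depth = depth.astype(float)

    # First- and second-order partial derivatives of the depth surface.
    zy, zx = np.gradient(depth)        # d/dy, d/dx
    zxy, zxx = np.gradient(zx)
    zyy, _ = np.gradient(zy)

    # First fundamental form coefficients.
    E = 1.0 + zx ** 2
    F = zx * zy
    G = 1.0 + zy ** 2

    # Second fundamental form coefficients.
    denom = np.sqrt(1.0 + zx ** 2 + zy ** 2)
    L = zxx / denom
    M = zxy / denom
    N = zyy / denom

    # Gaussian (K) and mean (H) curvature, then the maximum principal curvature.
    K = (L * N - M ** 2) / (E * G - F ** 2)
    H = (E * N - 2.0 * F * M + G * L) / (2.0 * (E * G - F ** 2))
    k_max = H + np.sqrt(np.maximum(H ** 2 - K, 0.0))

    # Ridge points: locations of high maximum principal curvature.
    return (k_max > threshold).astype(np.uint8)
```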
Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were …
This paper presents an approach for measuring and monitoring human body joint angles using inertial measurement unit (IMU) sensors. This type of monitoring is beneficial for therapists and physicians because it facilitates remote assessment of patient activities. In our approach, two IMUs are mounted on the upper leg and the lower leg to measure the Euler …
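A minimal sketch of how a knee joint angle can be recovered from two sensor orientations is shown below. The quaternion convention, the 'xyz' Euler sequence, and the `knee_angles` helper are illustrative assumptions, not the paper's implementation; it assumes both IMUs report orientation in a common world frame.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def knee_angles(q_upper, q_lower):
    """Relative joint angles (Euler, degrees) between two IMU orientations.

    q_upper, q_lower: orientation quaternions [x, y, z, w] from the IMUs
    on the upper and lower leg, expressed in a common world frame.
    """
    r_upper = R.from_quat(q_upper)
    r_lower = R.from_quat(q_lower)
    # Rotation of the lower-leg frame relative to the upper-leg frame.
    r_joint = r_upper.inv() * r_lower
    return r_joint.as_euler('xyz', degrees=True)

# Example: a pure 30-degree flexion about the x axis of the upper-leg frame.
q_up = R.identity().as_quat()
q_low = R.from_euler('x', 30, degrees=True).as_quat()
print(knee_angles(q_up, q_low))  # approximately [30, 0, 0]
```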
Automated Facial Expression Recognition (FER) has remained a challenging and interesting problem in computer vision. Despite efforts to develop various methods for FER, existing approaches lack generalizability when applied to unseen images or to images captured in the wild (i.e., their results do not hold in unconstrained settings). Most of the existing …
We investigated the dynamics of head motion in parents and infants during an age-appropriate, well-validated emotion induction, the Face-to-Face/Still-Face procedure. Participants were 12 ethnically diverse 6-month-old infants and their mother or father. During infant gaze toward the parent, infant angular amplitude and velocity of pitch and yaw decreased …
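To make the kind of summary measure concrete, the sketch below computes per-axis angular amplitude and velocity from pitch and yaw time series. The RMS amplitude, the frame-differencing velocity estimate, and the `head_motion_stats` helper are assumptions for illustration rather than the study's actual pipeline.

```python
import numpy as np

def head_motion_stats(pitch, yaw, fs=30.0):
    """Angular amplitude and velocity summaries for pitch/yaw time series.

    pitch, yaw: head pose angles in degrees, one sample per video frame.
    fs: frame rate in Hz.
    """
    stats = {}
    for name, angle in (("pitch", np.asarray(pitch, float)),
                        ("yaw", np.asarray(yaw, float))):
        centered = angle - angle.mean()
        velocity = np.diff(angle) * fs          # degrees per second
        stats[name] = {
            "amplitude_rms": float(np.sqrt(np.mean(centered ** 2))),
            "velocity_mean_abs": float(np.mean(np.abs(velocity))),
        }
    return stats
```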