Access to well-labeled recordings of facial expression is critical to progress in automated facial expression recognition. With few exceptions, publicly available databases are limited to posed facial behavior that can differ markedly in conformation, intensity, and timing from what occurs spontaneously. To meet the need for publicly available corpora of …
This paper presents a framework to automatically measure the intensity of naturally occurring facial actions. Naturalistic expressions are non-posed, spontaneous actions. The Facial Action Coding System (FACS) is the gold-standard technique for describing facial expressions, which are parsed as comprehensive, nonoverlapping action units (AUs). AUs have …
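The sketch below is a minimal, hypothetical illustration of the general idea of AU intensity estimation, not the framework from this paper: per-frame appearance features are regressed onto the 0-5 FACS intensity scale of a single action unit with an off-the-shelf regressor. The feature dimensionality, the SVR choice, and the random stand-in training data are all assumptions made for illustration.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical sketch: regress the 0-5 FACS intensity of one AU from
# per-frame appearance features (random stand-ins used here).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 64))    # appearance features, one row per video frame
y_train = rng.integers(0, 6, size=200)  # manually coded AU intensities on the 0-5 scale

model = SVR(kernel="rbf").fit(X_train, y_train)
predicted = np.clip(model.predict(X_train[:5]), 0, 5)  # estimated intensities for 5 frames
```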
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two six-month-old infant/mother dyads who each engaged in a face-to-face interaction. …
We investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video-recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were …
In this paper, we present an approach for 3D face recognition from frontal range data based on the ridge lines on the surface of the face. We use the maximum principal curvature, k_max, to represent the face image as a 3D binary image called the ridge image. The ridge image shows the locations of the ridge points around the important facial regions of the face (i.e., …
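A hedged sketch of how a binary ridge image might be computed, assuming the frontal range data are given as a dense depth map z = f(x, y): principal curvatures are estimated from finite differences, and pixels whose maximum principal curvature k_max exceeds a threshold are marked as ridge points. The finite-difference scheme and the threshold are assumptions, not details taken from the paper.

```python
import numpy as np

def ridge_image(depth, threshold=0.05):
    """Binary ridge map from the maximum principal curvature of a range image (sketch)."""
    fy, fx = np.gradient(depth)        # first derivatives of z = f(x, y)
    fxy, fxx = np.gradient(fx)         # second derivatives
    fyy, _ = np.gradient(fy)

    # Fundamental forms of the Monge patch z = f(x, y)
    E, F, G = 1 + fx**2, fx * fy, 1 + fy**2
    denom = np.sqrt(1 + fx**2 + fy**2)
    L, M, N = fxx / denom, fxy / denom, fyy / denom

    K = (L * N - M**2) / (E * G - F**2)                     # Gaussian curvature
    H = (E * N - 2 * F * M + G * L) / (2 * (E * G - F**2))  # mean curvature
    k_max = H + np.sqrt(np.maximum(H**2 - K, 0.0))          # maximum principal curvature

    return (k_max > threshold).astype(np.uint8)             # 1 = ridge point
```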
This paper presents an approach for measuring and monitoring human body joint angles using inertial measurement unit (IMU) sensors. This type of monitoring is beneficial for therapists and physicians because it facilitates remote assessment of patient activities. In our approach, two IMUs are mounted on the upper leg and the lower leg to measure the Euler …
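A minimal sketch of the kind of computation involved, assuming each IMU reports its orientation as Euler angles in a shared world frame: the relative rotation between the upper-leg and lower-leg frames is formed, and its total rotation angle is taken as the knee angle. The Euler convention and the use of the rotation magnitude are illustrative assumptions rather than the paper's exact definition.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def knee_angle_deg(upper_euler_deg, lower_euler_deg, order="xyz"):
    """Knee angle from the Euler angles reported by two leg-mounted IMUs (sketch)."""
    r_upper = R.from_euler(order, upper_euler_deg, degrees=True)  # thigh orientation
    r_lower = R.from_euler(order, lower_euler_deg, degrees=True)  # shank orientation
    r_rel = r_upper.inv() * r_lower                    # shank expressed in the thigh frame
    return np.degrees(np.linalg.norm(r_rel.as_rotvec()))  # magnitude of the relative rotation

print(knee_angle_deg([0, 0, 0], [30, 0, 0]))   # ~30.0 degrees of flexion about x
```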
The Get-Up-and-Go Test is commonly used by physicians to assess the physical mobility of the elderly. This paper presents a method for automatic analysis and classification of human gait in the Get-Up-and-Go Test using a Microsoft Kinect sensor. Two types of features are automatically extracted from the human skeleton data provided by the Kinect sensor. The …
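A toy illustration of skeleton-based gait features, not the paper's actual feature set: given (T, 3) arrays of Kinect joint positions, it derives a few plausible descriptors (test duration, mean walking speed from the hip trajectory, and a crude step count from peaks in the ankle-to-ankle distance). The joint choices, frame rate, and peak heuristic are all assumptions.

```python
import numpy as np

def gait_features(hip_xyz, left_ankle_xyz, right_ankle_xyz, fps=30):
    """Toy gait descriptors from Kinect skeleton joints (illustrative only)."""
    duration = len(hip_xyz) / fps                                   # test time in seconds
    path = np.linalg.norm(np.diff(hip_xyz, axis=0), axis=1).sum()   # hip path length (m)
    speed = path / duration                                         # mean walking speed (m/s)

    # Crude step count: local maxima of the distance between the ankles
    sep = np.linalg.norm(left_ankle_xyz - right_ankle_xyz, axis=1)
    steps = int(np.sum((sep[1:-1] > sep[:-2]) & (sep[1:-1] > sep[2:])))

    return {"duration_s": duration, "mean_speed_mps": speed, "step_count": steps}
```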
Automated Facial Expression Recognition (FER) has remained a challenging and interesting problem in computer vision. Despite efforts made in developing various methods for FER, existing approaches lack generalizability when applied to unseen images or images captured in the wild (i.e., the results are not significant). Most of the existing …