Towards online and personalized daily activity recognition, habit modeling, and anomaly detection for the solitary elderly through unobtrusive sensing
This paper proposes a two-stage action recognition approach for detecting the arm gestures associated with human eating and drinking. Information retrieved from such a system can be used for daily life monitoring. We demonstrate that eating and drinking actions can be characterized and detected using wearable inertial sensors alone. The approach consists of two steps: feature extraction and classification. Because arm movement is the principal feature of eating activity, the first step extracts features from the raw arm-movement data. A kinematic model of the movement in 3D space is first built based on Euler angles, and an extended Kalman filter (EKF) is applied to extract the eating-action features in three-dimensional space in real time. The second step is classification: a hierarchical temporal memory (HTM) network classifies the extracted features by exploiting their variation in both space and time. The advantage of using HTM for classification is that it can handle not only static actions but also dynamic signals that vary in both space and time, achieving high accuracy on dynamic action detection. The proposed approach is evaluated on real eating and drinking actions recorded with a 3-D accelerometer, and the experimental results show that the EKF- and HTM-based method recognizes these actions with very high accuracy.
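The abstract describes extracting Euler-angle features from raw accelerometer data with an extended Kalman filter. The sketch below is a minimal illustration of that idea, not the paper's actual filter: it tracks roll and pitch under an assumed random-walk process model, using the normalized 3-axis accelerometer reading (interpreted as the gravity direction) as the nonlinear measurement. All class and function names, and the noise parameters `q` and `r`, are hypothetical.

```python
import numpy as np

def h(x):
    """Map Euler angles (roll, pitch) to the unit gravity vector
    expected in the sensor frame (yaw is unobservable from gravity)."""
    roll, pitch = x
    return np.array([
        -np.sin(pitch),
        np.sin(roll) * np.cos(pitch),
        np.cos(roll) * np.cos(pitch),
    ])

def H_jacobian(x):
    """Jacobian of h with respect to (roll, pitch)."""
    roll, pitch = x
    return np.array([
        [0.0,                           -np.cos(pitch)],
        [ np.cos(roll) * np.cos(pitch), -np.sin(roll) * np.sin(pitch)],
        [-np.sin(roll) * np.cos(pitch), -np.cos(roll) * np.sin(pitch)],
    ])

class EulerEKF:
    """Toy EKF tracking (roll, pitch) with a random-walk process model
    and a normalized 3-axis accelerometer reading as the measurement."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(2)      # state: roll, pitch (rad)
        self.P = np.eye(2)        # state covariance
        self.Q = q * np.eye(2)    # process noise (random walk)
        self.R = r * np.eye(3)    # measurement noise

    def step(self, accel):
        # Predict: random walk, so the state mean is unchanged.
        self.P = self.P + self.Q
        # Update with the normalized accelerometer reading.
        z = accel / np.linalg.norm(accel)
        Hj = H_jacobian(self.x)
        y = z - h(self.x)                      # innovation
        S = Hj @ self.P @ Hj.T + self.R        # innovation covariance
        K = self.P @ Hj.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ Hj) @ self.P
        return self.x.copy()
```

In a pipeline of the kind the abstract outlines, the per-sample (roll, pitch) estimates returned by `step` would form the feature stream that the downstream classifier (HTM in the paper) consumes.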