Siyi Zheng

Traditional intelligent CAI (Computer-Aided Instruction) pays little attention to emotional cognition. To address this problem, an emotional pedagogical system based on artificial psychology and Agent theories is presented. The student's emotion is treated as one of the secondary indices in a two-layer fuzzy comprehensive assessment model used to assess study …
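
As an illustration of the assessment step, the sketch below implements a generic two-layer fuzzy comprehensive assessment in Python, with emotion appearing as one secondary index under an affective factor. The factor names, weights, and membership values are hypothetical assumptions for the example, not values taken from the paper.

```python
# A minimal sketch of a two-layer fuzzy comprehensive assessment, assuming
# hypothetical factors, weights, and membership values (none from the paper).
import numpy as np

GRADES = ["excellent", "good", "fair", "poor"]

# Layer 2: membership of each secondary index in the evaluation grades.
# Rows: secondary indices; columns: grades. Values are illustrative.
knowledge_R = np.array([
    [0.5, 0.3, 0.2, 0.0],   # test accuracy
    [0.3, 0.4, 0.2, 0.1],   # response time
])
affect_R = np.array([
    [0.2, 0.5, 0.2, 0.1],   # recognized emotion (e.g., from facial expression)
    [0.4, 0.3, 0.2, 0.1],   # engagement level
])

# Weights of secondary indices within each primary factor (each sums to 1).
knowledge_w = np.array([0.6, 0.4])
affect_w = np.array([0.5, 0.5])

# Layer-2 aggregation with the weighted-average operator: B_i = w_i . R_i
B_knowledge = knowledge_w @ knowledge_R
B_affect = affect_w @ affect_R

# Layer 1: weights of the primary factors, then final aggregation.
primary_w = np.array([0.7, 0.3])          # knowledge vs. affective state
B = primary_w @ np.vstack([B_knowledge, B_affect])

print(dict(zip(GRADES, B.round(3))))      # fuzzy membership per grade
print("assessment:", GRADES[int(np.argmax(B))])
```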
At present, it is urgent to protect juveniles from the negative influence of the internet. This paper addresses automated juvenile detection based on facial appearance, serving as a gateway that controls access to adult websites. To describe facial appearance, local binary patterns (LBP) are applied to extract local texture from three main facial …
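
The sketch below shows one way to compute basic LBP histograms over facial regions in Python. The cropped face image and the three region bounding boxes are hypothetical placeholders; the paper's exact regions and downstream classifier are not reproduced.

```python
# A minimal sketch of LBP texture extraction from facial regions, assuming a
# pre-cropped grayscale face and hypothetical (eyes, nose, mouth) boxes.
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour LBP code for each interior pixel."""
    c = gray[1:-1, 1:-1]
    codes = np.zeros(c.shape, dtype=np.int32)
    # Offsets of the 8 neighbours, ordered clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = gray[1 + dy:gray.shape[0] - 1 + dy,
                         1 + dx:gray.shape[1] - 1 + dx]
        codes += (neighbour >= c).astype(np.int32) << bit
    return codes

def region_histogram(gray, box, bins=256):
    """Normalized LBP histogram of one facial region (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = box
    codes = lbp_image(gray[y0:y1, x0:x1])
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)

# Hypothetical usage: concatenate the regional histograms into one feature
# vector that an age/juvenile classifier would consume.
face = (np.random.rand(128, 128) * 255).astype(np.uint8)   # placeholder image
regions = {"eyes": (20, 60, 10, 118), "nose": (55, 90, 40, 88),
           "mouth": (85, 120, 30, 98)}
feature = np.concatenate([region_histogram(face, b) for b in regions.values()])
```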
This emotion-based intelligent tutoring system for English instruction is designed to compensate for the lack of emotional factors in the cognitive abilities of traditional intelligent tutoring systems. Microsoft Agent is used as the interactive platform, and audio and image information serve as the interaction channels. The result of emotional face image recognition is …
With the rapid development of sensing, navigation, control, and artificial intelligence, human society has inevitably entered the era of intelligent driving, which may release us from low-level, tedious driving activities and thereby create an ultimate driving experience. Among the many topics in intelligent driving, software architecture is …
To enable personalized natural interaction in service robots, artificial emotion is needed so that robots can appear as individuals. In the emotion modeling theory of the emotional Markov chain model (eMCM) for spontaneous transfer and the emotional hidden Markov model (eHMM) for stimulated transfer, there are three problems: 1) emotion distinguishing …
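
For illustration, the sketch below simulates discrete emotion-state transitions in the spirit of spontaneous (eMCM-like) versus stimulus-driven transfer. The emotion set, transition matrices, and stimuli are assumptions made for this example and are not the models or parameters used in the paper.

```python
# A minimal sketch of emotion-state transitions: one Markov matrix for
# spontaneous transfer, and stimulus-conditioned matrices for stimulated
# transfer. All values are illustrative assumptions.
import numpy as np

EMOTIONS = ["happy", "calm", "sad", "angry"]

# Spontaneous transfer: emotions drift on their own (row-stochastic matrix).
SPONTANEOUS = np.array([
    [0.70, 0.20, 0.05, 0.05],
    [0.10, 0.80, 0.05, 0.05],
    [0.05, 0.25, 0.60, 0.10],
    [0.05, 0.20, 0.15, 0.60],
])

# Stimulated transfer: a different matrix per external stimulus (hypothetical).
STIMULATED = {
    "praise":   np.array([[0.9, 0.1, 0.0, 0.0],
                          [0.6, 0.35, 0.05, 0.0],
                          [0.4, 0.4, 0.2, 0.0],
                          [0.3, 0.4, 0.1, 0.2]]),
    "obstacle": np.array([[0.3, 0.4, 0.1, 0.2],
                          [0.1, 0.4, 0.2, 0.3],
                          [0.0, 0.1, 0.5, 0.4],
                          [0.0, 0.1, 0.2, 0.7]]),
}

def step(state, stimulus, rng):
    """One emotion transition: spontaneous if no stimulus, else stimulated."""
    matrix = SPONTANEOUS if stimulus is None else STIMULATED[stimulus]
    return rng.choice(len(EMOTIONS), p=matrix[state])

rng = np.random.default_rng(0)
state = EMOTIONS.index("calm")
for stimulus in [None, None, "praise", None, "obstacle", None]:
    state = step(state, stimulus, rng)
    print(stimulus, "->", EMOTIONS[state])
```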
This paper addresses the problem of automated analysis of gaze states (fixation or motion). First, we present a novel dynamic, multi-level analysis structure for detecting gaze states based on eye action units (AUs). The eye action units are then analyzed in both the spatial and temporal domains, and finally these eye AUs form the input to infer eye …
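
The paper's AU-based, multi-level structure is not reproduced here; as a much simpler stand-in, the sketch below labels fixation versus motion on a gaze trace with a plain velocity threshold (I-VT style). The sampling rate, threshold value, and synthetic trace are all assumptions.

```python
# A minimal fixation/motion labelling sketch using a velocity threshold, as a
# simple stand-in for the AU-based analysis described in the paper.
import numpy as np

def label_gaze_states(points, fs=60.0, vel_threshold=50.0):
    """points: (N, 2) gaze coordinates in degrees; fs: samples per second.
    Returns one 'fixation'/'motion' label per inter-sample interval."""
    velocities = np.linalg.norm(np.diff(points, axis=0), axis=1) * fs  # deg/s
    return ["motion" if v > vel_threshold else "fixation" for v in velocities]

# Hypothetical trace: a steady fixation followed by a rapid gaze shift.
rng = np.random.default_rng(1)
trace = np.vstack([
    np.full((10, 2), [5.0, 5.0]) + rng.normal(0, 0.05, (10, 2)),
    np.linspace([5.0, 5.0], [15.0, 5.0], 5),
])
print(label_gaze_states(trace))
```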