In computer animation and interactive computer games, gesture and speech modalities can form a powerful interface between humans and computers. In this paper, we propose a personal digital assistant (PDA)-based multi-modal network game interface using speech, gesture, and touch sensations. To verify the validity of our approach, we implement a multi-modal omok …
This paper describes a PDA-based MMCR (Multi-Modal Command Recognizer) for PDA control and handling, which uses double-touching with a finger and couples an embedded speech recognizer with a KSSL (Korean Standard Sign Language) recognizer. It also suggests an improved synchronization method between modalities for simultaneous multi-modal input, aimed at a new pattern recognition-based multi-modal HCI. The MMCR fuses …
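As a rough illustration of the kind of timestamp-based synchronization such a recognizer needs, the sketch below pairs speech and gesture hypotheses that arrive within a short time window. The class names, window length, and fusion rule are illustrative assumptions, not the MMCR described in the paper.

```python
# Minimal sketch of timestamp-based synchronization between two modality
# recognizers (speech and KSSL gesture). Names and the 1.5 s window are
# assumptions for illustration, not the paper's actual implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModalityResult:
    modality: str      # "speech" or "kssl"
    label: str         # recognized command word or sign
    timestamp: float   # seconds since session start

class SimpleFuser:
    """Pairs speech and gesture hypotheses that arrive within a time window."""
    def __init__(self, window_s: float = 1.5):
        self.window_s = window_s
        self.pending: Optional[ModalityResult] = None

    def push(self, result: ModalityResult) -> Optional[tuple]:
        # If a result from the other modality is pending and close in time,
        # fuse the pair into a single multi-modal command.
        if self.pending and self.pending.modality != result.modality:
            if abs(result.timestamp - self.pending.timestamp) <= self.window_s:
                fused = (self.pending.label, result.label)
                self.pending = None
                return fused
        # Otherwise remember the newest result and wait for its partner.
        self.pending = result
        return None

fuser = SimpleFuser()
fuser.push(ModalityResult("speech", "open", 10.2))
print(fuser.push(ModalityResult("kssl", "menu", 10.9)))  # ('open', 'menu')
```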
There have been many recent studies of gaze direction recognition systems in the field of HCI (Human-Computer Interaction). Such a system can provide a natural and intuitive interface because it relies on gaze direction or biomedical signals. This paper proposes a multi-modal gaze direction recognition system using nine directional gaze …
In this work, we improved the performance of an automatic system for extracting leaf contours. The proposed leaf contour extraction method consists of three major procedures: leaf detection, the detection of four edge points, and contour tracing. Leaf detection includes two stages: feature extraction and matching. For the leaf contour extraction part, we present a new …
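The pipeline outlined in the abstract can be approximated with standard tools. Below is a minimal Python sketch using OpenCV (an assumption; the paper does not specify its tooling) that segments a synthetic leaf by thresholding, traces its outer contour, and picks four extreme edge points. Thresholding here merely stands in for the paper's leaf-detection stage of feature extraction and matching.

```python
# Minimal sketch, assuming OpenCV: segment a leaf-like region, trace its
# contour, and locate four edge points (left/right/top/bottom extremes).
import cv2
import numpy as np

# Synthetic "leaf": a bright ellipse on a dark background.
img = np.zeros((240, 320), dtype=np.uint8)
cv2.ellipse(img, (160, 120), (90, 50), 30, 0, 360, 255, -1)

# 1) Leaf detection (simplified): binarize to isolate the leaf region.
_, mask = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)

# 2) Contour tracing: take the largest external contour as the leaf boundary.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
leaf = max(contours, key=cv2.contourArea)

# 3) Four edge points: extreme left, right, top, and bottom contour points.
left   = tuple(leaf[leaf[:, :, 0].argmin()][0])
right  = tuple(leaf[leaf[:, :, 0].argmax()][0])
top    = tuple(leaf[leaf[:, :, 1].argmin()][0])
bottom = tuple(leaf[leaf[:, :, 1].argmax()][0])
print("edge points:", left, right, top, bottom)
```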
There have been many recent studies of gaze recognition systems in the field of HCI (Human-Computer Interaction). Such a system can provide a natural and intuitive interface because it relies on gaze direction or biomedical signals. We propose a multi-modal user interface system using nine-directional gaze recognition based on image, EOG (electrooculogram) …
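One simple way to obtain nine gaze directions from EOG is to quantize baseline-corrected horizontal and vertical channel amplitudes into three levels each. The sketch below does exactly that; the threshold value and channel conventions are assumptions for illustration, not the method proposed in the paper.

```python
# Minimal sketch, assuming two-channel EOG amplitudes (horizontal, vertical)
# that are already baseline-corrected; the 50 uV threshold is illustrative.
# Each channel is quantized into three levels, giving nine gaze directions.
def gaze_direction(h_uV: float, v_uV: float, thresh_uV: float = 50.0) -> str:
    horiz = "left" if h_uV < -thresh_uV else "right" if h_uV > thresh_uV else "center"
    vert  = "down" if v_uV < -thresh_uV else "up"    if v_uV > thresh_uV else "center"
    if horiz == "center" and vert == "center":
        return "center"
    if horiz == "center":
        return vert
    if vert == "center":
        return horiz
    return f"{vert}-{horiz}"   # e.g. "up-left"

print(gaze_direction(80.0, 60.0))   # up-right
print(gaze_direction(-10.0, 0.0))   # center
```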