A Distributed Wearable System Based on Multimodal Fusion

Abstract

A wearable computer can be worn at any time, with support for unrestricted communication and a variety of services, providing maximum capability for information use. Key challenges in developing such wearable computers are a level of comfort such that users do not notice what they are wearing, an easy and intuitive user interface, and effective power management. This paper proposes a wearable system consisting of a wristwatch-type gesture recognition device and a personal mobile gateway into which functional input/output modules can be freely plugged in and removed. We describe the techniques implemented during the development of our wearable system: 1) a multimodal fusion engine that recognizes voice and gesture simultaneously, 2) a power management technique, and 3) a gesture recognition engine. Finally, we evaluate the performance of our multimodal fusion engine and present power consumption measurements of our system built with the power management technique.

DOI: 10.1007/978-3-540-72685-2_35

Cite this paper

@inproceedings{Cho2007ADW,
  title     = {A Distributed Wearable System Based on Multimodal Fusion},
  author    = {Ilyeon Cho and John B. Sunwoo and Hyun-Tae Jeong and Yong-Ki Son and Hee-Joong Ahn and Dong-Woo Lee and Dong-Won Han and Cheol-Hoon Lee},
  booktitle = {ICESS},
  year      = {2007}
}