A Distributed Wearable System Based on Multimodal Fusion

Abstract

A wearable computer can be worn at any time, with support for unrestricted communication and a variety of services that maximize the usability of information. Key challenges in developing such wearable computers are a level of comfort such that users do not notice what they wear, an easy and intuitive user interface, and effective power management. This paper proposes a wearable system consisting of a wristwatch-type gesture recognition device and a personal mobile gateway into which functional input/output modules can be freely plugged in and removed. We describe the techniques implemented during the development of our wearable system: 1) a multimodal fusion engine that recognizes voice and gesture simultaneously, 2) a power management technique, and 3) a gesture recognition engine. Finally, we evaluate the performance of our multimodal fusion engine and present power consumption measurements of our system built with the power management technique.
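One common way to realize a multimodal fusion engine of the kind the abstract describes is decision-level (late) fusion, where each modality's recognizer emits per-command confidence scores and the fusion stage combines them. The sketch below is a minimal illustration of that idea only; the function names, command labels, and modality weights are assumptions for this example, not details from the paper.

```python
def fuse(voice_scores, gesture_scores, w_voice=0.6, w_gesture=0.4):
    """Combine per-command confidences from a voice recognizer and a
    gesture recognizer using fixed modality weights (illustrative values).
    Returns the best command and the full fused score table."""
    commands = set(voice_scores) | set(gesture_scores)
    fused = {
        cmd: w_voice * voice_scores.get(cmd, 0.0)
             + w_gesture * gesture_scores.get(cmd, 0.0)
        for cmd in commands
    }
    # Pick the command with the highest fused confidence.
    return max(fused, key=fused.get), fused

best, scores = fuse(
    {"next": 0.7, "stop": 0.2},    # hypothetical voice recognizer output
    {"next": 0.5, "select": 0.4},  # hypothetical gesture recognizer output
)
print(best)  # → next
```

Because both modalities agree on "next", its fused score (0.6*0.7 + 0.4*0.5 = 0.62) dominates, which is the intuition behind recognizing voice and gesture simultaneously rather than in isolation.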

DOI: 10.1007/978-3-540-72685-2_35
