We present a multimodal media center interface designed for blind and partially sighted people. It features a zooming focus-plus-context graphical user interface coupled with speech output and haptic feedback. A multimodal combination of gestures, key input, and speech input is used to interact with the interface. The interface has been developed and …
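The combination of gestures, key input, and speech input described above can be pictured as a single dispatcher that maps events from any modality onto one shared set of interface commands. The Python sketch below is only an illustration under that assumption; the class, event, and command names are hypothetical and do not come from the paper.

    # Minimal sketch of multimodal input fusion for a focus-plus-context UI.
    # All modality/event/command names are hypothetical placeholders.
    from dataclasses import dataclass
    from typing import Callable, Dict, Tuple

    @dataclass(frozen=True)
    class InputEvent:
        modality: str   # "gesture", "key", or "speech"
        value: str      # e.g. "tap", "KEY_OK", "select"

    class MultimodalDispatcher:
        """Maps events from any modality onto one shared set of UI commands."""

        def __init__(self) -> None:
            self._bindings: Dict[Tuple[str, str], str] = {}
            self._handlers: Dict[str, Callable[[], None]] = {}

        def bind(self, modality: str, value: str, command: str) -> None:
            self._bindings[(modality, value)] = command

        def on_command(self, command: str, handler: Callable[[], None]) -> None:
            self._handlers[command] = handler

        def dispatch(self, event: InputEvent) -> None:
            command = self._bindings.get((event.modality, event.value))
            if command in self._handlers:
                self._handlers[command]()

    if __name__ == "__main__":
        ui = MultimodalDispatcher()
        # The same "select" command is reachable by gesture, key, or speech.
        for modality, value in [("gesture", "tap"), ("key", "KEY_OK"), ("speech", "select")]:
            ui.bind(modality, value, "select")
        ui.on_command("select", lambda: print("focused item activated; speech output announces it"))
        ui.dispatch(InputEvent("speech", "select"))
        ui.dispatch(InputEvent("gesture", "tap"))

Routing every modality through one command table keeps speech output and haptic feedback consistent regardless of how a command was issued.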
The home environment is an exciting application domain for multimodal mobile interfaces. Instead of multiple remote controls, personal mobile devices could be used to operate home entertainment systems. This paper reports a subjective evaluation of multimodal inputs and outputs for controlling a home media center using a mobile phone. A within-subject …
We present a multimodal media center interface based on speech input, gestures, and haptic feedback (hapticons). In addition, the application includes a zoomable focus-plus-context GUI tightly coupled with speech output. The resulting interface is designed for and evaluated with different user groups, including visually and physically impaired users.
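One way to read the tight coupling of the zoomable GUI, speech output, and hapticons is that every focus change produces both a spoken label and a short vibration pattern. The sketch below assumes that reading only; the pattern values and the speak()/vibrate() backends are placeholders, not the system's actual API.

    # Hedged sketch: pairing GUI focus changes with speech output and hapticons.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Hapticon:
        name: str
        pattern_ms: List[int]   # alternating vibrate/pause durations (assumed encoding)

    HAPTICONS: Dict[str, Hapticon] = {
        "focus_moved": Hapticon("focus_moved", [40]),
        "boundary":    Hapticon("boundary", [80, 60, 80]),
    }

    def speak(text: str) -> None:
        # Stand-in for a text-to-speech backend.
        print(f"[TTS] {text}")

    def vibrate(hapticon: Hapticon) -> None:
        # Stand-in for a mobile vibration API.
        print(f"[HAPTIC] {hapticon.name}: {hapticon.pattern_ms}")

    def on_focus_change(item_label: str, at_list_boundary: bool) -> None:
        """When the zoomed focus moves, confirm it in both touch and speech."""
        vibrate(HAPTICONS["boundary" if at_list_boundary else "focus_moved"])
        speak(item_label)

    if __name__ == "__main__":
        on_focus_change("Movies", at_list_boundary=False)
        on_focus_change("Settings", at_list_boundary=True)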
In this paper, we present results from a long-term user pilot study of a speech-controlled media center. The pilot users were physically disabled, and the system was installed in their apartment for six weeks. We designed a multimodal media center interface based on speech. Full speech control is provided by a hands-free speech recognition input …
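The phrase "hands-free speech recognition input" suggests a continuously listening recognizer constrained to a small command set, so no button press is needed between utterances. The sketch below illustrates that idea only; the grammar entries and the recognize() stand-in are invented for the example and merely simulate recognition results.

    # Illustrative hands-free control loop over an assumed fixed command grammar.
    import random
    from typing import Optional

    GRAMMAR = {
        "watch television": "tv.open",
        "next channel": "tv.channel_up",
        "previous channel": "tv.channel_down",
        "show recordings": "recordings.list",
        "stop": "playback.stop",
    }

    def recognize() -> Optional[str]:
        # Stand-in for a continuously listening, grammar-constrained recognizer;
        # here a recognition result (or silence) is simply simulated.
        return random.choice(list(GRAMMAR) + [None])

    def execute(action: str) -> None:
        print(f"media center action: {action}")

    def control_loop(turns: int = 5) -> None:
        # Hands-free: the loop never waits for a key press between utterances.
        for _ in range(turns):
            phrase = recognize()
            if phrase is not None:
                execute(GRAMMAR[phrase])

    if __name__ == "__main__":
        control_loop()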
We demonstrate interaction with a multimodal media center application. The mobile phone-based interface includes speech and gesture input and haptic feedback. The setup resembles our long-term public pilot study, in which a living room environment containing the application was constructed inside a local media museum, allowing visitors to freely test the system.