In this paper, we discuss why, in designing multiparty mediated systems, we should focus first on providing non-verbal cues which are less redundantly coded in speech than those normally conveyed by video. We show how conveying one such cue, gaze direction, may solve two problems in multiparty mediated communication and collaboration: knowing who is talking to …
In this paper, we present an attentive windowing technique that uses eye tracking, rather than manual pointing, for focus window selection. We evaluated the performance of four focus selection techniques: eye tracking with key activation, eye tracking with automatic activation, mouse, and hotkeys, in a typing task with many open windows. We also evaluated a …
In this paper, we present TeleHuman, a cylindrical 3D display portal for life-size human telepresence. The TeleHuman 3D videoconferencing system supports 360-degree motion parallax as the viewer moves around the cylinder and, optionally, stereoscopic 3D display of the remote person. We evaluated the effect of perspective cues on the conveyance of nonverbal …
In multi-agent, multi-user environments, users as well as agents should have a means of establishing who is talking to whom. In this paper, we present an experiment aimed at evaluating whether gaze directional cues of users could be used for this purpose. Using an eye tracker, we measured subject gaze at the faces of conversational partners during …
In this paper, we propose a tentative framework for the classification of Attentive Interfaces, a new category of user interfaces. An Attentive Interface is a user interface that dynamically prioritizes the information it presents to its users, such that information processing resources of both user and system are optimally distributed across a set of …
One of the problems with notification appliances is that they can be distracting when providing information not of immediate interest to the user. In this paper, we present AuraOrb, an ambient notification appliance that deploys progressive turn-taking techniques to minimize notification disruptions. AuraOrb uses eye contact sensing to detect user interest …
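The progressive turn-taking idea behind AuraOrb can be sketched as a small escalation state machine; the concrete states and method names below are illustrative assumptions, not AuraOrb's actual implementation. The key property is that a notification only escalates from an ambient cue to richer detail when eye contact signals user interest, and drops back when the user looks away.

```python
class ProgressiveNotifier:
    """Minimal sketch of progressive turn-taking for an ambient
    notification appliance (hypothetical states, not AuraOrb's code)."""

    # Escalation levels: no cue -> ambient glow -> summary -> full detail
    STATES = ("idle", "ambient", "summary", "detail")

    def __init__(self):
        self.state = "idle"

    def notify(self):
        """A new notification arrives: show only a peripheral ambient cue."""
        if self.state == "idle":
            self.state = "ambient"
        return self.state

    def on_eye_contact(self):
        """Eye contact detected: the user is interested, escalate one level."""
        if self.state == "ambient":
            self.state = "summary"
        elif self.state == "summary":
            self.state = "detail"
        return self.state

    def on_look_away(self):
        """User looks away: de-escalate back to the unobtrusive ambient cue."""
        if self.state in ("summary", "detail"):
            self.state = "ambient"
        return self.state
```

Because every escalation step requires an explicit interest signal, the appliance never interrupts with detail the user did not implicitly request, which is the disruption-minimizing property the abstract describes.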
In this paper, we present Paper Windows, a prototype windowing environment that simulates the use of digital paper displays. By projecting windows on physical paper, Paper Windows allows the physical affordances of paper to be captured in a digital world. The system uses paper as an input device by tracking its motion and shape with a Vicon Motion Capturing …
Flexible displays potentially allow for interaction styles that resemble those used in paper documents. Bending the display, e.g., to page forward, shows particular promise as an interaction technique. In this paper, we present an evaluation of the effectiveness of various bend gestures in executing a set of tasks with a flexible display. We discuss a study …
This paper introduces DisplayObjects, a rapid prototyping workbench that allows functional interfaces to be projected onto real 3D physical prototypes. DisplayObjects uses a Vicon motion capture system to track the location of physical models. 3D software renditions of the physical model are then texture-mapped with interactive behavior and projected …