In multi-agent, multi-user environments, users as well as agents should have a means of establishing who is talking to whom. In this paper, we present an experiment aimed at evaluating whether gaze directional cues of users could be used for this purpose. Using an eye tracker, we measured subject gaze at the faces of conversational partners during…
In this paper, we discuss why, in designing multiparty mediated systems, we should focus first on providing non-verbal cues which are less redundantly coded in speech than those normally conveyed by video. We show how conveying one such cue, gaze direction, may solve two problems in multiparty mediated communication and collaboration: knowing who is talking to…
In this paper, we present an attentive windowing technique that uses eye tracking, rather than manual pointing, for focus window selection. We evaluated the performance of four focus selection techniques: eye tracking with key activation, eye tracking with automatic activation, mouse, and hotkeys in a typing task with many open windows. We also evaluated a…
This paper introduces DisplayObjects, a rapid prototyping workbench that allows functional interfaces to be projected onto real 3D physical prototypes. DisplayObjects uses a Vicon motion capture system to track the location of physical models. Software renditions of the 3D physical model are then texture-mapped with interactive behavior and projected…
In this paper, we propose a tentative framework for the classification of Attentive Interfaces, a new category of user interfaces. An Attentive Interface is a user interface that dynamically prioritizes the information it presents to its users, such that information processing resources of both user and system are optimally distributed across a set of…
In this paper, we present TeleHuman, a cylindrical 3D display portal for life-size human telepresence. The TeleHuman 3D videoconferencing system supports 360-degree motion parallax as the viewer moves around the cylinder and, optionally, stereoscopic 3D display of the remote person. We evaluated the effect of perspective cues on the conveyance of nonverbal…
We present ECSGlasses: wearable eye contact sensing glasses that detect human eye contact. ECSGlasses report eye contact to digital devices, appliances, and EyePliances in the user's "attention space". Devices use this attentional cue to engage in a more sociable process of turn taking with users. This has the potential to reduce inappropriate…
One of the problems with notification appliances is that they can be distracting when providing information not of immediate interest to the user. In this paper, we present AuraOrb, an ambient notification appliance that deploys progressive turn taking techniques to minimize notification disruptions. AuraOrb uses eye contact sensing to detect user interest…