Augmented, Mixed, and Virtual Reality Enabling of Robot Deixis

@inproceedings{Williams2018AugmentedMA,
  title={Augmented, Mixed, and Virtual Reality Enabling of Robot Deixis},
  author={T. Williams and Nhan Tran and Josh Rands and Neil T. Dantam},
  booktitle={HCI},
  year={2018}
}
When humans interact with each other, they often make use of deictic gestures such as pointing to help pick out targets of interest to their conversation. In the field of Human-Robot Interaction, research has repeatedly demonstrated the utility of enabling robots to use such gestures as well. Recent work in augmented, mixed, and virtual reality stands to enable enormous advances in robot deixis, both by allowing robots to gesture in ways that were not previously feasible, and by enabling… 
Mixed Reality Deictic Gesture for Multi-Modal Robot Communication
TLDR
This paper investigates human perception of videos simulating the display of allocentric gestures, in which robots circle their targets in users' fields of view, and suggests that this is an effective communication strategy, both in terms of objective accuracy and subjective perception.
Towards an Understanding of Physical vs Virtual Robot Appendage Design
Augmented Reality (AR) or Mixed Reality (MR) enables innovative interactions by overlaying virtual imagery over the physical world. For roboticists, this creates new opportunities to apply proven…
Using Augmented Reality to Better Study Human-Robot Interaction
TLDR
This paper discusses how Augmented Reality can be used to address these concerns while increasing researchers' level of experimental control, and weighs both advantages and disadvantages of the approach.
Augmented Reality for Human-Robot Teaming in Field Environments
TLDR
It is argued that current AR technology, combined with novel approaches, can enable successful teaming in such challenging, real-world settings, and a set of prototypes is presented that combine AR with an intelligent, autonomous robot to enable better human-robot teaming in field environments.
Investigating the Potential Effectiveness of Allocentric Mixed Reality Deictic Gesture
TLDR
This work experimentally investigates the potential utility of allocentric gestures, in which circles or arrows are rendered to enable human teammates to pick out the robot's target referents, and suggests that allocentric gestures should be used to complement rather than replace complex referring expressions.
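To make the rendering step concrete: drawing such a circle requires projecting the 3D target into the user's 2D field of view. The following is a minimal sketch of that projection under a standard pinhole camera model; it is not code from any of the papers listed here, and the intrinsics, pose, and target position are invented placeholder values.

import numpy as np

def project_to_view(p_world, K, R, t):
    """Project a 3D point (world frame) into 2D pixel coordinates
    using a standard pinhole camera model: u ~ K (R p + t)."""
    p_cam = R @ p_world + t          # world frame -> camera frame
    if p_cam[2] <= 0:                # behind the viewer: nothing to draw
        return None
    uv = K @ p_cam
    return uv[:2] / uv[2]            # perspective divide

# Placeholder intrinsics and pose (assumed values, not from the papers).
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

target = np.array([0.4, -0.1, 2.0])  # hypothetical referent, 2 m ahead
center = project_to_view(target, K, R, t)
if center is not None:
    # A headset renderer would draw the circle; here we just report it.
    print(f"draw circle at pixel {center.round(1)}, radius ~ 40 px")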
Exploring Interaction Design Considerations for Trustworthy Language-Capable Robotic Wheelchairs in Virtual Reality
TLDR
This paper presents the design of an experiment, conducted within a Virtual Reality (VR) environment, examining the importance of proactive communication by robotic wheelchairs as compared to non-vehicular mobile robots.
Dynamic Path Visualization for Human-Robot Collaboration
TLDR
A dynamic path visualizer is developed that projects the robot's motion intent at varying lengths depending on the complexity of the upcoming path, and a study reveals participants' preference for visuals that show longer path projections.
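The "complexity of the upcoming path" could plausibly be operationalized as accumulated turning along the next few waypoints, shortening the preview when the path bends sharply. The sketch below is an illustrative guess at such a policy, not the visualizer described in the paper; the function names and the scaling rule are invented.

import numpy as np

def turning_angles(path):
    """Total absolute heading change (radians) over a 2D waypoint path."""
    d = np.diff(np.asarray(path, dtype=float), axis=0)
    headings = np.arctan2(d[:, 1], d[:, 0])
    return float(np.abs(np.diff(headings)).sum())

def preview_length(path, max_len=20, min_len=5):
    """Show fewer upcoming waypoints when the path ahead is twisty."""
    complexity = turning_angles(path[:max_len])
    # Invented policy: scale the preview horizon down as turning grows.
    frac = 1.0 / (1.0 + complexity)
    return max(min_len, int(max_len * frac))

# Straight run followed by a diagonal turn (toy data).
path = [(x, 0.0) for x in range(10)] + [(9 + x, x) for x in range(1, 10)]
print("waypoints to visualize:", preview_length(path))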
Toward Allocentric Mixed-Reality Deictic Gesture
TLDR
Through a second experiment that addresses potential confounds from the original study, this paper provides additional evidence for the hypothesis that robots that use physical deictic gestures such as pointing enable more effective and natural interaction.
Mixed Reality as a Bidirectional Communication Interface for Human-Robot Interaction
TLDR
The Physio-Virtual Deixis Partially Observable Markov Decision Process (PVD-POMDP) is proposed, which interprets multimodal observations from the human and decides when and how to ask questions in order to recover from failure states and cope with sensor noise.
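For readers unfamiliar with the underlying machinery: at the core of any discrete POMDP, presumably including the PVD-POMDP, is a Bayesian belief update over hidden states after taking an action a and receiving an observation o. Below is a generic sketch with toy transition and observation tables; the numbers are invented and this is not the paper's actual model.

import numpy as np

def belief_update(b, a, o, T, O):
    """Generic discrete POMDP belief update:
    b'(s') ∝ O[a, s', o] * sum_s T[a, s, s'] * b(s)."""
    predicted = b @ T[a]           # predict: sum_s b(s) T(s'|s,a)
    updated = O[a, :, o] * predicted   # correct with the observation
    return updated / updated.sum()     # renormalize

# Toy model: 2 hidden states, 1 action, 2 observations (invented numbers).
T = np.array([[[0.9, 0.1],
               [0.2, 0.8]]])       # T[a, s, s']
O = np.array([[[0.8, 0.2],
               [0.3, 0.7]]])       # O[a, s', o]

b = np.array([0.5, 0.5])           # uniform prior over hidden states
b = belief_update(b, a=0, o=1, T=T, O=O)
print("posterior belief:", b.round(3))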
Exploring Augmented Reality Interaction for Everyday Multipurpose Wearable Robots
TLDR
This work presents a framework for integrating augmented reality (AR) and multipurpose wearable robots, which uses the publisher-subscriber model to expose different robot functionalities as services on a network to be invoked by the AR system.
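The publisher-subscriber model mentioned here is the same pattern underlying ROS topics. As a self-contained illustration of exposing robot functionalities as named topics that an AR client can invoke, here is a minimal in-process sketch; the broker class, topic name, and handler are all hypothetical, not the paper's framework.

from collections import defaultdict

class Broker:
    """Minimal in-process publish/subscribe broker (illustrative only)."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for callback in self._subs[topic]:
            callback(msg)

broker = Broker()

# Robot side: expose a capability as a topic (hypothetical name/handler).
broker.subscribe("/robot/gripper/close", lambda msg: print("closing gripper:", msg))

# AR side: the headset UI invokes the capability by publishing to it.
broker.publish("/robot/gripper/close", {"force": 0.5})

In a networked deployment the same decoupling lets the AR system invoke robot services without either side holding a direct reference to the other, which is presumably why the authors chose the pattern.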
...

References

Showing 1-10 of 77 references
A Framework for Robot-Generated Mixed-Reality Deixis
TLDR
This paper presents a conceptual framework for categorizing the types of mixed-reality deictic gestures that robots may generate in human-robot interaction scenarios, and analyzes how these categories differ along a variety of dimensions.
Communicating Robot Motion Intent with Augmented Reality
TLDR
A new design space for communicating robot motion intent is explored by investigating how augmented reality (AR) might mediate human-robot interactions and developing a series of explicit and implicit designs for visually signaling robot motion intent using AR.
Improving Collocated Robot Teleoperation with Augmented Reality
TLDR
This work explores how advances in augmented reality (AR) technologies create a new design space for mediating robot teleoperation through novel forms of intuitive visual feedback, and demonstrates several objective and subjective performance benefits over existing systems.
A Hands-Free Virtual-Reality Teleoperation Interface for Wizard-of-Oz Control
TLDR
This paper proposes a WoZ teleoperation interface that pairs a VR display with technologies for hands-free robot control in order to address the challenges of Wizard-of-Oz control while providing an immersive VR experience for robot teleoperators.
Visual Hints for Tangible Gestures in Augmented Reality
TLDR
This work investigates a variety of AR representations of visual hints about potential actions and their consequences in the augmented physical world, and describes a specific implementation that supports gestures developed for a tangible AR user interface to an electronic field guide for botanists.
The implementation of augmented reality in a robotic teleoperation system
TLDR
This project extended the working range of a Wheelchair-Mounted Robotic Manipulator by making the system teleoperable, and enhanced its efficiency by providing a more natural and intuitive method of manipulation.
TouchMe: An Augmented Reality Based Remote Robot Manipulation
TLDR
The TouchMe system allows the user to manipulate each part of the robot by directly touching it on a view of the world as seen by a camera observing the robot from a third-person perspective, which provides intuitive operation.
Mobile Mixed-Reality Interfaces That Enhance Human–Robot Interaction in Shared Spaces
TLDR
Results indicate that, despite the suitability of the conventional approaches in remote applications, the proposed interface approach provides comparable task performance and user experiences in shared spaces without the need to install operator stations or vision systems on or around the robot.
Human Robot Collaboration: An Augmented Reality Approach—A Literature Review and Analysis
Future space exploration will demand the cultivation of human-robotic systems; however, little attention has been paid to the development of human-robot teams. Current methods for autonomous plan…
Mediating Human-Robot Collaboration through Mixed Reality Cues
TLDR
It was found that projecting visual cues enabled human subjects to collaborate more effectively with the robot and resulted in higher efficiency in completing the task.
...