The Effects of Social Gaze in Human-Robot Collaborative Assembly

@inproceedings{Fischer2015TheEO,
  title={The Effects of Social Gaze in Human-Robot Collaborative Assembly},
  author={Kerstin Fischer and Lars Christian Jensen and Franziska Kirstein and Sebastian Stabinger and {\"O}zg{\"u}r Erkent and Dadhichi Shukla and Justus H. Piater},
  booktitle={International Conference on Social Robotics (ICSR)},
  year={2015}
}
In this paper we explore how social gaze in an assembly robot affects how naive users interact with it. In a controlled experimental study, 30 participants instructed an industrial robot to fetch parts needed to assemble a wooden toolbox. Participants either interacted with a robot whose gaze simply followed the movements of its own arm, or with a robot that followed its own movements during tasks but also gazed at the participant between instructions. Our qualitative and…
Between legibility and contact: The role of gaze in robot approach
TLDR
The behavioral results show that users are significantly more at ease with the robot that gazes at them than with the robot that looks where it is going, as measured by the number of instances of glances away from the robot.
Design of Effective Robotic Gaze-Based Social Cueing for Users in Task-Oriented Situations: How to Overcome In-Attentional Blindness?
Robotic eye-gaze-based cueing has been studied and shown to be effective, in controlled environments, at achieving the social functions of human gaze. However, its dynamic adaptability in various real…
Robot Gaze Behaviors in Human-to-Robot Handovers
TLDR
This work identified four receiver gaze behaviors of a robot receiving an object from a human: gazing at the giver's hand, gazing at their face, and two kinds of face-hand transition gazes, and implemented these behaviors on a robot arm equipped with an anthropomorphic head.
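The four receiver gaze behaviors identified above can be pictured as a small taxonomy keyed to handover phase. The sketch below is purely illustrative, assuming hypothetical phase names and a toy selection policy; it is not the paper's implementation.

```python
from enum import Enum, auto

class ReceiverGaze(Enum):
    """Four receiver gaze behaviors for a robot taking an object from a human."""
    HAND = auto()          # gaze at the giver's hand
    FACE = auto()          # gaze at the giver's face
    FACE_TO_HAND = auto()  # transition gaze: face, then hand
    HAND_TO_FACE = auto()  # transition gaze: hand, then face

def gaze_for_phase(phase: str) -> ReceiverGaze:
    """Toy policy (assumed, not from the paper): face contact before the
    reach begins, a face-to-hand transition during the reach, then the
    giver's hand while the object is transferred."""
    if phase == "pre_reach":
        return ReceiverGaze.FACE
    if phase == "reach":
        return ReceiverGaze.FACE_TO_HAND
    return ReceiverGaze.HAND
```

The enum keeps the two transition gazes as first-class behaviors rather than sequences of the static ones, matching the four-way distinction drawn in the work.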
Proactive, incremental learning of gesture-action associations for human-robot collaboration
TLDR
A fast, supervised learning framework for learning associations between human hand gestures and the intended robotic manipulation actions is proposed that enables the robot to learn associations on the fly while performing a task with the user.
Human Preferences for Robot Eye Gaze in Human-to-Robot Handovers
This paper investigates humans' preferences for a robot's eye gaze behavior during human-to-robot handovers. We studied gaze patterns for all three phases of the handover process: reach, transfer,…
Towards efficient human–machine collaboration: effects of gaze-driven feedback and engagement on performance
TLDR
This work investigates whether and how the gaze behavior of a human interaction partner can be used by a gaze-aware assistance system to improve referential success, and suggests that listeners engage more intensely with the system when they can expect it to be cooperative.
On the Imitation of Goal Directed Movements of a Humanoid Robot
TLDR
The results show that people are responsive to a robot's social gaze cues, and that they are responsive to the action goals of robots, although not as much as in HHI.
On the Attempt to Implement Social Addressability within a Robotic System
TLDR
The authors' findings suggest a potential relevance of social address for the interaction partner to receive additional information, especially if the situation is a contingent one.
Learning Semantics of Gestural Instructions for Human-Robot Collaboration
TLDR
This work presents the fast, supervised Proactive Incremental Learning (PIL) framework for learning associations between human hand gestures and the intended robotic manipulation actions, and investigates how the accuracy of gesture detection affects the number of interactions required to complete the task.
Augmenting Situated Spoken Language Interaction with Listener Gaze
TLDR
A proof of concept is provided that listener gaze can effectively be used in situated human-machine interaction, and its impact on prediction of reference resolution is evaluated using a multimodal corpus collected from virtual environments.

References

Showing 1–10 of 23 references
A survey of social gaze
TLDR
A novel behavioral definition is proposed: a mapping G = E(C) from the perception of a social context C to a set of head, eye, and body patterns, called gaze acts G, that express the engagement E, providing a guide for principled future implementations of social gaze.
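The mapping G = E(C) above can be made concrete with a minimal sketch: a function from a perceived social context to a gaze act. The context labels and the policy below are hypothetical assumptions for illustration, not the survey's definition.

```python
from enum import Enum, auto

class GazeAct(Enum):
    """Coarse gaze acts: head/eye/body patterns expressing engagement."""
    MUTUAL = auto()       # look at the partner's face
    REFERENTIAL = auto()  # look at a task-relevant object
    AVERTED = auto()      # look away (e.g., while thinking)

def engagement(context: str) -> GazeAct:
    """Illustrative E(C): map a perceived social context C to a gaze act G.
    Context labels here are assumed for the sketch."""
    if context == "partner_speaking":
        return GazeAct.MUTUAL      # signal attention to the speaker
    if context == "joint_task":
        return GazeAct.REFERENTIAL # share attention on the workspace
    return GazeAct.AVERTED
```

Framing gaze generation as a single context-to-act mapping is what makes the definition behavioral: an implementation only needs to classify the context C and emit the corresponding act G.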
Conversational Gaze Aversion for Humanlike Robots
TLDR
This work presents a system that addresses the challenges of adapting human gaze aversion movements to a robot with very different affordances, such as a lack of articulated eyes. The system autonomously generates and combines three distinct types of robot head movement with different purposes: face-tracking movements to engage in mutual gaze, idle head motion to increase lifelikeness, and purposeful gaze aversions to achieve conversational functions.
Effects of Different Kinds of Robot Feedback
TLDR
The results show that only in the condition in which the robot's behavior is socially contingent, the human tutors adjust their behavior to the robot.
Initiating interactions in order to get help: Effects of social framing on people's responses to robots' requests for assistance
TLDR
The results show that social framing, in contrast to other methods for getting a person's continued attention, is effective and increases how friendly the robot appears; however, it has little influence on people's willingness to assist the robot.
Breaking the Ice in Human-Agent Communication: Eye-Gaze Based Initiation of Contact with an Embodied Conversational Agent
TLDR
An eye-gaze based model of interaction is implemented to investigate whether flirting tactics help improve first encounters between a human and an agent, and to investigate which non-verbal signals an agent should convey in order to create a favourable atmosphere for subsequent interactions and increase the user's willingness to engage with the agent.
Meet Me where I’m Gazing: How Shared Attention Gaze Affects Human-Robot Handover Timing
TLDR
Empirical evidence is provided that using humanlike gaze cues during human-robot handovers can improve the timing and perceived quality of the handover event; subjects reach for the offered object significantly earlier when the robot provides a shared-attention gaze cue during the handover.
Deliberate Delays During Robot-to-Human Handovers Improve Compliance With Gaze Communication
TLDR
It is shown that a simple manipulation of a robot's handover behavior can significantly increase both awareness of the robot's eye gaze and compliance with that gaze, and that the handover delay increases people's compliance with the robot's communication over a non-delayed handover, even when compliance results in counterintuitive behavior.
Conversational gaze mechanisms for humanlike robots
TLDR
An investigation of people's use of key conversational gaze mechanisms, of how they might be designed for and implemented in humanlike robots, and of whether these signals effectively shape human-robot conversations finds that participants conformed to the conversational roles signaled by the robot's gaze 97% of the time.
Exploring a Model of Gaze for Grounding in Multimodal HRI
TLDR
A modeling approach is presented that focuses on the multimodal, parallel, and bidirectional aspects of gaze that need to be considered for grounding, and on their interleaving with dialog and task management.