A Gesture Based Interface for Human-Robot Interaction

@article{Waldherr2000AGB,
  title={A Gesture Based Interface for Human-Robot Interaction},
  author={Stefan Waldherr and Roseli Ap. Francelin Romero and Sebastian Thrun},
  journal={Autonomous Robots},
  year={2000},
  volume={9},
  pages={151--173}
}
Service robotics is currently a highly active research area in robotics, with enormous societal potential. Since service robots directly interact with people, finding “natural” and easy-to-use user interfaces is of fundamental importance. While past work has predominantly focused on issues such as navigation and manipulation, relatively few robotic systems are equipped with flexible user interfaces that permit controlling the robot by “natural” means. This paper describes a gesture interface…

Gesture recognition with application to human-robot interaction

TLDR
This dissertation presents the design of a gestural interface that can be used to control a robot, consisting of two modes: far-mode and near-mode, and shows that the proposed system is robust to changes in skin colour and user hand size.

Gesture-based human-robot interaction for human assistance in manufacturing

TLDR
A gesture-based HRI framework in which a robot assists a human co-worker by delivering tools and parts and holding objects during an assembly operation, together with a parameterization robotic task manager (PRTM) in which the co-worker selects/validates robot options using gestures, are proposed.

Detection and Interpretation of Human Walking Gestures for Human-Robot Interaction

TLDR
The system is used to interpret human attitudes and uses a PTZ camera and a laser range finder mounted on a mobile robot platform to detect and track humans in indoor environments.

Two-handed gesture recognition and fusion with speech to command a robot

TLDR
A flexible multimodal interface based on speech and gesture modalities for controlling the authors' mobile robot, named Jido, is described, and a probabilistic, multi-hypothesis interpreter framework is shown to improve the classification rates of multimodal commands compared to using either modality alone.

User, Gesture and Robot Behaviour Adaptation for Human-Robot Interaction

TLDR
The goal of human-robot interaction (HRI) research is to define a general human model that could lead to principles and algorithms allowing more natural and effective interaction between humans and robots.

A natural gesture interface for operating robotic systems

TLDR
An implementation of this gesture-based interaction framework for controlling mobile robots is presented in the control of an underwater robot by an on-site human operator and quantitative data collected from human participants indicating accuracy and performance are presented.

Gesture-based human-robot interface for dual-robot with hybrid sensors

TLDR
The proposed interface can maintain long-term stable tracking of two-hand gestures even when there is occlusion between the hands, and reduces the number of hand resets required, shortening operation time.

Gesture based human - Multi-robot swarm interaction and its application to an interactive display

TLDR
A taxonomy for gesture-based interaction between a human and a group (swarm) of robots is described and a depth sensor is used to recognize human gesture, determining the commands sent to a group comprising tens of robots.

Automatic gesture recognition for intelligent human-robot interaction

  • Seong-Whan Lee
  • Computer Science
    7th International Conference on Automatic Face and Gesture Recognition (FGR06)
  • 2006
TLDR
This paper presents a new method for simultaneously spotting and recognizing whole-body key gestures, which is successfully deployed and operated on a mobile robot.
...

References

SHOWING 1-10 OF 59 REFERENCES

Template-Based Recognition of Pose and Motion Gestures On a Mobile Robot

TLDR
A gesture-based interface for human robot interaction is described, which enables people to instruct robots through easy-to-perform arm gestures, using a hybrid approach that integrates neural networks and template matching.

Recognizing and Interpreting Gestures on a Mobile Robot

TLDR
This paper describes a real-time, three-dimensional gesture recognition system that resides on-board a mobile robot, capable of recognizing six distinct gestures made by an unadorned human in an unaltered environment, including the coarse model and the active vision approach.

Neural networks for gesture-based remote control of a mobile robot

We present a neural network architecture for gesture-based interaction between a mobile robot and its user, thereby spanning a bridge from the localisation of the user over the recognition of its…

A gesture interface for human-robot-interaction

  • J. Triesch, C. Malsburg
  • Computer Science, Art
    Proceedings Third IEEE International Conference on Automatic Face and Gesture Recognition
  • 1998
TLDR
A person-independent gesture interface implemented on a real robot which allows the user to give simple commands, e.g., how to grasp an object and where to put it, based on real-time tracking of the user's hand and refined analysis of the hand's shape in the presence of varying complex backgrounds.

Robotic Gesture Recognition

TLDR
Progress in the lab is reported on the development of techniques for building robust gesture interfaces that can handle the constraints of robotics, especially vision-based gesture interfaces.

Vision for man-machine interaction

  • J. Crowley
  • Computer Science
    Robotics Auton. Syst.
  • 1995

A mobile robot that recognizes people

TLDR
A robot system that finds people, approaches them and then recognizes them is described; it uses a variety of techniques: color vision to find people, vision and sonar sensors to approach them, and a template-based pattern recognition algorithm to isolate the face.

Using stereo vision to pursue moving agents with a mobile robot

  • E. Huber, D. Kortenkamp
  • Computer Science
    Proceedings of 1995 IEEE International Conference on Robotics and Automation
  • 1995
TLDR
This paper introduces the proximity space method as a means for performing real-time, behavior-based control of visual gaze and shows how this method is integrated with robot motion using an intelligent control architecture that can automatically reconfigure the robot's behaviors in response to environmental changes.

Gesture recognition using the Perseus architecture

TLDR
This paper describes Perseus in detail and shows how it is used to locate objects pointed to by people and how it can be re-used in tasks other than pointing.
...