Jochen Heinzmann

If robots are to be introduced into the human world as assistants to aid a person in the completion of a manual task, two key problems of today's robots must be solved: the human–robot interface must be intuitive to use, and the safety of the user with respect to injuries inflicted by collisions with the robot must be guaranteed. In this paper we describe the …
People naturally express themselves through facial gestures and expressions. Our goal is to build a facial-gesture human-computer interface for use in robot applications. We have implemented an interface that tracks a person's facial features in real time (30 Hz). Our system requires neither special illumination nor facial makeup. By using multiple Kalman …
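The snippet above does not give implementation details, but a constant-velocity Kalman filter per tracked feature point is one standard way to realize this kind of 30 Hz tracking. The sketch below is only an illustrative example of that idea; the state layout, noise parameters, and frame period are assumptions, not the authors' actual system.

```python
# Illustrative sketch: a constant-velocity Kalman filter for one 2-D
# feature point, of the kind commonly used for real-time face tracking.
# State layout, noise values and the 1/30 s time step are assumptions.
import numpy as np

class FeatureKalman:
    def __init__(self, x0, y0, dt=1.0 / 30.0):
        # State: [x, y, vx, vy]; constant-velocity motion model.
        self.x = np.array([x0, y0, 0.0, 0.0], dtype=float)
        self.P = np.eye(4) * 10.0                      # initial uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)  # only (x, y) is measured
        self.Q = np.eye(4) * 0.01                      # process noise (assumed)
        self.R = np.eye(2) * 2.0                       # pixel measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                              # predicted pixel position

    def update(self, zx, zy):
        z = np.array([zx, zy], dtype=float)
        y = z - self.H @ self.x                        # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                              # filtered pixel position
```

In a tracker of this style, predict() supplies a search window for locating the feature in the next frame and update() fuses the measured position; one filter would run per tracked feature (eye corners, mouth corners, nostrils, and so on).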
Facial pose and gaze point are fundamental to any visually directed human-machine interface. In this paper, we propose a system capable of tracking a face and estimating the 3-D pose and the gaze point, all in a real-time video stream of the head. This is done by using a 3-D model together with multiple triplet triangulation of feature positions, assuming an …
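The triplet-triangulation method itself is not reproduced here. As a rough illustration of the same underlying problem (recovering head pose from a rigid 3-D feature model and tracked image positions), the sketch below uses OpenCV's generic solvePnP; the model coordinates and camera intrinsics are invented values.

```python
# Rough illustration of model-based head-pose estimation from tracked
# feature positions.  This uses OpenCV's solvePnP rather than the paper's
# triplet-triangulation method; the 3-D model coordinates (millimetres)
# and the pinhole camera intrinsics are assumptions.
import numpy as np
import cv2

# Rigid 3-D model of a few facial features in a head-fixed frame (assumed).
MODEL_POINTS = np.array([
    [-45.0,  35.0,  0.0],   # outer left eye corner
    [ 45.0,  35.0,  0.0],   # outer right eye corner
    [  0.0,   0.0, 30.0],   # nose tip
    [-25.0, -40.0,  0.0],   # left mouth corner
    [ 25.0, -40.0,  0.0],   # right mouth corner
], dtype=np.float64)

def head_pose(image_points, focal_px, cx, cy):
    """Return the head's rotation matrix and translation in the camera frame.

    image_points: (5, 2) array of tracked pixel positions, in the same
    order as MODEL_POINTS.  focal_px, cx, cy: assumed pinhole intrinsics.
    """
    camera_matrix = np.array([[focal_px, 0.0, cx],
                              [0.0, focal_px, cy],
                              [0.0, 0.0, 1.0]], dtype=np.float64)
    dist_coeffs = np.zeros(4)                # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                  np.asarray(image_points, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None, None
    R, _ = cv2.Rodrigues(rvec)               # rotation vector -> 3x3 matrix
    return R, tvec                           # head orientation and position
```

Given the head pose, the gaze point can be approximated by rotating a head-frame viewing direction by R and intersecting the resulting ray with a known surface such as a monitor plane.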
This paper describes the implementation of a behaviour for real-time visual servoing on a mobile robot. The behaviour is a component of a multi-robot cleaning system developed in the context of our investigation into architectures for cooperative systems. An important feature for supporting cooperation is the awareness of one robot by another, which this …
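As a minimal sketch of the visual-servoing idea mentioned above, the snippet below keeps a tracked target (for example, the other cleaning robot) centred in the image by turning the mobile base. The gains, image size, and velocity limits are assumed values for illustration, not figures from the paper.

```python
# Minimal image-based servoing sketch: steer the mobile base so the
# tracked target stays centred in the camera image.  All constants are
# illustrative assumptions.
IMAGE_WIDTH = 320          # pixels (assumed camera resolution)
K_TURN = 0.005             # rad/s per pixel of horizontal error (assumed gain)
V_FORWARD = 0.2            # m/s forward speed while the target is visible
MAX_TURN = 0.5             # rad/s angular velocity limit

def servo_command(target_u):
    """Map the target's horizontal pixel coordinate to (v, w) base velocities."""
    if target_u is None:                        # target lost: stop and rotate to search
        return 0.0, MAX_TURN
    error = (IMAGE_WIDTH / 2.0) - target_u      # positive if target is left of centre
    w = max(-MAX_TURN, min(MAX_TURN, K_TURN * error))
    return V_FORWARD, w
```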
This paper introduces a new approach to controlling a robot manipulator in a way that is safe for humans in the robot's workspace. Conceptually, the robot is viewed as a tool with limited autonomy. The limited perception capabilities of automatic systems prohibit the construction of failsafe robots with the capabilities of people. Instead, the goal of our control …
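A hedged sketch of the general idea of a robot as a tool with limited autonomy: every motion command is filtered so that a conservative bound on the possible impact stays below a human-safe threshold. The momentum-style bound and all numeric limits below are illustrative assumptions, not the paper's actual scheme.

```python
# Illustrative command filter: scale Cartesian velocity commands so a
# conservative momentum bound stays below a human-safe limit.  The bound
# and the numeric values are assumptions for the sketch.
import numpy as np

MAX_SAFE_SPEED = 0.25      # m/s end-effector speed considered safe (assumed)
MOVING_MASS = 12.0         # kg effective moving mass of the arm (assumed)
MAX_MOMENTUM = MOVING_MASS * MAX_SAFE_SPEED

def limit_command(desired_ee_velocity):
    """Scale the commanded end-effector velocity to respect the momentum bound."""
    v = np.asarray(desired_ee_velocity, dtype=float)
    speed = np.linalg.norm(v)
    momentum = MOVING_MASS * speed
    if momentum <= MAX_MOMENTUM:
        return v                               # command is already within the bound
    return v * (MAX_MOMENTUM / momentum)       # scale down, keep direction
```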
To develop human-friendly robots we require two key components: smart interfaces and safe mechanisms. Smart interfaces facilitate natural and easy human-robot interaction. Facial gestures can be a natural way to control a robot. In this paper, we report on a vision-based interface that, in real time, tracks a user's facial features and gaze …