Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities

@article{Cherubini2020SensorBasedCF,
  title={Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities},
  author={Andrea Cherubini and David Navarro-Alarcon},
  journal={Frontiers in Neurorobotics},
  year={2020},
  volume={14}
}
The objective of this paper is to present a systematic review of existing sensor-based control methodologies for applications that involve direct interaction between humans and robots, in the form of either physical collaboration or safe coexistence. To this end, we first introduce the basic formulation of the sensor-servo problem, and then present its most common approaches: vision-based, touch-based, audio-based, and distance-based control. Afterwards, we discuss and formalize the methods… 
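To make the underlying formulation concrete, the sketch below (Python/NumPy) implements the classical kinematic servo law v = -λ L⁺ (s - s*), on which vision-, touch-, audio-, and distance-based variants are all built: a measured feature vector s is driven toward its desired value s* through the pseudoinverse of the interaction matrix L. This is a minimal sketch of the standard formulation; the function name, constant scalar gain, and NumPy types are illustrative choices, not the paper's notation.

    import numpy as np

    def sensor_servo_velocity(s, s_star, L, lam=1.0):
        # Classical sensor-based servo law: with feature kinematics s_dot = L v,
        # commanding v = -lam * pinv(L) @ (s - s*) makes the feature error decay
        # (approximately) exponentially at rate lam.
        e = np.asarray(s, dtype=float) - np.asarray(s_star, dtype=float)
        return -lam * np.linalg.pinv(L) @ e

    # Example use: for image-based servoing with 4 point features (8 coordinates)
    # and a 6-DoF velocity command, L would be the 8x6 interaction matrix
    # evaluated at the current state.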


A Novel Velocity-Based Control in a Sensor Space for Parallel Manipulators

It is a challenging task to track objects moving along an unknown trajectory. Conventional model-based controllers require detailed knowledge of a robot’s kinematics and the target’s trajectory.

A General Visual-Impedance Framework for Effectively Combining Vision and Force Sensing in Feature Space

This letter proposes a general framework for combining force and visual information in the visual feature space by leveraging recent results on the derivation of visual servo dynamics, and generalizes the treatment regardless of the visual features chosen.
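As a rough illustration of how force and vision can be blended, the sketch below sums a visual-servo term with a force-admittance term at the velocity level. This is a generic shared-control combination under assumed names, gains, and sign conventions, and is simpler than the feature-space fusion the letter itself proposes.

    import numpy as np

    def shared_vision_force_velocity(s, s_star, L, f, f_star, K_f, lam=1.0):
        # Visual-servo term: drive the feature error (s - s*) toward zero
        # through the pseudoinverse of the interaction matrix L.
        v_vision = -lam * np.linalg.pinv(L) @ (
            np.asarray(s, dtype=float) - np.asarray(s_star, dtype=float))
        # Admittance term: move along the measured force error so the robot
        # yields to, or regulates, contact forces.
        v_force = K_f @ (np.asarray(f, dtype=float) - np.asarray(f_star, dtype=float))
        # Summing both contributions gives a simple shared vision/force command.
        return v_vision + v_force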

Force Control in Robotics: A Review of Applications

P. Gierlak · Journal of Robotics and Mechanical Engineering · 2021

The aim of this article is to present an overview of the most important robotic processes in which force control methods are applied. In recent years, robotization has seen a rapid increase in the…

Formulating Intuitive Stack-of-Tasks with Visuo-Tactile Perception for Collaborative Human-Robot Fine Manipulation

An intuitive stack-of-tasks (iSoT) formulation is proposed that defines the robot’s actions by considering the human-arm postures and the task progression, augmented with visuo-tactile information to effectively perceive the collaborative environment and intuitively switch between the planned sub-tasks.

Transferring artificial intelligence practices between collaborative robotics and autonomous driving

Purpose: Collaborative robotics and autonomous driving are fairly new disciplines, still with a long way to go to achieve the goals set by the research community, manufacturers, and users. For technologies…

Intelligent Interactive Control via Haptic E‐skin for Human–Robot Interaction and Collaboration

Task learning by a robot from a human is demonstrated: a human teaches the robot how to pick up, carry, and place an object into a box through hand-by-hand teaching, which shows promising prospects for intelligent robot applications.

Multimodal Immersive Learning with Artificial Intelligence for Robot and Running application cases

The MILKI-PSY project is presented, whose main goal is to provide a one-for-all system across different domains, and it is suggested that the system must give the user the freedom to decide which sensor data to use and which feedback to receive.

Intuitive Tasks Planning Using Visuo-Tactile Perception for Human Robot Cooperation

This research proposes to formulate intuitive robotic tasks from the human viewpoint by incorporating visuo-tactile perception into co-manipulation tasks.

Collaborative Robots and Tangled Passages of Tactile-Affects

Collaborative robots are increasingly entering industrial contexts and workflows. These contexts are not just locations for production; they are vibrant social and sensory environments. For better or…

Manipulability Optimization of a Rehabilitative Collaborative Robotic System

This study presents a design optimization of a robotic system for upper limb rehabilitation based on the manipulability ellipsoid method and identifies the optimal position of the robot base with respect to the patient.

References

Showing 1-10 of 106 references

A unified multimodal control framework for human-robot interaction

Progress and prospects of the human–robot collaboration

The main purpose of this paper is to review the state-of-the-art on intermediate human–robot interfaces (bi-directional), robot control modalities, system stability, benchmarking and relevant use cases, and to extend views on the required future developments in the realm of human-robot collaboration.

Robot Collisions: A Survey on Detection, Isolation, and Identification

This survey paper reviews, extends, compares, and experimentally evaluates model-based algorithms for real-time collision detection, isolation, and identification that use only proprioceptive sensors, covering the context-independent phases of the collision-event pipeline for robots interacting with the environment.

A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

An intelligent multi-sensorial approach is described that addresses autonomous manipulation in semi-structured environments by providing a multi-robot platform with a high degree of autonomy and the capability to perform complex tasks.

Collaborative human-humanoid carrying using vision and haptic sensing

A framework for combining vision and haptic information in human-robot joint actions is presented; it consists of a hybrid controller that combines visual servoing with impedance control to allow for a more collaborative setup.

Integrated Vision/Force Robotic Servoing in the Task Frame Formalism

This work shows the usefulness and feasibility of a range of tasks that use shared control, and offers a framework based on the task frame formalism (TFF) to distinguish between different basic forms of shared control.

Human-Intent Detection and Physically Interactive Control of a Robot Without Force Sensors

A switching scheme is developed that alternates between pure impedance control with a fixed position reference and interactive control under human intent, putting the robot into interactive mode whenever human intent is detected.

From multi-modal tactile signals to a compliant control

An end-to-end approach is presented that transforms multi-modal tactile signals into a compliant control law, generating different dynamic robot behaviors to produce safer robots, especially for physical human-robot interaction.
...