Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities

@article{Cherubini2020SensorBasedCF,
  title={Sensor-Based Control for Collaborative Robots: Fundamentals, Challenges, and Opportunities},
  author={Andrea Cherubini and David Navarro-Alarcon},
  journal={Frontiers in Neurorobotics},
  year={2020},
  volume={14}
}
The objective of this paper is to present a systematic review of existing sensor-based control methodologies for applications that involve direct interaction between humans and robots, in the form of either physical collaboration or safe coexistence. To this end, we first introduce the basic formulation of the sensor-servo problem, and then present its most common approaches: vision-based, touch-based, audio-based, and distance-based control. Afterwards, we discuss and formalize the methods…
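As a concrete illustration of the sensor-servo formulation that the review builds on, the sketch below implements the classical resolved-rate control law v = -λ L⁺ (s - s*), where s is the measured feature vector (visual, tactile, audio, or distance features), s* its desired value, and L an estimate of the interaction matrix. The function and variable names, gain, and example numbers are illustrative assumptions, not taken from the paper.

    import numpy as np

    def sensor_servo_velocity(s, s_star, L_hat, gain=0.5):
        # Classical sensor-based control law: v = -gain * pinv(L_hat) @ (s - s_star).
        # s      : measured sensor features (e.g., image points, forces, distances)
        # s_star : desired feature values
        # L_hat  : estimated interaction matrix (feature velocity = L_hat @ robot velocity)
        # Returns the commanded robot velocity (e.g., an end-effector twist).
        error = s - s_star
        return -gain * np.linalg.pinv(L_hat) @ error

    # Illustrative usage with made-up numbers (2 features, 3-DOF velocity):
    s = np.array([0.10, -0.05])
    s_star = np.zeros(2)
    L_hat = np.array([[1.0, 0.0, 0.2],
                      [0.0, 1.0, -0.1]])
    v = sensor_servo_velocity(s, s_star, L_hat)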

Citations

A General Visual-Impedance Framework for Effectively Combining Vision and Force Sensing in Feature Space

TLDR
This letter proposes a general framework for combining force and visual information in the visual feature space by leveraging recent results on the derivation of visual servo dynamics, and generalizes the treatment regardless of the visual features chosen.

Formulating Intuitive Stack-of-Tasks with Visuo-Tactile Perception for Collaborative Human-Robot Fine Manipulation

TLDR
An intuitive stack-of-tasks (iSoT) formulation is proposed that defines the robot's actions by considering the human-arm postures and the task progression, augmented with visuo-tactile information to effectively perceive the collaborative environment and intuitively switch between the planned sub-tasks.

Impact of Shared Control Modalities on Performance and Usability of Semi-autonomous Prostheses

TLDR
It is demonstrated that the specific approach used for integrating volitional and autonomous control is indeed an important factor that significantly affects performance as well as physical and cognitive load, and should therefore be considered when designing semi-autonomous (SA) prostheses.

Transferring artificial intelligence practices between collaborative robotics and autonomous driving

Purpose: Collaborative robotics and autonomous driving are fairly new disciplines, still with a long way to go to achieve the goals set by the research community, manufacturers, and users. For technologies…

Multimodal Immersive Learning with Artificial Intelligence for Robot and Running application cases

TLDR
The MILKI-PSY project is presented, whose main goal is to provide a one-for-all system across different domains, and it is suggested that the system must give the user the freedom to decide which sensor data to use and which feedback to receive.

Intuitive Tasks Planning Using Visuo-Tactile Perception for Human Robot Cooperation

TLDR
This research proposes to formulate intuitive robotic tasks from a human viewpoint by incorporating visuo-tactile perception into co-manipulation tasks.

Collaborative Robots and Tangled Passages of Tactile-Affects

Collaborative robots are increasingly entering industrial contexts and workflows. These contexts are not just locations for production; they are vibrant social and sensory environments. For better or…

Manipulability Optimization of a Rehabilitative Collaborative Robotic System

TLDR
This study presents a design optimization of a robotic system for upper limb rehabilitation based on the manipulability ellipsoid method and identifies the optimal position of the robot base with respect to the patient.
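For context, the manipulability-ellipsoid method mentioned above typically optimizes Yoshikawa's manipulability measure w = sqrt(det(J Jᵀ)), where J is the robot Jacobian; the sketch below computes this measure for a given configuration (the Jacobian values are fabricated for illustration).

    import numpy as np

    def manipulability(J):
        # Yoshikawa manipulability measure: w = sqrt(det(J @ J.T)).
        # Larger w means a better-conditioned manipulability ellipsoid
        # (the arm is farther from a singular configuration).
        return np.sqrt(np.linalg.det(J @ J.T))

    # Illustrative 2x3 Jacobian of a planar 3-joint arm (values are made up):
    J = np.array([[-0.6, -0.4, -0.1],
                  [ 0.8,  0.5,  0.2]])
    w = manipulability(J)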

Paint With the Sun: A Thermal-Vision Guided Robot to Harness Solar Energy for Heliography

In this article, we present a novel robotic system for heliography, which literally refers to painting with the Sun. In the context of artistic creation, heliography involves the precise manipulation…

Fusing Visuo-Tactile Perception into Kernelized Synergies for Robust Grasping and Fine Manipulation of Non-rigid Objects

TLDR
Experiments performed with a robot arm-hand system validate the capability and usability of the upgraded framework in stably grasping and dexterously manipulating non-rigid objects.

References

Showing 1-10 of 106 references

A unified multimodal control framework for human-robot interaction

Progress and prospects of the human–robot collaboration

TLDR
The main purpose of this paper is to review the state-of-the-art on intermediate human–robot interfaces (bi-directional), robot control modalities, system stability, benchmarking and relevant use cases, and to extend views on the required future developments in the realm of human-robot collaboration.

Robot Collisions: A Survey on Detection, Isolation, and Identification

TLDR
This survey paper reviews, extends, compares, and experimentally evaluates model-based algorithms for real-time collision detection, isolation, and identification that use only proprioceptive sensors, covering the context-independent phases of the collision-event pipeline for robots interacting with the environment.

A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

TLDR
This paper describes an intelligent multi-sensorial approach that solves the problem of autonomous manipulation in semi-structured environments by providing a multi-robot platform with a high degree of autonomy and the capability to perform complex tasks.

Collaborative human-humanoid carrying using vision and haptic sensing

TLDR
A framework for combining vision and haptic information in human-robot joint actions is presented; it consists of a hybrid controller that uses both visual servoing and impedance control to allow for a more collaborative setup.
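As a rough sketch of how such a hybrid vision/impedance scheme can be wired together, assuming an admittance-style implementation of our own choosing (the gains, names, and simple additive structure are illustrative, not the authors' controller):

    import numpy as np

    def hybrid_vision_force_velocity(s, s_star, L_hat, f_meas, f_des,
                                     lam=0.5, admittance_gain=0.05):
        # Visual-servoing term: drive the visual feature error to zero.
        v_vision = -lam * np.linalg.pinv(L_hat) @ (s - s_star)
        # Admittance term: yield to the wrench applied by the human partner.
        v_force = admittance_gain * (f_meas - f_des)
        # Commanded end-effector twist combines both behaviors.
        return v_vision + v_force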

Integrated Vision/Force Robotic Servoing in the Task Frame Formalism

TLDR
This work shows the usefulness and feasibility of a range of tasks which use shared control, and offers a framework based on the task frame formalism (TFF) to distinguish between different basic forms of shared control.

Human-Intent Detection and Physically Interactive Control of a Robot Without Force Sensors

TLDR
A switching scheme is developed that transitions between two modes, pure impedance control with a fixed position reference and interactive control under human intent, putting the robot into interactive mode whenever human intent is detected.

From multi-modal tactile signals to a compliant control

TLDR
An end-to-end approach is presented that transforms multi-modal tactile signals into a compliant control law, generating different dynamic robot behaviors to produce safer robots, especially for physical human-robot interaction.
...