Generalization of human grasping for multi-fingered robot hands

@article{Amor2012GeneralizationOH,
  title={Generalization of human grasping for multi-fingered robot hands},
  author={H. B. Amor and Oliver Kroemer and U. Hillenbrand and G. Neumann and Jan Peters},
  journal={2012 IEEE/RSJ International Conference on Intelligent Robots and Systems},
  year={2012},
  pages={2043-2050}
}
Multi-fingered robot grasping is a challenging problem that is difficult to tackle using hand-coded programs. In this paper we present an imitation learning approach for learning and generalizing grasping skills based on human demonstrations. To this end, we split the task of synthesizing a grasping motion into three parts: (1) learning efficient grasp representations from human demonstrations, (2) warping contact points onto new objects, and (3) optimizing and executing the reach-and-grasp…
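The abstract's three-part pipeline can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: the centroid-based grasp representation, nearest-neighbour contact warping, and centroid wrist goal are all simplifying assumptions, and every function name here is hypothetical.

```python
import numpy as np

# Hypothetical sketch of the three-stage pipeline described in the abstract.
# The representation and warping scheme are illustrative assumptions only.

def learn_grasp_representation(demo_contacts):
    """Stage 1: summarize demonstrated contact points as a centroid
    plus per-finger offsets (a crude stand-in for a learned model)."""
    centroid = demo_contacts.mean(axis=0)
    return centroid, demo_contacts - centroid

def warp_contacts(offsets, new_surface_points):
    """Stage 2: map each demonstrated contact offset to the closest
    point on the new object's surface (a stand-in for shape warping)."""
    warped = []
    for offset in offsets:
        target = new_surface_points.mean(axis=0) + offset
        distances = np.linalg.norm(new_surface_points - target, axis=1)
        warped.append(new_surface_points[np.argmin(distances)])
    return np.array(warped)

def optimize_reach(warped_contacts):
    """Stage 3: pick a wrist goal for the reach; here simply the
    centroid of the warped contacts."""
    return warped_contacts.mean(axis=0)
```

In a real system, stage 1 would fit a statistical model over many demonstrations, stage 2 would use a dense shape correspondence, and stage 3 would run a trajectory optimizer; the sketch only shows how the three stages hand data to one another.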
Citations

Data-Efficient Learning of Robotic Grasps From Human Preferences
The ability to grasp various types of objects from the environment is an important manipulation skill for autonomous robots. It poses a prerequisite for solving many real-world tasks, ranging from…
Simplifying grasping complexity through generalization of kinaesthetically learned synergies
TLDR: A hybrid technique is proposed, based on grasping synergies extracted from kinaesthetic demonstrations on a given object with a primitive geometry, combined with passive kinematic enveloping as a generalization technique.
Dexterous Robotic Grasping with Object-Centric Visual Affordances
TLDR: The key idea is to embed an object-centric visual affordance model within a deep reinforcement learning loop to learn grasping policies that favor the same object regions favored by people.
Unsupervised learning of predictive parts for cross-object grasp transfer
  • R. Detry, J. Piater
  • Computer Science
  • 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • 2013
TLDR: This work identifies, through autonomous exploration, the size and shape of object parts that consistently predict the applicability of a grasp across multiple objects, and aims to solve the part-learning problem without the help of a human teacher.
Learning and Inference of Dexterous Grasps for Novel Objects with Underactuated Hands
TLDR: The contact model learning algorithm is extended to work with multiple training examples for each grasp type, enabling it to learn which parts of the hand reliably interact with the object during a particular grasp.
Learning hand-eye coordination for robotic grasping with deep learning and large-scale data collection
TLDR: The approach achieves effective real-time control, successfully grasps novel objects, and corrects mistakes by continuous servoing; it also illustrates that data from different robots can be combined to learn more reliable and effective grasping.
ContactGrasp: Functional Multi-finger Grasp Synthesis from Contact
TLDR: This work synthesizes functional grasps from object shape and contact on the object surface, using a dataset of contact demonstrations from humans grasping diverse household objects, and demonstrates the approach for three hand models and two functional intents.
Robot grasp planning based on demonstrated grasp strategies
TLDR: The study results show that the grasp strategies of grasp type and thumb placement not only represent important human grasp intentions, but also provide meaningful constraints on hand posture and wrist position, which greatly reduce both the feasible workspace of a robotic hand and the search space of the grasp planning.
Learning Postural Synergies for Categorical Grasping Through Shape Space Registration
TLDR: A novel method is presented for inferring grasp configurations from object shape by means of a categorical non-rigid registration that encodes typical intra-class variations, suitable for on-line scenarios where only a portion of the object's surface is observable.
Modelling the structure of object-independent human affordances of approaching to grasp for robotic hands
TLDR: This work shows that the approach to grasp can be structured into four distinct phases, best represented by non-linear models and independent of the objects being handled, suggesting that approach-to-grasp patterns follow an intentionally planned control strategy rather than a reactive execution.

References

SHOWING 1-10 OF 32 REFERENCES
Hand Posture Subspaces for Dexterous Robotic Grasping
TLDR: An on-line grasp planner that allows a human operator to perform dexterous grasping tasks using an artificial hand in a hand posture subspace of highly reduced dimensionality is presented.
Grasp synthesis from low‐dimensional probabilistic grasp models
TLDR: A novel data-driven animation method is presented for the synthesis of natural-looking human grasping, which greatly reduces the high number of degrees of freedom of the human hand to a few dimensions in a continuous grasp space.
Hand synergies during reach-to-grasp.
TLDR: Much of reach-to-grasp is effected using a base posture with refinements in finger and thumb positions added in time to yield unique hand shapes, suggesting that the CNS uses synergies to simplify the control of the hand.
Demonstration-based learning and control for automatic grasping
We present a method for automatic grasp generation based on object shape primitives in a Programming by Demonstration framework. The system first recognizes the grasp performed by a demonstrator as…
Learning robot grasping from 3-D images with Markov Random Fields
TLDR: This work learns a function, based on Markov Random Fields, that predicts the success probability of grasps performed on surface points of a given object, motivated by the fact that points that are geometrically close to each other tend to have similar grasp success probabilities.
Transferring functional grasps through contact warping and local replanning
  • U. Hillenbrand, M. A. Roa
  • Engineering, Computer Science
  • 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • 2012
TLDR: This work presents extensive results of experiments with a database of four-finger grasps, designed to systematically cover variations on grasping the mugs of the Princeton Shape Benchmark.
Learning perceptual coupling for motor primitives
TLDR: An augmented version of the dynamic system-based motor primitives is proposed, which incorporates perceptual coupling to an external variable and can perform complex tasks such as the Ball-in-a-Cup or Kendama task even with large variances in the initial conditions, where a skilled human player would be challenged.
Encoding of Coordinated Reach and Grasp Trajectories in Primary Motor Cortex
TLDR: A generalized linear model is used to predict single neuron responses in primary motor cortex during a reach-to-grasp task based on 40 features that represent positions and velocities of the arm and hand in joint angle and Cartesian coordinates, as well as the neurons' own spiking history.
Monkey hand postural synergies during reach-to-grasp in the absence of vision of the hand and object.
TLDR: Hand shaping during the reach occurred without vision of the hand or object, and hand kinematics were not dependent on grasp force, implying that the kinematics of reach-to-grasp and grasp force are controlled independently.
Robotic Grasping of Novel Objects using Vision
TLDR: This work considers the problem of grasping novel objects, specifically objects that are being seen for the first time through vision, and presents a learning algorithm that neither requires nor tries to build a 3-D model of the object.