Collecting grasp data for learning and benchmarking purposes is very expensive. It would be helpful to have a standard database of graspable objects, along with a set of stable grasps for each object, but no such database exists. In this work we show how to automate the construction of a database consisting of several hands, thousands of objects, and …
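To make the idea of such a database concrete, here is a minimal sketch of how its records might be organized; the field names, the Barrett-hand entry, and the example values are illustrative assumptions, not the database's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Grasp:
    """One stable grasp: hand pose relative to the object plus joint angles."""
    hand_name: str             # hand model used for this grasp (assumed field)
    pose: List[float]          # 7-vector: position (x, y, z) + orientation quaternion
    joint_angles: List[float]  # hand DOF values at grasp closure
    quality: float             # scalar quality score used to rank grasps

@dataclass
class GraspableObject:
    """One object entry together with the stable grasps found for it."""
    object_id: str
    mesh_file: str
    grasps: List[Grasp] = field(default_factory=list)

# Example: record a single grasp for one object.
mug = GraspableObject(object_id="mug_001", mesh_file="meshes/mug_001.ply")
mug.grasps.append(Grasp(hand_name="barrett", pose=[0, 0, 0.1, 0, 0, 0, 1],
                        joint_angles=[0.4, 0.4, 0.4, 0.0], quality=0.12))
```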
We design an example-based planning framework to generate semantic grasps: stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode semantic constraints, which are task-related constraints. We introduce a semantic affordance …
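A minimal sketch of the example-based idea, under the assumption that the three proxies are simply concatenated into one feature vector and candidate grasps are scored by their distance to stored task examples; the function names and the feature layout are hypothetical, not the paper's representation.

```python
import numpy as np

def grasp_features(partial_geometry, tactile_contacts, joint_angles):
    """Concatenate the three proxies (geometry, touch, kinematics) into one vector."""
    return np.concatenate([np.ravel(partial_geometry),
                           np.ravel(tactile_contacts),
                           np.ravel(joint_angles)])

def score_candidate(candidate, task_examples):
    """Prefer candidates close to some stored example grasp for the same task."""
    dists = [np.linalg.norm(candidate - ex) for ex in task_examples]
    return -min(dists)  # higher score = closer to a known semantic grasp

# Usage with placeholder arrays: an identical grasp gets the best (zero) score.
example = grasp_features(np.zeros((10, 3)), np.zeros(8), np.zeros(4))
print(score_candidate(example, [example]))
```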
We propose a machine learning approach to the perception of a stable robotic grasp based on tactile feedback and hand kinematic data, which we call blind grasping. We first discuss a method for simulating tactile feedback using a soft finger contact model in GraspIt!, a robotic grasping simulator [10]. Using this simulation technique, we compute …
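The soft finger contact model augments Coulomb friction with a torsional component about the contact normal, commonly bounded by an elliptical limit surface. The sketch below checks a contact wrench against that constraint; the coefficient values are placeholder assumptions, and this is not GraspIt!'s internal implementation.

```python
def within_soft_finger_limit(f_n, f_t, tau_n, mu=0.5, e=0.01):
    """Check a contact wrench against an elliptical soft-finger limit surface.

    f_n   : normal force magnitude
    f_t   : tangential (frictional) force magnitude
    tau_n : torque about the contact normal (a point contact with friction
            cannot transmit this, but a soft finger can)
    mu    : Coulomb friction coefficient          (assumed value)
    e     : torsional friction coefficient, in m  (assumed value)
    """
    if f_n <= 0.0:
        return False
    return (f_t / (mu * f_n)) ** 2 + (tau_n / (e * f_n)) ** 2 <= 1.0

# Usage: a light tangential load with little twist stays within the limit.
print(within_soft_finger_limit(f_n=2.0, f_t=0.5, tau_n=0.005))
```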
We deal with the problem of blind grasping, where we use tactile feedback to predict the stability of a robotic grasp given no visual or geometric information about the object being grasped. We first simulated tactile feedback using a soft finger contact model in GraspIt! [1] and computed tactile contacts of thousands of grasps with a robotic hand using the …
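As a rough illustration of the learning step, the sketch below trains a standard SVM on synthetic stand-in features (tactile readings concatenated with joint angles) labeled by stability; the data shapes, the synthetic labels, and the choice of classifier are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder data standing in for simulated grasps: one row per grasp,
# tactile readings concatenated with joint angles; labels mark stability.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 40))             # 2000 grasps, 40 features (assumed sizes)
y = (X[:, :5].sum(axis=1) > 0).astype(int)  # synthetic stand-in stability labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```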
We deal with the problem of teaching a robot to manipulate everyday objects through human demonstration. We first design a task descriptor that encapsulates important elements of a task. The design originates from the observation that the manipulations involved in many everyday object tasks can be considered a series of sequential rotations and translations, …
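A minimal sketch of such a descriptor, assuming it is encoded as an ordered list of homogeneous transforms applied to the object frame; the pouring-like step list and its parameters are purely illustrative, not the paper's encoding.

```python
import numpy as np

def rot_x(theta):
    """Homogeneous transform: rotation by theta about the x axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[1:3, 1:3] = [[c, -s], [s, c]]
    return T

def trans(dx, dy, dz):
    """Homogeneous transform: pure translation."""
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

# Hypothetical descriptor for a pouring-like task: lift, tilt, tilt back.
task_descriptor = [trans(0.0, 0.0, 0.15), rot_x(np.radians(90)), rot_x(np.radians(-90))]

object_pose = np.eye(4)                  # object frame at the start
for step in task_descriptor:
    object_pose = object_pose @ step     # apply each step in sequence
print(object_pose[:3, 3])                # final object position after the task
```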
We address the problem of synthesizing human-readable computer programs for robotic object repositioning tasks based on human demonstrations. A stack-based domain-specific language (DSL) is introduced for object repositioning tasks, and a learning algorithm is proposed to synthesize a program in this DSL from the demonstrations. Once the synthesized …
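To illustrate the stack-based flavor of such a DSL, the sketch below interprets a tiny instruction set against a dictionary of object poses; the opcode names and the pose encoding are assumptions made for illustration, not the paper's actual language.

```python
def run(program, world):
    """Execute a list of (opcode, arg) pairs against a dict of object poses."""
    stack = []
    for op, arg in program:
        if op == "PUSH_OBJ":        # push an object identifier
            stack.append(arg)
        elif op == "PUSH_POSE":     # push a target pose (x, y, z)
            stack.append(arg)
        elif op == "MOVE":          # pop pose and object, reposition the object
            pose, obj = stack.pop(), stack.pop()
            world[obj] = pose
        else:
            raise ValueError(f"unknown opcode {op}")
    return world

# Usage: move the cup onto the saucer's location.
world = {"cup": (0.0, 0.0, 0.0), "saucer": (0.3, 0.1, 0.0)}
program = [("PUSH_OBJ", "cup"), ("PUSH_POSE", (0.3, 0.1, 0.02)), ("MOVE", None)]
print(run(program, world))
```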
In this paper we present a method for studying human selection of fingertip contact point locations during grasping and manipulation. Our aim is to perform a functional analysis, looking at how a particular choice of contact point distribution affects the subjects’ ability to resist external forces applied to the grasped object. We rely on grasp quality …
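A common way to quantify the ability to resist external forces is an epsilon-style quality measure: the radius of the largest ball of disturbance forces the contacts can counter. The sketch below computes a simplified, force-space-only version (a full analysis would work in 6-D wrench space, including torques); the contact normals and friction coefficient are placeholder values, not this paper's specific metric.

```python
import numpy as np
from scipy.spatial import ConvexHull

def friction_cone_edges(normal, mu=0.5, k=8):
    """Approximate a contact's friction cone by k unit force directions."""
    n = normal / np.linalg.norm(normal)
    t1 = np.cross(n, [1.0, 0.0, 0.0])          # build two tangents orthogonal to n
    if np.linalg.norm(t1) < 1e-6:
        t1 = np.cross(n, [0.0, 1.0, 0.0])
    t1 /= np.linalg.norm(t1)
    t2 = np.cross(n, t1)
    angles = np.linspace(0, 2 * np.pi, k, endpoint=False)
    edges = [n + mu * (np.cos(a) * t1 + np.sin(a) * t2) for a in angles]
    return [e / np.linalg.norm(e) for e in edges]

def force_space_epsilon(contact_normals, mu=0.5):
    """Radius of the largest force ball the grasp can resist in any direction."""
    forces = np.array([e for n in contact_normals for e in friction_cone_edges(n, mu)])
    hull = ConvexHull(forces)
    offsets = hull.equations[:, -1]     # facet planes: normal . x + offset <= 0
    if np.any(offsets > 0):             # origin outside the hull: not force closure
        return 0.0
    return float(np.min(-offsets))      # distance from origin to the nearest facet

# Usage: three fingertip normals roughly opposing each other around an object.
normals = [np.array([1.0, 0.0, 0.0]), np.array([-0.5, 0.9, 0.0]), np.array([-0.5, -0.9, 0.0])]
print(force_space_epsilon(normals))
```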
We propose an experience-based approach to the problem of blind grasping: stable robotic grasping using tactile sensing and hand kinematic feedback. We first collect a set of stable grasps to build a tactile experience database, which contains tactile contacts for each stable grasp. Using the tactile experience database, we propose an algorithm to synthesize …
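A minimal sketch of querying such a tactile experience database with a nearest-neighbor lookup; the array shapes, the taxel count, and the "adjustment" encoding are assumptions made for illustration, not the paper's algorithm.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Placeholder tactile experience database: one row of tactile readings per
# stored stable grasp, plus a hand adjustment recorded for that grasp.
rng = np.random.default_rng(1)
stable_tactile = rng.normal(size=(500, 24))      # 500 stable grasps, 24 taxels (assumed)
stored_adjustments = rng.normal(size=(500, 6))   # per-grasp hand pose adjustment (assumed)

index = NearestNeighbors(n_neighbors=1).fit(stable_tactile)

def suggest_adjustment(current_tactile):
    """Look up the closest stored tactile pattern and return its adjustment."""
    _, idx = index.kneighbors(current_tactile.reshape(1, -1))
    return stored_adjustments[idx[0, 0]]

print(suggest_adjustment(rng.normal(size=24)))
```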