Collecting grasp data for learning and benchmarking purposes is very expensive. It would be helpful to have a standard database of graspable objects, along with a set of stable grasps for each object, but no such database exists. In this work we show how to automate the construction of a database consisting of several hands, thousands of objects, and …
Recent studies show contradictory results regarding the contribution of endocannabinoids to fear memory formation and long-term synaptic plasticity. In this study, we investigated the effects of both the cannabinoid receptor type 1 (CB1 receptor) antagonist AM281 and the anandamide reuptake inhibitor AM404 on the formation of contextual fear memory in adult mice.
To grasp a novel object, we can index it into a database of known 3D models and use precomputed grasp data for those models to suggest a new grasp. We refer to this idea as data-driven grasping, and we have previously introduced the Columbia Grasp Database for this purpose. In this paper we demonstrate a data-driven grasp planner that requires only partial …
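A minimal sketch of the data-driven grasping idea described in this abstract: match a query object against a database of known models and reuse their stored grasps. The descriptor, class, and grasp names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class GraspDatabase:
    def __init__(self):
        # model_id -> (shape descriptor, list of precomputed grasps)
        self.entries = {}

    def add_model(self, model_id, descriptor, grasps):
        self.entries[model_id] = (np.asarray(descriptor, dtype=float), grasps)

    def suggest_grasps(self, query_descriptor, k=3):
        """Return grasps from the k database models closest to the query."""
        query = np.asarray(query_descriptor, dtype=float)
        ranked = sorted(
            self.entries.items(),
            key=lambda item: np.linalg.norm(item[1][0] - query),
        )
        suggestions = []
        for model_id, (_, grasps) in ranked[:k]:
            suggestions.extend((model_id, g) for g in grasps)
        return suggestions

db = GraspDatabase()
db.add_model("mug_01", [0.2, 0.9, 0.1], ["handle_wrap", "rim_pinch"])
db.add_model("bottle_03", [0.8, 0.3, 0.5], ["power_wrap"])
print(db.suggest_grasps([0.25, 0.85, 0.15], k=1))  # -> mug grasps
```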
We deal with the problem of teaching a robot to manipulate everyday objects through human demonstration. We first design a task descriptor which encapsulates the important elements of a task. The design originates from the observation that the manipulations involved in many everyday object tasks can be considered a series of sequential rotations and translations, …
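A hedged sketch of the "series of sequential rotations and translations" view of a task: each step is a 4x4 homogeneous transform, and the net object motion is their ordered product. The particular steps below are invented for illustration.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation by theta about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    """Homogeneous transform: pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Example descriptor: lift the object, rotate it, then set it down.
task_steps = [trans(0, 0, 0.10), rot_z(np.pi / 2), trans(0, 0, -0.10)]

# Compose the steps in order to get the net object motion.
pose = np.eye(4)
for step in task_steps:
    pose = pose @ step
print(np.round(pose, 3))  # net 90-degree rotation, zero net translation
```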
We propose a machine learning approach to the perception of a stable robotic grasp based on tactile feedback and hand kinematic data, which we call blind grasping. We first discuss a method for simulating tactile feedback using a soft finger contact model in GraspIt!, a robotic grasping simulator [10]. Using this simulation technique, we compute …
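For intuition on the soft finger contact model mentioned here: unlike a point contact with friction, a soft finger can also resist torque about the contact normal. A common elliptic form of that constraint is sketched below; the parameter names and values are assumptions, not GraspIt!'s API.

```python
import math

def soft_finger_ok(f_normal, f_tangential, tau_normal, mu=0.6, e=0.01):
    """Return True if the contact wrench lies inside the soft-finger
    friction limit: sqrt(f_t^2 + (tau_n / e)^2) <= mu * f_n."""
    if f_normal <= 0:
        return False  # the contact must press into the surface
    return math.hypot(f_tangential, tau_normal / e) <= mu * f_normal

print(soft_finger_ok(f_normal=5.0, f_tangential=2.0, tau_normal=0.005))
```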
We design an example-based planning framework to generate semantic grasps: stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode semantic constraints, i.e., task-related constraints. We introduce a semantic affordance …
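One way to picture the proxies this abstract lists is as fields of a stored grasp example that an example-based planner can match against the current scene. The structure and field names below are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class SemanticGraspExample:
    task: str               # e.g. "pour" or "handoff"
    geometry_patch: list    # partial object geometry near the grasp
    tactile_contacts: list  # (sensor index, force) pairs
    hand_joints: list       # hand kinematic configuration

examples = [
    SemanticGraspExample(
        task="pour",
        geometry_patch=[0.1, 0.4, 0.2],
        tactile_contacts=[(0, 1.2), (3, 0.8)],
        hand_joints=[0.3, 0.9, 0.9, 0.1],
    ),
]
```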
We deal with the problem of blind grasping, where we use tactile feedback to predict the stability of a robotic grasp given no visual or geometric information about the object being grasped. We first simulated tactile feedback using a soft finger contact model in GraspIt! [1] and computed tactile contacts for thousands of grasps with a robotic hand using the …
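A minimal sketch of the stability prediction described in these blind-grasping abstracts: train a classifier on tactile and kinematic features labeled stable/unstable, then predict stability for a new grasp from those features alone. It assumes scikit-learn is available, and the features and labels are synthetic placeholders, not the paper's dataset.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Each row: flattened tactile sensor readings plus hand joint angles.
X_train = rng.random((200, 24))
# Toy labels standing in for simulator-derived stability outcomes.
y_train = (X_train[:, :8].sum(axis=1) > 4.0).astype(int)

clf = SVC(kernel="rbf")
clf.fit(X_train, y_train)

new_grasp_features = rng.random((1, 24))
print("stable" if clf.predict(new_grasp_features)[0] else "unstable")
```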
We address the problem of synthesizing human-readable computer programs for robotic object repositioning tasks based on human demonstrations. A stack-based domain-specific language (DSL) is introduced for object repositioning tasks, and a learning algorithm is proposed to synthesize a program in this DSL from human demonstrations. Once the synthesized …
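To make the stack-based DSL idea concrete, here is a tiny interpreter for an invented instruction set over a symbolic world state; the opcodes and program are illustrative assumptions, not the paper's language.

```python
def run(program, world):
    """Interpret (op, arg) instructions against a dict of
    object -> location, using an operand stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH_OBJ":    # push an object name
            stack.append(arg)
        elif op == "PUSH_LOC":  # push a target location
            stack.append(arg)
        elif op == "MOVE":      # pop location, then object; reposition
            loc = stack.pop()
            obj = stack.pop()
            world[obj] = loc
    return world

world = {"cup": "table", "plate": "counter"}
program = [("PUSH_OBJ", "cup"), ("PUSH_LOC", "shelf"), ("MOVE", None)]
print(run(program, world))  # {'cup': 'shelf', 'plate': 'counter'}
```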
Humanoid robots are attempting ever more complex tasks in lieu of humans. Disaster response is a promising area for the use of humanoids due to safety concerns. However, controlling a high-DOF humanoid robot to autonomously perform a complex task in unknown and unstructured environments is challenging. In this paper we describe a simulation framework for …