• Publications
An In Depth View of Saliency
Presented at the 24th British Machine Vision Conference (BMVC 2013), 9-13 September 2013, Bristol, UK.
Movie genre classification via scene categorization
TLDR: This paper presents a method for genre categorization of movie trailers based on scene categorization, and demonstrates that exploiting scene structure improves film genre classification compared to using only low-level visual features.
Affordance Prediction via Learned Object Attributes
We present a novel method for learning and predicting the affordances of an object based on its physical and visual attributes. Affordance prediction is a key task in autonomous robot learning…
Planning Multi-Fingered Grasps as Probabilistic Inference in a Learned Deep Network
TLDR: This work is the first to directly plan high-quality multi-fingered grasps in configuration space using a deep neural network, without the need for an external planner, and shows that the planning method outperforms existing planning methods based on neural networks.
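As a rough illustration of the configuration-space planning idea, the sketch below performs gradient ascent on the grasp-success probability predicted by a small learned network; the architecture, dimensions, and placeholder object encoding are assumptions for illustration, not the paper's model.

```python
# Minimal sketch: plan a grasp by maximizing the success probability predicted
# by a learned network, directly over the grasp configuration.
import torch
import torch.nn as nn

class GraspSuccessNet(nn.Module):
    """Predicts P(success) from a grasp configuration and an object encoding."""
    def __init__(self, config_dim=20, object_dim=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(config_dim + object_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, config, obj_code):
        return torch.sigmoid(self.mlp(torch.cat([config, obj_code], dim=-1)))

net = GraspSuccessNet()            # assume weights were trained on grasp trials
obj_code = torch.randn(1, 64)      # placeholder object encoding (e.g. from a shape encoder)

# Start from an initial configuration (e.g. a heuristic pregrasp) and refine it.
config = torch.zeros(1, 20, requires_grad=True)
optimizer = torch.optim.Adam([config], lr=0.01)

for _ in range(200):
    optimizer.zero_grad()
    # Maximize predicted success = minimize negative log-probability.
    loss = -torch.log(net(config, obj_code) + 1e-8).mean()
    loss.backward()
    optimizer.step()

planned_grasp = config.detach()    # hand joint angles + wrist pose for execution
```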
Decoupling behavior, perception, and control for autonomous learning of affordances
TLDR: This work demonstrates the approach using a PR2 robot that explores different combinations of controller, behavior primitive, and proxy to perform push or pull positioning behaviors on a selection of household objects, learning which methods work best for each object.
Stabilizing novel objects by learning to predict tactile slip
TLDR: This work explores the generalization capabilities of well-known supervised learning methods, using random forest classifiers to create generalizable slip predictors in the feedback loop of an object stabilization controller, and shows that the controller can successfully stabilize previously unknown objects by predicting and counteracting slip events.
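To make the feedback-loop idea concrete, here is a minimal sketch assuming a random-forest slip classifier driving a simple grip-force adjustment; the feature layout, synthetic training labels, and force increments are illustrative, not the paper's implementation.

```python
# Sketch: slip prediction with a random forest inside a grip-force feedback loop.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Training data: windows of tactile features (e.g. per-taxel derivatives,
# vibration energy) labeled slip / no-slip from prior exploration trials.
X_train = rng.normal(size=(1000, 12))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 3] > 1.0).astype(int)  # synthetic labels

slip_clf = RandomForestClassifier(n_estimators=100, random_state=0)
slip_clf.fit(X_train, y_train)

def stabilization_step(tactile_features, grip_force, force_increment=0.5, max_force=20.0):
    """Increase grip force whenever the classifier predicts an imminent slip."""
    p_slip = slip_clf.predict_proba(tactile_features.reshape(1, -1))[0, 1]
    if p_slip > 0.5:
        grip_force = min(grip_force + force_increment, max_force)
    return grip_force, p_slip

# One control-loop tick with a simulated feature window:
force, p = stabilization_step(rng.normal(size=12), grip_force=5.0)
print(f"grip force: {force:.1f} N, predicted slip probability: {p:.2f}")
```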
Modeling Grasp Type Improves Learning-Based Grasp Planning
TLDR: This paper proposes a probabilistic grasp planner that explicitly models grasp type for planning high-quality precision and power grasps in real time, and shows the benefit of learning a prior over grasp configurations to improve grasp inference with a learned classifier.
Learning robot in-hand manipulation with tactile features
TLDR: This approach successfully acquires a tactile manipulation skill using a passively compliant hand, and the learned tactile skill is shown to generalize to novel objects.
Evaluation of tactile feature extraction for interactive object recognition
TLDR: It is shown that by combining simple statistical features captured from five robot motions, the robot can reliably differentiate between a diverse set of 49 objects with an average classification accuracy of 97.6 ± 2.12%.
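A minimal sketch of the general recipe (simple per-motion statistics concatenated into one descriptor, then a standard classifier); the signals, dimensions, and the k-nearest-neighbour classifier are assumptions rather than the paper's exact pipeline.

```python
# Sketch: per-motion summary statistics as tactile features for object recognition.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def motion_features(signal):
    """Summary statistics for one exploratory motion (T x channels array)."""
    return np.concatenate([
        signal.mean(axis=0),
        signal.std(axis=0),
        signal.min(axis=0),
        signal.max(axis=0),
    ])

def object_descriptor(motions):
    """Concatenate features from all exploratory motions (e.g. squeeze, slide)."""
    return np.concatenate([motion_features(m) for m in motions])

# Fake dataset: 5 objects x 10 trials, each trial = 5 motions of 100 x 8 samples.
rng = np.random.default_rng(1)
X, y = [], []
for obj_id in range(5):
    for _ in range(10):
        motions = [rng.normal(loc=obj_id, size=(100, 8)) for _ in range(5)]
        X.append(object_descriptor(motions))
        y.append(obj_id)

clf = KNeighborsClassifier(n_neighbors=3).fit(np.array(X), np.array(y))
```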
Active tactile object exploration with Gaussian processes
TLDR: This paper presents an active touch strategy that efficiently reduces surface geometry uncertainty by leveraging a probabilistic representation of the object surface based on a Gaussian process.
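A minimal sketch of the active-touch loop, assuming a Gaussian process over surface height whose posterior variance selects the next contact point; the kernel, candidate grid, and stand-in surface function are illustrative choices, not the paper's setup.

```python
# Sketch: GP surface model with next touch chosen at maximum posterior variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def true_surface(points):
    """Stand-in for the unknown object surface height z = f(x, y)."""
    return np.sin(points[:, 0]) * np.cos(points[:, 1])

rng = np.random.default_rng(2)

# A few initial contact points measured by the robot.
contacts = rng.uniform(-1.0, 1.0, size=(5, 2))
heights = true_surface(contacts)

gp = GaussianProcessRegressor(kernel=RBF(0.5) + WhiteKernel(1e-3), normalize_y=True)

# Candidate touch locations on the (x, y) workspace.
grid = np.stack(np.meshgrid(np.linspace(-1, 1, 30), np.linspace(-1, 1, 30)), -1).reshape(-1, 2)

for _ in range(10):                       # active exploration loop
    gp.fit(contacts, heights)
    _, std = gp.predict(grid, return_std=True)
    next_point = grid[np.argmax(std)]     # most uncertain location
    contacts = np.vstack([contacts, next_point])
    heights = np.append(heights, true_surface(next_point[None]))
```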