What does the person feel? Learning to infer applied forces during robot-assisted dressing

Zackory M. Erickson, Alexander Clegg, Wenhao Yu, Greg Turk, C. Karen Liu, Charles C. Kemp. 2017 IEEE International Conference on Robotics and Automation (ICRA).
During robot-assisted dressing, a robot manipulates a garment in contact with a person's body. Inferring the forces applied to the person's body by the garment might enable a robot to provide more effective assistance and give the robot insight into what the person feels. However, complex mechanics govern the relationship between the robot's end effector and these forces. Using a physics-based simulation and data-driven methods, we demonstrate the feasibility of inferring forces across a person… 

Deep Haptic Model Predictive Control for Robot-Assisted Dressing

A deep recurrent model is presented that, given a proposed robot action, predicts the forces a garment will apply to a person's body; using this model with model predictive control, a robot is shown to provide better dressing assistance.
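The control loop this summary describes can be sketched as sampling-based model predictive control. The `predict_force` function below is a trivial stand-in for the paper's deep recurrent force model, and the cost terms (a force penalty plus a hypothetical progress term along the first action axis) are illustrative assumptions, not the paper's objective:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_force(state, action):
    """Stand-in for the learned recurrent force model: maps a robot
    state and a candidate action to a predicted garment force on the
    body (a single scalar here for simplicity)."""
    return float(np.abs(state + action).sum())

def mpc_step(state, n_candidates=32, force_weight=1.0):
    """One step of sampling-based MPC: sample candidate actions,
    predict the resulting garment force with the learned model, and
    pick the action with the lowest predicted-force cost while still
    rewarding progress along the dressing direction (axis 0)."""
    candidates = rng.uniform(-1.0, 1.0, size=(n_candidates, state.shape[0]))
    costs = [force_weight * predict_force(state, a) - a[0]
             for a in candidates]
    return candidates[int(np.argmin(costs))]

state = np.zeros(3)          # hypothetical end-effector state
action = mpc_step(state)     # best candidate under the predicted-force cost
```

In a receding-horizon loop the robot would execute only this chosen action, re-observe, and repeat.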

Learning Human Behaviors for Robot-Assisted Dressing

This work investigates robotic dressing assistants that can anticipate the motion of the person being helped, and uses reinforcement learning to create a model of human behavior capable of guiding the arm into a sleeve.

Learning garment manipulation policies toward robot-assisted dressing

A dressing pipeline intended for people who have completely lost upper-limb movement functionality is reported; it enables a dual-arm robot to put back-opening hospital gowns onto a medical manikin with a success rate of more than 90%.

Tracking Human Pose During Robot-Assisted Dressing Using Single-Axis Capacitive Proximity Sensing

Using this method, a robot can adjust for errors in the estimated pose of a person and physically follow the contours and movements of the person while providing dressing assistance. It is also shown that a capacitive sensor is unaffected by visual occlusion of the body and can sense a person's body through cotton clothing.

Learning Grasping Points for Garment Manipulation in Robot-Assisted Dressing

  • Fan Zhang, Y. Demiris
  • Computer Science
    2020 IEEE International Conference on Robotics and Automation (ICRA)
  • 2020
This paper introduces a robot-assisted dressing system that combines the grasping point prediction method with a grasping and manipulation strategy accounting for grasping orientation computation and robot-garment collision avoidance, and is capable of yielding accurate grasping point estimations.

Learning to Collaborate From Simulation for Robot-Assisted Dressing

This work investigated the application of haptic feedback control and deep reinforcement learning to robot-assisted dressing. It found that training policies for specific impairments dramatically improved performance, and that controller execution speed could be scaled down after training to reduce the robot's speed without steep reductions in performance.

“Elbows Out”—Predictive Tracking of Partially Occluded Pose for Robot-Assisted Dressing

Recurrent neural network models were built to predict the elbow position of a single arm from other features of the user pose; the models were evaluated on Kinect data for a robot dressing task, demonstrating potential for this application.
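The predictor described here can be sketched as a small recurrent regressor. The tiny vanilla RNN below is a stand-in for the paper's trained models; the feature layout (shoulder and wrist positions as six inputs), the hidden size, and the random weights are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

class TinyRNN:
    """Minimal vanilla RNN regressor: consumes a sequence of pose
    features and emits a 3-D elbow-position estimate at each step."""
    def __init__(self, n_in, n_hidden, n_out=3):
        s = 0.1
        self.Wx = rng.normal(0, s, (n_hidden, n_in))   # input weights
        self.Wh = rng.normal(0, s, (n_hidden, n_hidden))  # recurrent weights
        self.Wo = rng.normal(0, s, (n_out, n_hidden))  # output weights
        self.h = np.zeros(n_hidden)                    # hidden state

    def step(self, x):
        self.h = np.tanh(self.Wx @ x + self.Wh @ self.h)
        return self.Wo @ self.h

# Hypothetical per-frame pose features: shoulder + wrist positions (6 values).
model = TinyRNN(n_in=6, n_hidden=16)
sequence = rng.normal(size=(10, 6))
elbow_estimates = np.array([model.step(x) for x in sequence])  # (10, 3)
```

A trained model would replace the random weights; the recurrence is what lets the estimate track the occluded elbow through time.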

Probabilistic Real-Time User Posture Tracking for Personalized Robot-Assisted Dressing

A probabilistic tracking method using Bayesian networks in latent spaces fuses robot end-effector positions and force information to enable cameraless, real-time estimation of user posture during dressing, allowing the Baxter robot to provide personalized assistance in putting on a sleeveless jacket for users with upper-body impairments.

Characterizing Multidimensional Capacitive Servoing for Physical Human-Robot Interaction

The results indicate that multidimensional capacitive servoing enables a robot’s end effector to move proximally or distally along human limbs while adapting to human pose, and generalizes well across people with different body sizes.
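A single-axis version of capacitive servoing can be sketched as an inverse sensor model plus a proportional controller. The capacitance model (c ≈ a/d + b), its parameters `a` and `b`, and the gain below are all hypothetical; the paper's method servos along multiple dimensions rather than this one axis:

```python
def capacitance_to_distance(c, a=1.0, b=0.05):
    """Hypothetical inverse sensor model: capacitance rises as the
    electrode nears the body, roughly c = a / d + b, so d = a / (c - b)."""
    return a / (c - b)

def servo_step(position, c_measured, d_target=0.05, gain=0.5):
    """One proportional servoing update along the sensing axis: move
    toward holding an estimated distance d_target above the limb."""
    d_est = capacitance_to_distance(c_measured)
    return position - gain * (d_est - d_target)

pos = 0.08                       # end-effector height above the limb (m)
c = 1.0 / 0.08 + 0.05            # simulated capacitance reading at 8 cm
new_pos = servo_step(pos, c)     # moves toward the 5 cm target distance
```

Because the sensor reads through clothing and is unaffected by visual occlusion, this loop can run where a camera-based tracker could not.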

Personalized Robot Assistant for Support in Dressing

A multimodal robotic system is described for a specific dressing scenario, putting on a shoe, in which users’ personalized inputs contribute to a much improved task success rate, fewer user commands, and reduced workload.

Data-driven haptic perception for robot-assisted dressing

Evidence is provided that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing; hidden Markov models using only forces measured at the robot's end effector classified dressing outcomes with high accuracy.
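The classification scheme, one HMM per outcome scored by sequence likelihood, can be sketched with a log-domain forward algorithm. The two-state models and every parameter below are illustrative assumptions, not the paper's learned models:

```python
import numpy as np

def log_likelihood(obs, pi, A, means, var):
    """Forward algorithm in the log domain for a Gaussian-emission HMM:
    returns log p(obs | model)."""
    def log_emit(x):
        return -0.5 * ((x - means) ** 2 / var + np.log(2 * np.pi * var))
    log_alpha = np.log(pi) + log_emit(obs[0])
    for x in obs[1:]:
        m = log_alpha.max()  # logsumexp over previous states
        log_alpha = m + np.log(np.exp(log_alpha - m) @ A) + log_emit(x)
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

# Hypothetical 2-state models for two outcomes: "success" (forces stay
# low) vs. "caught" (forces climb as the garment snags).
pi = np.array([0.9, 0.1])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
success = dict(pi=pi, A=A, means=np.array([0.5, 1.0]), var=np.array([0.2, 0.2]))
caught = dict(pi=pi, A=A, means=np.array([1.0, 4.0]), var=np.array([0.2, 0.5]))

forces = np.array([0.4, 0.6, 0.5, 0.7, 0.5])  # end-effector force magnitudes
scores = {name: log_likelihood(forces, **m)
          for name, m in [("success", success), ("caught", caught)]}
outcome = max(scores, key=scores.get)          # most likely outcome model
```

In practice each outcome's HMM would be trained (e.g. via Baum-Welch) on force sequences from trials with that outcome.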

User modelling for personalised dressing assistance by humanoid robots

This paper proposes a method based on real-time upper-body pose and user models to plan robot motions for assistive dressing; each part of the approach is validated and the whole system is tested, allowing a Baxter humanoid robot to assist a human in putting on a sleeveless jacket.

Haptic simulation for robot-assisted dressing

This work presents a system that learns a haptic classifier for the outcome of the task given few (2–3) real-world trials with one person, and shows that the classifiers from this system can categorize the dressing task outcomes more accurately than classifiers trained on ten times more real data.

Reinforcement learning of clothing assistance with a dual-arm robot

This study proposes reinforcement learning in which the cloth's state is represented low-dimensionally in topology coordinates and the reward is defined in those low-dimensional coordinates, to overcome the difficulties of clothing assistance.

Personalized Assistance for Dressing Users

Given a dressing task, the approach finds a solution involving manipulator motions and also user repositioning requests that allows the robot and user to take turns moving in the same space and is cognizant of the user’s limitations.

Bottom dressing by a life-sized humanoid robot provided failure detection and recovery functions

This paper describes dressing assistance by an autonomous robot. We especially focus on a dressing action that is particularly problematic for disabled people: the pulling of a bottom along the legs.

Interaction skills for a coat-check robot: Identifying and handling the boundary components of clothes

  • Lukas Twardon, H. Ritter
  • Computer Science
    2015 IEEE International Conference on Robotics and Automation (ICRA)
  • 2015
A novel graph-based approach to detecting boundary components by extracting closed contours from depth images is presented, along with a planner that minimizes a heuristic energy function to find an optimal grasp pose of a robot hand around the garment boundary.

Iterative path optimisation for personalised dressing assistance using vision and force information

  • Yixing Gao, H. Chang, Y. Demiris
  • Computer Science, Business
    2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
  • 2016
A new stochastic path optimisation method based on adaptive moment estimation is proposed that achieves the smallest error with fewer iterations and less computation time; it is evaluated by enabling the Baxter robot to assist real human users with dressing.
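Adaptive moment estimation (Adam) applied to waypoint optimisation can be sketched as below. The path cost and the deterministic finite-difference gradient are placeholders for the paper's vision- and force-based stochastic objective:

```python
import numpy as np

def cost(path, target):
    """Hypothetical dressing-path cost: reach the target hand position
    while keeping the waypoint path short and smooth."""
    segs = np.diff(path, axis=0)
    return np.sum((path[-1] - target) ** 2) + 0.1 * np.sum(segs ** 2)

def grad(f, x, eps=1e-5):
    """Central finite-difference gradient (the paper estimates a
    stochastic gradient; this sketch uses a deterministic one)."""
    g = np.zeros_like(x)
    flat, gf = x.ravel(), g.ravel()
    for i in range(flat.size):
        old = flat[i]
        flat[i] = old + eps; fp = f(x)
        flat[i] = old - eps; fm = f(x)
        flat[i] = old
        gf[i] = (fp - fm) / (2 * eps)
    return g

def adam(f, x, steps=200, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected first/second moment estimates scale each step."""
    m, v = np.zeros_like(x), np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(f, x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        mhat, vhat = m / (1 - b1 ** t), v / (1 - b2 ** t)
        x = x - lr * mhat / (np.sqrt(vhat) + eps)
    return x

target = np.array([0.5, 0.2, 0.3])        # desired end-effector position
path = np.zeros((5, 3))                   # 5 waypoints, initialised at origin
opt = adam(lambda p: cost(p, target), path.copy())
```

The per-coordinate step normalisation is what lets Adam reach a low-error path in fewer iterations than a fixed-step gradient method here.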

Exploring the effects of dimensionality reduction in deep networks for force estimation in robotic-assisted surgery

Positive effects of dimensionality reduction in deep networks are demonstrated, including faster training, improved network performance, and prevention of overfitting.
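The preprocessing step this entry refers to can be sketched with PCA computed via SVD. The input dimensionality and the choice of 8 components below are arbitrary stand-ins for the surgical force-estimation features the paper studies:

```python
import numpy as np

rng = np.random.default_rng(2)

def pca_fit(X, k):
    """PCA via SVD: returns the data mean and the top-k principal
    directions (rows of Vt are orthonormal)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_transform(X, mu, components):
    """Project centered data onto the principal directions."""
    return (X - mu) @ components.T

# Hypothetical high-dimensional sensor features, compressed before
# feeding a force-estimation network (so its first layer is smaller).
X = rng.normal(size=(500, 64))
mu, comps = pca_fit(X, k=8)
Z = pca_transform(X, mu, comps)   # 64-D inputs reduced to 8-D
```

The reduced inputs shrink the network's first layer, which is one route to the faster training and reduced overfitting the paper reports.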

Estimation of Human Cloth Topological Relationship using Depth Sensor for Robotic Clothing Assistance

It is demonstrated that the depth sensor can provide reliable estimates of topology coordinates and can replace the complex and expensive setup of a motion capture system.