Hand gesture guided robot-assisted surgery based on a direct augmented reality interface

Abstract

Radiofrequency (RF) ablation is a promising alternative to hepatic resection for the treatment of liver tumors. However, accurate needle insertion requires precise hand-eye coordination and is further complicated by the difficulty of navigating the RF needle. This paper proposes a cooperative surgical robot system, guided by hand gestures and supported by an augmented reality (AR)-based surgical field, for robot-assisted percutaneous treatment. It establishes a robot-assisted natural AR guidance mechanism that combines the advantages of three elements: AR visual guidance information, the surgeon's experience, and the accuracy of robotic surgery. A projector-based AR environment is overlaid directly on the patient to display preoperative and intraoperative information, while a mobile surgical robot system executes the specified RF needle insertion plans. Natural hand gestures serve as an intuitive and robust method for interacting with both the AR system and the surgical robot. The proposed system was evaluated on a mannequin model. Experimental results demonstrated that hand gesture guidance effectively steered the surgical robot, and the robot-assisted implementation improved the accuracy of needle insertion. This human-robot cooperative mechanism is a promising approach for precise transcutaneous ablation therapy.
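The abstract describes a loop in which recognized hand gestures drive surgical-robot commands. A minimal sketch of such a gesture-to-command dispatcher is shown below; all names (gesture labels, command functions, `RobotState`) are illustrative assumptions, not the authors' actual interface.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class RobotState:
    """Toy stand-in for the surgical robot's needle state (hypothetical)."""
    needle_depth_mm: float = 0.0
    log: List[str] = field(default_factory=list)


def advance(state: RobotState, step_mm: float = 1.0) -> None:
    state.needle_depth_mm += step_mm
    state.log.append(f"advance to {state.needle_depth_mm:.1f} mm")


def retract(state: RobotState, step_mm: float = 1.0) -> None:
    state.needle_depth_mm = max(0.0, state.needle_depth_mm - step_mm)
    state.log.append(f"retract to {state.needle_depth_mm:.1f} mm")


def hold(state: RobotState) -> None:
    state.log.append("hold position")


# Gesture labels as a hand-gesture recognizer might emit them (assumed names).
GESTURE_COMMANDS: Dict[str, Callable[[RobotState], None]] = {
    "point_forward": advance,
    "open_palm": hold,
    "fist": retract,
}


def dispatch(state: RobotState, gesture: str) -> None:
    """Map one recognized gesture to a robot command; ignore unknown gestures."""
    action = GESTURE_COMMANDS.get(gesture)
    if action is not None:
        action(state)


state = RobotState()
for g in ["point_forward", "point_forward", "open_palm", "fist"]:
    dispatch(state, g)
print(state.needle_depth_mm)  # 1.0: two advances, one hold, one retract
```

In the actual system, `dispatch` would sit between the gesture-recognition front end and the robot controller, with the AR overlay updated from the same state; the table-driven mapping makes it easy to add or remap gestures without touching the control code.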

DOI: 10.1016/j.cmpb.2013.12.018

13 Figures and Tables


Cite this paper

@article{Wen2014HandGG,
  title={Hand gesture guided robot-assisted surgery based on a direct augmented reality interface},
  author={Rong Wen and Wei-Liang Tay and Binh P. Nguyen and Chin-Boon Chng and Chee-Kong Chui},
  journal={Computer methods and programs in biomedicine},
  year={2014},
  volume={116},
  number={2},
  pages={68-80}
}