M/ORIS: a medical/operating room interaction system

@inproceedings{Grange2004MORISAM,
  title={M/ORIS: a medical/operating room interaction system},
  author={S{\'e}bastien Grange and Terrence Fong and Charles Baur},
  booktitle={ICMI '04},
  year={2004}
}
We propose an architecture for a real-time multimodal system that provides non-contact, adaptive user interfacing for Computer-Assisted Surgery (CAS). The system, called M/ORIS (for Medical/Operating Room Interaction System), combines gesture interpretation as an explicit interaction modality with continuous, real-time monitoring of surgical activity in order to automatically address the surgeon's needs. Such a system will help reduce a surgeon's workload and operation time. This paper…
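
The abstract describes the architecture only at a high level. Purely as an illustration of the idea of fusing explicit gesture commands with continuously monitored surgical activity, here is a minimal Python sketch; all class, function, and command names are hypothetical and are not taken from M/ORIS.

# Minimal sketch of an M/ORIS-style interaction loop (hypothetical names,
# not the authors' implementation): explicit gesture commands are fused
# with a continuously updated estimate of the surgical activity.

from dataclasses import dataclass
from typing import Optional


@dataclass
class GestureEvent:
    name: str          # e.g. "swipe_left", "point", "grab"
    confidence: float  # classifier confidence in [0, 1]


class ActivityMonitor:
    """Tracks the current surgical phase from continuous observations."""

    def __init__(self) -> None:
        self.phase = "idle"

    def update(self, tool_in_use: Optional[str]) -> str:
        # Toy heuristic: the phase follows the instrument currently in use.
        self.phase = {"scalpel": "incision", "endoscope": "navigation"}.get(
            tool_in_use, "idle"
        )
        return self.phase


def dispatch(gesture: GestureEvent, phase: str) -> Optional[str]:
    """Map a gesture to an action, adapted to the estimated activity."""
    if gesture.confidence < 0.8:
        return None  # ignore uncertain gestures to avoid accidental commands
    context_actions = {
        "navigation": {"swipe_left": "previous_image", "swipe_right": "next_image"},
        "incision": {"point": "zoom_to_region"},
    }
    return context_actions.get(phase, {}).get(gesture.name)


if __name__ == "__main__":
    monitor = ActivityMonitor()
    phase = monitor.update(tool_in_use="endoscope")
    action = dispatch(GestureEvent("swipe_left", 0.93), phase)
    print(phase, action)  # navigation previous_image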

Citations

Gestures for Picture Archiving and Communication Systems (PACS) operation in the operating room: Is there any standard?

TLDR
The level of agreement among surgeons on the best gestures for PACS operation is higher than the previously reported metric and indicates a majority preference; this approach is better than choosing gestures through authoritarian or arbitrary approaches.

GazeTap: towards hands-free interaction in the operating room

TLDR
This paper proposes two interaction techniques that use gaze and foot input for hands-free interaction, which may enable more effective computer interaction in the operating room and thus a more beneficial use of medical information.
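
As a rough, hypothetical illustration of combining the two modalities (gaze to select a target, a foot tap to confirm it), not the GazeTap implementation itself:

# Illustrative sketch (not the GazeTap implementation): gaze selects a
# target on screen, a foot tap confirms it, so the hands stay sterile.

TARGETS = {
    "next_slice": (200, 100, 300, 160),   # x0, y0, x1, y1 in screen pixels
    "prev_slice": (350, 100, 450, 160),
}


def target_under_gaze(gaze_xy):
    gx, gy = gaze_xy
    for name, (x0, y0, x1, y1) in TARGETS.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return name
    return None


def on_foot_tap(gaze_xy):
    """Trigger the action currently fixated by gaze when the foot taps."""
    target = target_under_gaze(gaze_xy)
    if target is not None:
        print(f"activating {target}")


on_foot_tap((230, 120))  # -> activating next_slice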

Virtual Reality for User-Centered Design and Evaluation of Touch-free Interaction Techniques for Navigating Medical Images in the Operating Room

TLDR
An interactive virtual operating room (IVOR) is presented as a tool for developing and studying interaction methods for the OR, together with two novel touch-free interaction techniques using hand and foot gestures.

A Method of 3D Hand Movement Recognition by a Leap Motion Sensor for Controlling Medical Image in an Operating Room

TLDR
A method to control medical images in an operating room using only one hand, offering a total of ten commands: six commands on the program screen, plus commands such as increasing and decreasing brightness.
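
The command set is only hinted at above; as an illustrative sketch of turning tracked 3D palm displacement into discrete commands such as brightness up/down (generic coordinates rather than the Leap Motion SDK, and the command names are hypothetical):

# Sketch of mapping 3D palm displacement to discrete image commands
# (generic coordinates; command names are illustrative, not the paper's).
import numpy as np


def classify_motion(start: np.ndarray, end: np.ndarray, thresh: float = 50.0) -> str:
    """Classify a hand movement by its dominant displacement axis (mm)."""
    delta = end - start
    axis = int(np.argmax(np.abs(delta)))
    if abs(delta[axis]) < thresh:
        return "none"
    commands = {
        0: ("previous_image", "next_image"),                 # left / right
        1: ("decrease_brightness", "increase_brightness"),   # down / up
        2: ("zoom_in", "zoom_out"),                          # toward / away
    }
    return commands[axis][int(delta[axis] > 0)]


print(classify_motion(np.array([0.0, 0.0, 0.0]), np.array([5.0, 80.0, 10.0])))
# -> increase_brightness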

Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images.

TLDR
Experimental results show that gesture interaction and surgeon behavior analysis can be used to accurately navigate, manipulate and access MRI images, and therefore this modality could replace keyboard- and mouse-based interfaces.

Hand, Foot or Voice: Alternative Input Modalities for Touchless Interaction in the Medical Domain

TLDR
This work investigated touchless input methods as alternatives to each other, with a focus on two common interaction tasks in sterile settings: activation of a system to avoid unintentional input, and manipulation of continuous values.
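
As a hypothetical sketch of those two task types (not taken from the paper): an explicit activation step guards against unintentional input, and a continuous input value is then mapped onto a parameter such as zoom.

# Illustration of the two task types compared in the paper: activation
# via a held signal, then mapping a normalized continuous input to a
# parameter range. All thresholds and names are invented.
import time


class TouchlessControl:
    def __init__(self, hold_seconds: float = 1.0) -> None:
        self.hold_seconds = hold_seconds
        self.active = False
        self._hold_start = None

    def update_activation(self, activation_signal: bool, now: float) -> None:
        """Activate only after the signal (e.g. a held posture) persists."""
        if not activation_signal:
            self._hold_start = None
            self.active = False
            return
        if self._hold_start is None:
            self._hold_start = now
        elif now - self._hold_start >= self.hold_seconds:
            self.active = True

    def map_continuous(self, value: float, lo: float, hi: float) -> float:
        """Map a normalized input in [0, 1] to a parameter range."""
        if not self.active:
            raise RuntimeError("system not activated")
        return lo + max(0.0, min(1.0, value)) * (hi - lo)


ctrl = TouchlessControl()
t0 = time.time()
ctrl.update_activation(True, t0)
ctrl.update_activation(True, t0 + 1.2)           # held long enough -> active
print(ctrl.map_continuous(0.4, lo=1.0, hi=4.0))  # zoom factor 2.2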

Development of a sterile Interaction Device during Image guided minimal-invasive Interventions

TLDR
The development and evaluation of a sterile, MR-compatible capacitive gesture controller allows the practicing physician to manipulate the imaging independently and in a sterile manner, thus enabling a significant improvement in work-flow.

Voice User Interface Design for a Computer Aided Ureteroscopic Surgical System

TLDR
This paper presents Sami, a purpose-designed computer-aided surgical (CAS) system with a voice user interface (VUI), and compares a conventional UI with Sami's VUI during ureteroscopic surgery.
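
Purely as an illustration of the kind of small spoken-command grammar a CAS voice interface might expose (the command phrases are invented, not Sami's actual vocabulary):

# Hypothetical sketch of a small voice-command grammar of the kind a CAS
# voice user interface might use (not Sami's actual command set).
import re

COMMANDS = [
    (re.compile(r"zoom (in|out)"), lambda m: ("zoom", m.group(1))),
    (re.compile(r"show (next|previous) image"), lambda m: ("navigate", m.group(1))),
    (re.compile(r"set brightness to (\d+) percent"),
     lambda m: ("brightness", int(m.group(1)))),
]


def parse_utterance(text: str):
    """Match a recognized utterance against the command grammar."""
    text = text.lower().strip()
    for pattern, action in COMMANDS:
        m = pattern.fullmatch(text)
        if m:
            return action(m)
    return ("unknown", text)


print(parse_utterance("Set brightness to 70 percent"))  # ('brightness', 70)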

Robust Real-time 3D Detection of Obstructed Head and Hands in Indoors Environments

TLDR
This work proposes a new method for head and hand detection that relies on geometric information from disparity maps, locally refined by color processing; it successfully found more than 97% of target features, with very few false positives.
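
A rough sketch of that general pipeline (stereo disparity to isolate near objects, then a coarse color refinement); the thresholds and parameters are illustrative rather than the paper's, and OpenCV is assumed.

# Rough sketch of a disparity-based segmentation refined by color.
# Requires OpenCV (cv2) and a calibrated, rectified grayscale stereo pair.
import cv2
import numpy as np


def detect_head_hands(left_gray: np.ndarray, right_gray: np.ndarray,
                      left_bgr: np.ndarray) -> np.ndarray:
    # 1. Dense disparity from the rectified stereo pair.
    stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # 2. Keep only regions close to the camera (large disparity).
    near_mask = (disparity > 20).astype(np.uint8) * 255

    # 3. Refine with a coarse skin-color mask in HSV.
    hsv = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 30, 60), (25, 180, 255))

    return cv2.bitwise_and(near_mask, skin)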

Vision-Based User Interfaces for Health Applications: A Survey

  • A. Albu
  • Computer Science, Medicine
    ISVC
  • 2006
This paper proposes a survey of vision-based human-computer interfaces for several key fields in health care: data visualization for image-guided diagnosis, image-guided therapy planning and surgery, …

References

Showing 1-10 of 42 references

The expert surgical assistant. An intelligent virtual environment with multimodal input.

TLDR
This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant that monitors the user's progress and provides automatic feedback.

A non-contact mouse for surgeon-computer interaction.

TLDR
A system that uses computer vision to replace standard computer mouse functions with hand gestures, designed to enable non-contact human-computer interaction (HCI) so that surgeons will be able to make more effective use of computers during surgery.
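
As a hypothetical illustration of the core mapping such a vision-based "mouse" needs (a tracked hand position in the camera frame mapped to screen coordinates), not the authors' system:

# Illustration: map a tracked hand position in the camera frame to screen
# coordinates for a non-contact cursor. Frame and screen sizes are examples.

def hand_to_cursor(hand_xy, frame_size=(640, 480), screen_size=(1920, 1080),
                   active_region=0.6):
    """Map hand coordinates to the screen, using only the central part of
    the camera frame so the user does not need large arm movements."""
    fw, fh = frame_size
    sw, sh = screen_size
    margin_x = fw * (1 - active_region) / 2
    margin_y = fh * (1 - active_region) / 2
    nx = (hand_xy[0] - margin_x) / (fw * active_region)
    ny = (hand_xy[1] - margin_y) / (fh * active_region)
    nx = max(0.0, min(1.0, nx))
    ny = max(0.0, min(1.0, ny))
    return int(nx * (sw - 1)), int(ny * (sh - 1))


print(hand_to_cursor((320, 240)))  # hand at frame center -> screen center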

Comparison of an Optical and a Mechanical Navigation System

TLDR
The FARO Arm showed an intrinsic uncertainty of 0.1 mm, outperforming FlashPoint (0.2 mm); however, their use in noisy conditions resulted in similar application uncertainty, and FlashPoint is more promising than the FARO Arm for in-vivo use.

An Augmentation System for Fine Manipulation

TLDR
This work explores a sensor-driven system for performing simple manipulation tasks, composed of a core set of "safe" system states plus task-specific states and transitions, using the "steady hand" robot as the experimental platform.
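
An illustrative sketch (not the authors' implementation) of that structure: a small state machine with a core set of "safe" states and task-specific transitions that fall back to a safe state on unexpected events.

# Toy state machine with "safe" core states and task-specific transitions.
# States, events and the fallback policy are invented for illustration.

SAFE_STATES = {"idle", "hold", "retract"}

TASK_TRANSITIONS = {
    # (current_state, event) -> next_state for a simple insertion task
    ("idle", "tool_near_target"): "approach",
    ("approach", "contact_detected"): "insert",
    ("insert", "force_limit_exceeded"): "retract",  # fall back to a safe state
    ("insert", "task_complete"): "hold",
}


def step(state: str, event: str) -> str:
    """Advance the task state machine; unknown events fall back to safety."""
    nxt = TASK_TRANSITIONS.get((state, event))
    if nxt is None:
        return state if state in SAFE_STATES else "hold"
    return nxt


s = "idle"
for ev in ["tool_near_target", "contact_detected", "force_limit_exceeded"]:
    s = step(s, ev)
print(s)  # -> retract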

The "Bernese" frameless optical computer aided surgery system.

TLDR
The frameless Computer Aided Surgery system developed in Bern, Switzerland has proven its accuracy and usability, and surgeons feel very comfortable with the increased safety provided by the unequivocal identification of important anatomical structures.

Selective Use of Face Gesture Interface and Instrument Tracking System for Control of a Robotic Laparoscope Positioner

TLDR
This work investigated the selective use of automatic/manual control of a robotic laparoscope positioner and found it effective, although it does not always provide an optimal view.

A System for Real-Time Endoscopic Image Enhancement

TLDR
A system for real-time endoscopic image enhancement: a typical video-endoscopic system was extended with a computer and a second monitor so that the enhanced and the original image can be displayed at the same time.
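
The enhancement method itself is not specified above; as one plausible, hypothetical example of a real-time enhancement step, CLAHE contrast enhancement on the luminance channel, with the original and enhanced frames shown side by side (OpenCV assumed):

# One plausible enhancement step (not necessarily the paper's): CLAHE
# contrast enhancement of the luminance channel of each video frame.
import cv2
import numpy as np


def enhance_frame(frame_bgr: np.ndarray) -> np.ndarray:
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = cv2.merge((clahe.apply(l), a, b))
    return cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)


def side_by_side(frame_bgr: np.ndarray) -> np.ndarray:
    """Original and enhanced frame next to each other, as on two monitors."""
    return np.hstack((frame_bgr, enhance_frame(frame_bgr)))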

Vision-based user interfaces: methods and applications

  • M. Porta
  • Computer Science
    Int. J. Hum. Comput. Stud.
  • 2002
TLDR
A global view on the field of vision-based interfaces (VBIs) is provided, through the analysis of the methods used for their implementation and the exploration of the practical systems in which they have been employed.

A Framework for Determining Component and Overall Accuracy for Computer Assisted Surgery Systems

TLDR
A framework to assess the accuracy of CAS systems will provide developers and researchers with a common starting point with which to test their systems, and will provide surgeons with a consistent basis for comparing different systems.
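
The framework itself is not reproduced here; as a hedged illustration of one common convention for relating component accuracies to an overall figure, independent error sources can be combined in quadrature (the paper's actual framework may differ).

# Illustration only: root-sum-of-squares combination of independent
# component errors; not necessarily the paper's accuracy framework.
import math


def overall_rms_error(component_errors_mm):
    """Root-sum-of-squares of independent component errors (in mm)."""
    return math.sqrt(sum(e * e for e in component_errors_mm))


# e.g. tracker, registration and calibration errors of a CAS chain:
print(round(overall_rms_error([0.2, 0.5, 0.3]), 3))  # -> 0.616 mm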

A Bayesian Computer Vision System for Modeling Human Interaction

TLDR
A real-time computer vision and machine learning system for modeling and recognizing human behaviors in a visual surveillance task is demonstrated, along with the ability to use a priori models to accurately classify real human behaviors and interactions with no additional tuning or training.
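
In the spirit of that approach (the paper uses HMM/CHMM behavior models), a minimal, hypothetical sketch of likelihood-based behavior classification with one Gaussian HMM per behavior, using the hmmlearn library on toy feature sequences:

# Minimal sketch of likelihood-based behavior classification with HMMs
# (toy setup; not the paper's CHMM models). Requires hmmlearn and numpy.
import numpy as np
from hmmlearn.hmm import GaussianHMM


def train_models(sequences_by_label, n_states=3):
    """Fit one Gaussian HMM per behavior label on its training sequences.
    Each sequence is a (time, features) array of per-frame observations."""
    models = {}
    for label, seqs in sequences_by_label.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models


def classify(models, sequence):
    """Pick the behavior whose model assigns the observation sequence the
    highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))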