John M. Galeotti

We present a novel device mounted on the fingertip for acquiring and transmitting visual information through haptic channels. In contrast to previous systems in which the user interrogates an intermediate representation of visual information, such as a tactile display representing a camera-generated image, our device uses a fingertip-mounted camera and …
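The abstract above is truncated, but the underlying idea of turning what a fingertip-mounted camera sees into haptic output can be illustrated. The following is a minimal sketch only, not the authors' implementation: the strip-averaging scheme, the actuator count, and the function name are assumptions made purely for illustration.

```python
# Minimal sketch (not the paper's implementation): map a fingertip-camera
# frame to a few vibrotactile drive levels by averaging intensity over
# vertical strips of the image.
import numpy as np

def frame_to_haptics(frame: np.ndarray, n_actuators: int = 4) -> np.ndarray:
    """Return n_actuators drive levels in [0, 1] from an 8-bit camera frame."""
    gray = frame.mean(axis=2) if frame.ndim == 3 else frame
    strips = np.array_split(gray, n_actuators, axis=1)   # one strip per actuator
    levels = np.array([s.mean() for s in strips]) / 255.0
    return np.clip(levels, 0.0, 1.0)
```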
This paper describes a method for building small, inexpensive, autonomous mobile robot systems for research into robot colonies. We describe how to build EvBot mobile robot colonies that can navigate mazes of varying complexity, display intelligent control, run evolutionary computing algorithms, and use low-bandwidth distributed …
This paper presents results generated with a new evolutionary robotics (ER) simulation environment and its complementary real mobile robot colony research test-bed. Neural controllers that produce mobile robot maze-searching and exploration behaviors, using binary tactile sensors as inputs, were evolved in a simulated environment and subsequently transferred to …
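As a rough sketch of the evolve-in-simulation workflow described above (not the paper's actual controllers, simulator, or evolutionary operators), a population of small neural controllers with binary tactile inputs can be evolved against a simulated fitness score; simulate_maze_run() below is a hypothetical stand-in for the maze simulator.

```python
# Toy evolutionary loop: evolve weights of a tiny feedforward controller.
import numpy as np

rng = np.random.default_rng(0)
N_SENSORS, N_HIDDEN, N_MOTORS = 4, 6, 2        # binary tactile in, motor out

def controller(weights, sensors):
    w1 = weights[:N_SENSORS * N_HIDDEN].reshape(N_SENSORS, N_HIDDEN)
    w2 = weights[N_SENSORS * N_HIDDEN:].reshape(N_HIDDEN, N_MOTORS)
    hidden = np.tanh(sensors @ w1)
    return np.tanh(hidden @ w2)                 # motor commands in [-1, 1]

def simulate_maze_run(weights) -> float:
    """Hypothetical fitness stand-in; a real version would score exploration."""
    sensors = rng.integers(0, 2, N_SENSORS)     # binary tactile readings
    return float(np.abs(controller(weights, sensors)).sum())

n_weights = N_SENSORS * N_HIDDEN + N_HIDDEN * N_MOTORS
population = rng.normal(size=(30, n_weights))
for generation in range(50):
    fitness = np.array([simulate_maze_run(w) for w in population])
    parents = population[np.argsort(fitness)[-10:]]            # keep the best 10
    children = parents[rng.integers(0, 10, 20)] \
        + 0.1 * rng.normal(size=(20, n_weights))                # mutate copies
    population = np.vstack([parents, children])                 # elitism + offspring
```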
We are developing techniques for guiding ultrasound probes and other clinical tools with respect to the exterior of the patient, using one or more video camera(s) mounted directly on the probe or tool. This paper reports on a new method of matching the real-time video image of the patient’s exterior against a prior high-resolution surface map acquired with …
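The matching described above is, at its core, a surface registration problem. As an illustration only (the paper's actual matching algorithm is not reproduced here), the rigid least-squares alignment of corresponding 3-D surface points can be computed with the standard Kabsch/Procrustes solution:

```python
# Rigid alignment of corresponding 3-D points: find R, t minimizing
# ||R @ src + t - dst|| in the least-squares sense.
import numpy as np

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """src, dst: (N, 3) arrays of corresponding points. Returns (R, t)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t
```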
We present a novel and relatively simple method for magnifying forces perceived by an operator using a tool. A sensor measures the force between the tip of a tool and its handle held by the operator’s fingers. These measurements are used to create a proportionally greater force between the handle and a brace attached to the operator’s hand, providing an …
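A minimal sketch of the proportional magnification idea follows; the gain value, the saturation limit, and the function name are illustrative assumptions, not values or interfaces from the paper.

```python
# Proportional force magnification: the force commanded between handle and
# brace is a scaled copy of the measured tip force, clipped to an assumed
# actuator limit.
GAIN = 10.0     # illustrative magnification factor
F_MAX = 5.0     # assumed actuator saturation limit, newtons

def magnified_force(tip_force_n: float) -> float:
    """Force to apply between the handle and the hand brace, in newtons."""
    f = GAIN * tip_force_n
    return max(-F_MAX, min(F_MAX, f))
```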
We have developed a new image-based guidance system for microsurgery using optical coherence tomography (OCT), which presents a virtual image in its correct location inside the scanned tissue. Applications include surgery of the cornea, skin, and other surfaces below which shallow targets may advantageously be displayed for the naked eye or low-power …
The concept and instantiation of real-time tomographic holography (RTTH) for augmented reality are presented. RTTH enables natural hand-eye coordination to guide invasive medical procedures without requiring tracking or a head-mounted device. It places a real-time virtual image of an object's cross section into its actual location, without noticeable …
We present a system to segment the medial edges of the vocal folds from stroboscopic video. The system has two components. The first learns a color transformation that optimally discriminates, according to the Fisher linear criterion, between the trachea and vocal folds. Using this transformation, it is able to make a coarse segmentation of vocal fold …
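The first component described above, a color projection learned under the Fisher linear criterion, can be sketched as follows. The training labels, the threshold, and the function names are illustrative; the paper's training data and its coarse-segmentation step are not reproduced here.

```python
# Fisher linear discriminant on pixel colors: find the 1-D projection that
# best separates two labeled classes (e.g., trachea vs. vocal folds), then
# threshold the projected image for a coarse segmentation.
import numpy as np

def fisher_direction(class_a: np.ndarray, class_b: np.ndarray) -> np.ndarray:
    """class_a, class_b: (N, 3) arrays of labeled RGB pixels. Returns the unit
    vector maximizing between-class over within-class scatter."""
    mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
    Sw = np.cov(class_a, rowvar=False) + np.cov(class_b, rowvar=False)
    w = np.linalg.solve(Sw, mu_a - mu_b)
    return w / np.linalg.norm(w)

def coarse_segmentation(image: np.ndarray, w: np.ndarray, threshold: float) -> np.ndarray:
    """Project every pixel of an (H, W, 3) image onto w and threshold."""
    return (image.reshape(-1, 3) @ w).reshape(image.shape[:2]) > threshold
```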
Over the past decade, we have developed an augmented reality system called the Sonic Flashlight (SF), which merges ultrasound with the operator’s vision using a half-silvered mirror and a miniature display attached to the ultrasound probe. We now add a small video camera and a structured laser light source so that computer vision algorithms can determine …
With modern automated microscopes and digital cameras, pathologists no longer have to examine samples by looking through the microscope’s eyepieces. Instead, the slide is digitized to an image, which can then be examined on a screen. This creates the possibility for computers to analyze the image. In this work, a fully automated approach to region of interest (ROI) …
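The truncated abstract above concerns automated ROI detection on digitized slides. As a generic illustration only, not the method of this work, a common first step in such pipelines is separating tissue from the bright glass background by thresholding a grayscale version of the slide; the threshold value here is an assumption.

```python
# Generic tissue/background separation on a digitized slide: pixels darker
# than an assumed background level are treated as candidate tissue.
import numpy as np

def tissue_mask(slide_rgb: np.ndarray, background_level: int = 220) -> np.ndarray:
    """Return a boolean mask that is True where tissue (non-background) lies."""
    gray = slide_rgb.mean(axis=2)
    return gray < background_level   # bright glass background is excluded
```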