Felix Tang

Wearable, camera-based, head-tracking systems use spatial image registration algorithms to align images taken as the wearer gazes around their environment. This allows computer-generated information to appear to the user as though it were anchored in the real world. Often, these algorithms require creation of a multiscale Gaussian pyramid or repetitive …
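The multiscale Gaussian pyramid mentioned in this abstract can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the function names are hypothetical, and the Gaussian is approximated by a separable 5-tap binomial filter.

```python
import numpy as np

# 5-tap binomial kernel, a common discrete approximation to a Gaussian.
KERNEL = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0

def gaussian_blur(img):
    # Separable blur: convolve each row, then each column.
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, KERNEL, mode="same"), 1, img)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, KERNEL, mode="same"), 0, blurred)
    return blurred

def gaussian_pyramid(img, levels=4):
    # Each level: blur the previous level, then decimate by 2 in each axis.
    pyramid = [img.astype(float)]
    for _ in range(levels - 1):
        pyramid.append(gaussian_blur(pyramid[-1])[::2, ::2])
    return pyramid
```

For a 64×64 input with `levels=4`, this yields images of size 64, 32, 16, and 8 pixels per side; registration can then proceed coarse-to-fine across the levels.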
We present a system which allows wearable computer users to share their views of their current environments with each other. Our system uses an EyeTap: a device which allows the eye of the wearer to function both as a camera and a display. A wearer, by looking around his/her environment, "paints" or "builds" an environment map composed of images from …
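The idea of "painting" an environment map from gaze-directed images could be sketched as below, under simplifying assumptions not taken from the paper: an equirectangular map, known yaw/pitch per frame, nearest-neighbor resampling, and no blending or yaw wraparound. All names here are illustrative.

```python
import numpy as np

def resample(img, h, w):
    # Nearest-neighbor resize (stand-in for proper interpolation).
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[np.ix_(ys, xs)]

def paint_environment_map(env_map, frame, yaw_deg, pitch_deg, fov_deg=40.0):
    # env_map: equirectangular grid; rows span pitch +90..-90 degrees,
    # columns span yaw 0..360 degrees.
    H, W = env_map.shape
    fh = max(1, round(H * fov_deg / 180.0))   # frame footprint in map pixels
    fw = max(1, round(W * fov_deg / 360.0))
    cy = int((90.0 - pitch_deg) / 180.0 * H)  # gaze direction -> map centre
    cx = int((yaw_deg % 360.0) / 360.0 * W)
    patch = resample(frame, fh, fw)
    top, left = cy - fh // 2, cx - fw // 2
    t0, l0 = max(top, 0), max(left, 0)        # clip patch to map bounds
    t1, l1 = min(top + fh, H), min(left + fw, W)
    env_map[t0:t1, l0:l1] = patch[t0 - top:t1 - top, l0 - left:l1 - left]
    return env_map
```

Each incoming frame overwrites its footprint in the map; a fuller system would blend overlapping frames and register them against the map rather than trusting raw gaze angles.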
This paper is the continuation of a work presented at ICORR 07, in which we discussed the possibility of improving eye-hand coordination in children diagnosed with this problem, using a robotic mapping from a haptic user interface to a virtual environment. Our goal is to develop, implement and refine a system that will assess and improve the eye-hand …
Multifocal plane microscopy (MUM) has made it possible to study subcellular dynamics in 3D at high temporal and spatial resolution by simultaneously imaging distinct planes within the specimen. MUM allows high accuracy localization of a point source along the z-axis since it overcomes the depth discrimination problem of conventional single plane microscopy. …
"Mediated reality affords the EyeTap apparatus the ability to augment, diminish or otherwise alter our perception of reality." [1] The goal of this research is to demonstrate interactive and shared mediated realities. The EyeTap reality mediator is used with Bluetooth, 802.11b and IrDA wireless technologies to implement a system that allows for the …