Wearable, camera-based head-tracking systems use spatial image registration algorithms to align images taken as the wearer gazes around their environment. This allows computer-generated information to appear to the user as though it were anchored in the real world. Often, these algorithms require creation of a multiscale Gaussian pyramid or repetitive…
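The multiscale Gaussian pyramid mentioned above can be sketched as follows. This is a minimal illustration, not the system described in the abstract: it uses a 3-tap binomial kernel as a cheap Gaussian approximation and downsamples each level by 2.

```python
import numpy as np

def gaussian_blur(img, kernel=(0.25, 0.5, 0.25)):
    # Separable 3-tap binomial filter, a common approximation of a
    # small Gaussian; edges are handled by replicate padding.
    out = img.astype(float)
    for axis in (0, 1):
        pad = [(1, 1) if a == axis else (0, 0) for a in (0, 1)]
        padded = np.pad(out, pad, mode="edge")
        out = sum(k * np.take(padded, range(i, i + out.shape[axis]), axis=axis)
                  for i, k in enumerate(kernel))
    return out

def gaussian_pyramid(img, levels):
    # Each level is a blurred, 2x-downsampled copy of the previous one.
    pyr = [img.astype(float)]
    for _ in range(levels - 1):
        pyr.append(gaussian_blur(pyr[-1])[::2, ::2])
    return pyr
```

Registration algorithms typically estimate alignment coarse-to-fine: a match found on the smallest level seeds the search on the next larger one.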
Multifocal plane microscopy (MUM) has made it possible to study subcellular dynamics in 3D at high temporal and spatial resolution by simultaneously imaging distinct planes within the specimen. MUM allows high-accuracy localization of a point source along the z-axis, since it overcomes the depth discrimination problem of conventional single-plane microscopy…
We present a system which allows wearable computer users to share their views of their current environments with each other. Our system uses an EyeTap: a device which allows the eye of the wearer to function both as a camera and a display. A wearer, by looking around his/her environment, "paints" or "builds" an environment map composed of images from the…
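The "painting" of an environment map can be sketched roughly as below. This is a hypothetical simplification of the idea, not the paper's implementation: incoming frames are keyed by quantized gaze direction (yaw/pitch) on the view sphere, with the newest frame replacing any earlier one in the same cell.

```python
def direction_to_cell(yaw_deg, pitch_deg, cell_deg=10):
    # Quantize a gaze direction into a grid cell on the view sphere.
    yaw = int((yaw_deg % 360) // cell_deg)
    pitch = int((max(-90.0, min(90.0, pitch_deg)) + 90) // cell_deg)
    pitch = min(pitch, 180 // cell_deg - 1)  # clamp pitch = +90 edge case
    return yaw, pitch

class EnvironmentMap:
    """Sparse map of the view sphere, 'painted' one frame at a time."""
    def __init__(self, cell_deg=10):
        self.cell_deg = cell_deg
        self.cells = {}  # (yaw_cell, pitch_cell) -> most recent frame

    def paint(self, yaw_deg, pitch_deg, frame):
        self.cells[direction_to_cell(yaw_deg, pitch_deg, self.cell_deg)] = frame

    def coverage(self):
        # Fraction of the sphere's cells painted so far.
        total = (360 // self.cell_deg) * (180 // self.cell_deg)
        return len(self.cells) / total
```

Sharing a view then amounts to transmitting the painted cells (or just the newly updated ones) to another wearer's map.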
"Mediated reality affords the EyeTap apparatus the ability to augment, diminish or otherwise alter our perception of reality." [1] The goal of this research is to demonstrate interactive and shared mediated realities. The EyeTap reality mediator is used with Bluetooth, 802.11b and IrDA wireless technologies to implement a system that allows for the…