Interpreting 2D gesture annotations in 3D augmented reality

@inproceedings{Nuernberger2016Interpreting2G,
  title={Interpreting 2D gesture annotations in 3D augmented reality},
  author={Benjamin Nuernberger and Kuo-Chin Lien and Tobias H{\"o}llerer and Matthew Turk},
  booktitle={2016 IEEE Symposium on 3D User Interfaces (3DUI)},
  year={2016},
  pages={149--158}
}
A 2D gesture annotation provides a simple way to annotate the physical world in augmented reality for a range of applications, such as remote collaboration. Previously, such annotations could be rendered from novel viewpoints only with statically positioned cameras or planar scenes. However, if the camera moves and observes an arbitrary environment, 2D gesture annotations can easily lose their meaning when shown from novel viewpoints due to perspective effects. In this paper, we…
